
I have a small (~10 rows) table called restrictions in my PostgreSQL database, where values are deleted and inserted on a daily basis.

I would like to have a table called restrictions_deleted, where every row that is deleted from restrictions will be stored automatically. Since restrictions has a serial id, there will be no duplicates.

How do I write such a trigger in PostgreSQL?

Erwin Brandstetter
Adam Matan

2 Answers


You just need to copy the old row into the restrictions_deleted table before it gets deleted. In a DELETE trigger, the row being removed is available through the OLD record variable, so you can use a regular INSERT statement with the OLD values as the values to be inserted.

CREATE FUNCTION moveDeleted() RETURNS trigger AS $$
    BEGIN
       -- copy the row that is about to be deleted into the archive table
       INSERT INTO restrictions_deleted VALUES(OLD.column1, OLD.column2, ...);
       RETURN OLD;
    END;
$$ LANGUAGE plpgsql;

CREATE TRIGGER moveDeleted
BEFORE DELETE ON restrictions
FOR EACH ROW
EXECUTE PROCEDURE moveDeleted();
DrColossos
    You can replace the explicit list of column values with something like VALUES((OLD).*) – KayEss Apr 16 '15 at 04:26
    Works fine, but the create function needs to be called before create trigger. And the VALUES((OLD).*) trick suggested by KayEss is nice. – mivk Apr 02 '16 at 14:14
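For example, once the function and trigger are in place, a delete is archived automatically (the id column comes from the question's serial id; restrictions_deleted is assumed to have the same columns as restrictions):

    DELETE FROM restrictions WHERE id = 1;
    -- the BEFORE DELETE trigger fired once for the deleted row,
    -- so it can now be found in the archive table:
    SELECT * FROM restrictions_deleted WHERE id = 1;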

If you are open to a different approach, have you considered adding a 'deleted' boolean flag to the table, or a 'deleted_at' timestamp, instead?
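A minimal sketch of that soft-delete idea (the column name deleted_at and the partial index are assumptions, not from the question):

    ALTER TABLE restrictions ADD COLUMN deleted_at timestamptz;  -- NULL means the row is live

    -- "delete" by stamping the row instead of removing it:
    UPDATE restrictions SET deleted_at = now() WHERE id = 1;

    -- live-row queries filter on the flag; a partial index keeps that cheap:
    CREATE INDEX ON restrictions (id) WHERE deleted_at IS NULL;
    SELECT * FROM restrictions WHERE deleted_at IS NULL;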

Or better still, deny CRUD access to your database tables and handle the audit trail in your transactional API :)

Jack Douglas
  • This will cause performance issues when deleted record count is too large (millions) – echo May 02 '22 at 02:35