How to lock a DB row with SQLAlchemy ORM after add
I am working with the SQLAlchemy ORM. I add a row using session.add(object) and then commit it with session.commit(). After the commit I keep working on the ORM object, so I need to lock it so that other processes can't edit it. I need the same behaviour as session.query(...).with_for_update(). Is there a way to do it?
How to group by and aggregate an Int field and array[int] fields?
So, let's say I've got some table:
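Since the table itself isn't shown, here is a hedged sketch against a hypothetical table: the int column is aggregated with sum() and the array[int] column collected with PostgreSQL's array_agg(). All table and column names are assumptions:

```python
from sqlalchemy import Column, Integer, MetaData, Table, func, select
from sqlalchemy.dialects import postgresql
from sqlalchemy.dialects.postgresql import ARRAY

metadata = MetaData()

# Hypothetical table standing in for the one in the question.
items = Table(
    "items", metadata,
    Column("group_id", Integer),
    Column("amount", Integer),
    Column("tags", ARRAY(Integer)),
)

# sum() the int column; array_agg() on an array column yields an
# array of arrays -- if you want one flat array, unnest the column
# in a subquery first.
stmt = (
    select(
        items.c.group_id,
        func.sum(items.c.amount).label("total"),
        func.array_agg(items.c.tags).label("all_tags"),
    )
    .group_by(items.c.group_id)
)

sql = str(stmt.compile(dialect=postgresql.dialect()))
print(sql)
```

Compiling against the PostgreSQL dialect (rather than executing) shows the generated GROUP BY query without needing a live database.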
SQLAlchemy: saving an object returns error "null value in column" even though the value is not null
I am saving an object in SQLAlchemy using:
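Without the model it's hard to say, but one frequent cause of this PostgreSQL error is a NOT NULL column (often the primary key) that has no value generator, so the INSERT sends NULL for it. A sketch showing how declaring Identity() makes the database fill the column in; the table and column names are hypothetical:

```python
from sqlalchemy import Column, Identity, Integer, MetaData, String, Table
from sqlalchemy.dialects import postgresql
from sqlalchemy.schema import CreateTable

metadata = MetaData()

# Hypothetical table: Identity() tells PostgreSQL to generate the id,
# so an INSERT that omits it no longer violates the NOT NULL constraint.
item = Table(
    "item", metadata,
    Column("id", Integer, Identity(), primary_key=True),
    Column("name", String(50), nullable=False),
)

# Compile the DDL for PostgreSQL to see the generated column clause
# (rendered as GENERATED ... AS IDENTITY):
ddl = str(CreateTable(item).compile(dialect=postgresql.dialect()))
print(ddl)
```

If the failing column is not the primary key, the same idea applies with a default= or server_default= on that column.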
SQLAlchemy create_all() doesn't create tables
I'm trying to create a set of tables using SQLAlchemy 2.0
SQLAlchemy is 10x slower than PostgreSQL for a simple query
The query select * from test limit 1; takes about 0.4 ms in psql (using \timing), but the same query takes at least 4 ms in SQLAlchemy. Is that normal? I am using the following code to measure the execution time:
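Some of that gap is usually measurement, not the database: psql's \timing only times the server round trip on an already-open connection, while a naive SQLAlchemy measurement can also include pool checkout, statement compilation, and result-object construction. A sketch of a fairer per-query measurement that reuses one connection and one statement object (SQLite stands in for the PostgreSQL database in the question):

```python
import time

from sqlalchemy import create_engine, text

engine = create_engine("sqlite://")  # stand-in for the real database
with engine.connect() as conn:
    conn.execute(text("create table test (id integer)"))
    conn.execute(text("insert into test values (1)"))

    stmt = text("select * from test limit 1")
    conn.execute(stmt).fetchall()  # warm-up: first call pays one-time costs

    # Average over many runs on the same connection, like psql does.
    start = time.perf_counter()
    for _ in range(100):
        conn.execute(stmt).fetchall()
    per_query_ms = (time.perf_counter() - start) / 100 * 1000
    print(f"{per_query_ms:.3f} ms per query")
```

Per-query Python overhead on this path is typically well under a millisecond; if the gap stays large, the next suspects are network latency and connection checkout in the original measurement.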
SQLAlchemy: I can't execute a query with two tables
My problem is this: I can't execute a query that returns a list of Accounts together with the sum of their Records, grouped by account_id.
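A query like that is an outer join from accounts to records, grouped by account, with SUM over the record amounts. A runnable sketch; since the question's schema isn't shown, the table and column names are assumptions:

```python
from sqlalchemy import (
    Column, ForeignKey, Integer, MetaData, String, Table,
    create_engine, func, select,
)

metadata = MetaData()

# Hypothetical schema matching the question's description.
accounts = Table(
    "accounts", metadata,
    Column("id", Integer, primary_key=True),
    Column("name", String(50)),
)
records = Table(
    "records", metadata,
    Column("id", Integer, primary_key=True),
    Column("account_id", Integer, ForeignKey("accounts.id")),
    Column("amount", Integer),
)

# Outer join keeps accounts with no records; coalesce turns their
# NULL sum into 0.
stmt = (
    select(
        accounts.c.id,
        accounts.c.name,
        func.coalesce(func.sum(records.c.amount), 0).label("total"),
    )
    .select_from(accounts.outerjoin(records))
    .group_by(accounts.c.id, accounts.c.name)
    .order_by(accounts.c.id)
)

engine = create_engine("sqlite://")  # stand-in database for the sketch
metadata.create_all(engine)
with engine.begin() as conn:
    conn.execute(accounts.insert(), [{"id": 1, "name": "a"}, {"id": 2, "name": "b"}])
    conn.execute(records.insert(), [
        {"account_id": 1, "amount": 10},
        {"account_id": 1, "amount": 5},
    ])

with engine.connect() as conn:
    rows = conn.execute(stmt).all()
print(rows)
```

The join condition is inferred from the ForeignKey; with no FK declared, pass it explicitly to outerjoin().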
SQLAlchemy and PostgreSQL same timestamp after update
In the following code, after a 5-second sleep I expect the seconds part of date_updated to change, but only the milliseconds change. If I use database_url = 'sqlite:///:memory:' it works as expected. Why?
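The likely culprit: in PostgreSQL, now()/CURRENT_TIMESTAMP is frozen at the start of the transaction, so every statement inside one transaction sees the same value, while SQLite evaluates it per statement. PostgreSQL's clock_timestamp() advances within a transaction. A sketch wiring it into the column defaults; the table is hypothetical:

```python
from sqlalchemy import Column, DateTime, Integer, MetaData, Table, func
from sqlalchemy.dialects import postgresql
from sqlalchemy.schema import CreateTable

metadata = MetaData()

# clock_timestamp() gives the actual wall-clock time of each call,
# unlike now(), which is fixed at transaction start.
doc = Table(
    "doc", metadata,
    Column("id", Integer, primary_key=True),
    Column(
        "date_updated",
        DateTime(timezone=True),
        server_default=func.clock_timestamp(),
        onupdate=func.clock_timestamp(),
    ),
)

ddl = str(CreateTable(doc).compile(dialect=postgresql.dialect()))
print(ddl)
```

If the updates in the original code happen in separate transactions and the timestamps are still near-identical, check whether the sleep actually sits between the two commits.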
Why is my database insert using SQLAlchemy for PostgreSQL with an Identity column so slow?
I am using SQLAlchemy with a PostgreSQL database hosted on AWS. I have one particular table that is extremely slow: other tables load in seconds, but this one can take 10 hours to load ~400 records. For most of the other tables I use session.bulk_insert_mappings(); it seems like I can't use it on this table because of the Identity column, but I could be wrong. The following is a generic version of what I have and how I am doing the insert. The table that has the problem is Output. I also have a local database that does not seem to have the problem; the database on AWS has to be reached through a proxy. Any ideas on how to speed it up?
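An Identity column doesn't by itself rule out batched inserts: the generated column can simply be omitted from the row dictionaries and the database fills it in. What does hurt over a high-latency proxy is one round trip per row, so the usual fix is a single Core executemany. A sketch with a hypothetical stand-in for the Output table (SQLite stands in for the AWS database; on PostgreSQL the id column would be declared with Identity()):

```python
from sqlalchemy import (
    Column, Integer, MetaData, String, Table,
    create_engine, func, insert, select,
)

metadata = MetaData()

# Stand-in for the Output table. On PostgreSQL, declare the id as
# Column("id", Integer, Identity(), primary_key=True); either way the
# generated column is just left out of the row dicts below.
output = Table(
    "output", metadata,
    Column("id", Integer, primary_key=True),
    Column("value", String(50)),
)

engine = create_engine("sqlite://")  # stand-in for the AWS PostgreSQL URL
metadata.create_all(engine)

rows = [{"value": f"row-{i}"} for i in range(400)]
with engine.begin() as conn:
    # One executemany round trip instead of 400 single INSERTs --
    # the difference is dramatic when every round trip crosses a proxy.
    conn.execute(insert(output), rows)

with engine.connect() as conn:
    count = conn.execute(select(func.count()).select_from(output)).scalar()
print(count)
```

If per-row generated ids must come back to the ORM, that forces extra round trips; loading 400 rows in one batch and re-querying afterwards is usually cheaper over a slow link.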
How to use jsonb_to_record and x() with SQLAlchemy
I am trying to make the following SQLAlchemy ORM query work:
Using PostgreSQL multirange functions with SQLAlchemy
I am using a NUMMULTIRANGE field in my model. PostgreSQL has functions for this type, described in the docs, but I can't find an implementation of this functionality in SQLAlchemy. I've read this part of the documentation: there are a lot of useful properties for the Range type, but nothing for Multirange.