Entity Framework Core Migration – FluentAPI – Extraneous DB Columns
I’m having some trouble with Entity Framework Core. I am converting a project from database first to code first. The project is very old, so it also needs to be upgraded from .NET Framework 4.8 to .NET 8, and it is moving from EF 6 to EF Core.
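One common cause of extraneous columns when porting an EF 6 model to EF Core is a relationship that is only half configured: EF Core then invents a shadow foreign-key column (for example ProjectId1) in the first migration. A minimal sketch of the explicit Fluent API mapping that avoids this, assuming that is the cause here; all entity and property names are hypothetical:

```csharp
// Minimal sketch, assuming the extra columns in the generated migration are
// shadow foreign keys (e.g. "ProjectId1") created because a relationship was
// not configured explicitly. All entity/property names are hypothetical.
using System.Collections.Generic;
using Microsoft.EntityFrameworkCore;

public class Project
{
    public int Id { get; set; }
    public List<WorkItem> WorkItems { get; set; } = new();
}

public class WorkItem
{
    public int Id { get; set; }
    public int ProjectId { get; set; }
    public Project Project { get; set; } = null!;
}

public class AppDbContext : DbContext
{
    public AppDbContext(DbContextOptions<AppDbContext> options) : base(options) { }

    public DbSet<Project> Projects => Set<Project>();
    public DbSet<WorkItem> WorkItems => Set<WorkItem>();

    protected override void OnModelCreating(ModelBuilder modelBuilder)
    {
        // Tie both navigations to the existing ProjectId column so the
        // migration does not add a second, shadow FK column.
        modelBuilder.Entity<WorkItem>()
            .HasOne(w => w.Project)
            .WithMany(p => p.WorkItems)
            .HasForeignKey(w => w.ProjectId);
    }
}
```

If a CLR property should stay out of the database entirely, modelBuilder.Entity<T>().Ignore(...) (or the [NotMapped] attribute) keeps it out of the generated migration.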
EF Core receives result from database without values from database trigger
In a table, I have implemented a database trigger to populate a column with a calculated value (the column is named ProjectCode). I use Web API controllers to create and retrieve entries from the database. An instance of my DbContext is injected into the controller.
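Two things tend to matter here: EF Core 7+ on SQL Server must be told that the table has a trigger, and a trigger-populated column can be mapped as store-generated so EF never sends a value for it. A minimal sketch, assuming EF Core 7 or later; the entity, table, and trigger names are hypothetical:

```csharp
// OnModelCreating in the DbContext (names hypothetical).
protected override void OnModelCreating(ModelBuilder modelBuilder)
{
    modelBuilder.Entity<ProjectEntry>(b =>
    {
        // EF Core 7+ must be told the table has a trigger; otherwise the
        // OUTPUT-clause based INSERT it generates fails on SQL Server.
        b.ToTable("ProjectEntries",
            tb => tb.HasTrigger("TR_ProjectEntries_SetProjectCode"));

        // ProjectCode is written by the trigger, so treat it as
        // database-generated and never send a client-side value.
        b.Property(e => e.ProjectCode).ValueGeneratedOnAdd();
    });
}
```

If the trigger's value still does not come back with SaveChangesAsync, reloading the entry afterwards is a reliable fallback:

```csharp
// In the controller (sketch): reload the saved entity so the response
// includes the ProjectCode the trigger wrote.
_context.ProjectEntries.Add(entry);
await _context.SaveChangesAsync();
await _context.Entry(entry).ReloadAsync();
return Ok(entry);
```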
Entity property is set prior to SaveChangesAsync, then is sometimes saved to the database as null and changed to null in the entity
I have an intermittent problem. I have a C# API project that reads an entity from the database, updates its properties, then calls DbContext.SaveChangesAsync. The entity has a nullable int DepartmentId property. Sometimes the DepartmentId column in the database table is updated correctly. However, sometimes the DepartmentId column is set to NULL (after previously being populated), while the other columns appear to be updated correctly.
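With an intermittent null like this, it helps to see what the change tracker actually holds at the moment of saving: whether DepartmentId is already null in the tracked entity (pointing at the code path that set it, or at a DbContext shared between concurrent requests) or is being overwritten by a later disconnected update. A diagnostic sketch; the Employee entity and _logger are hypothetical:

```csharp
// Log EF's view of DepartmentId just before the save (names hypothetical).
using System.Linq;
using Microsoft.EntityFrameworkCore;

foreach (var entry in _context.ChangeTracker.Entries<Employee>()
             .Where(e => e.State == EntityState.Modified))
{
    var prop = entry.Property(e => e.DepartmentId);
    _logger.LogInformation(
        "Employee {Id}: DepartmentId original={Original}, current={Current}, modified={Modified}",
        entry.Entity.Id, prop.OriginalValue, prop.CurrentValue, prop.IsModified);
}

await _context.SaveChangesAsync();
```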
Reading data from SQL Server with EF AsSplitQuery causes data to disappear (if writing simultaneously)
We have a .NET 7 application that uses SQL Server and EF Core 7.
We have an in-memory cache that we read a large part of the database into (an “Item” table with ~20 Includes) and refresh at regular intervals.
Because of the many Includes, we need to use .AsSplitQuery(), or the Cartesian explosion would eat up all the memory and eventually cause the query to fail.
This setup has worked fine for many years.
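Each query produced by AsSplitQuery runs as a separate statement, so without an enclosing transaction there is no guarantee that all statements see the same state of the data; a write landing between them can make related rows vanish from the cached result. One way to get a consistent view is to run the read inside a snapshot-isolation transaction. A sketch, assuming snapshot isolation is enabled on the database (ALTER DATABASE ... SET ALLOW_SNAPSHOT_ISOLATION ON) and hypothetical Item navigations:

```csharp
// Read the cache refresh inside one snapshot transaction so every split
// query sees the same version of the data (names hypothetical).
using System.Data;
using Microsoft.EntityFrameworkCore;

await using var tx = await db.Database.BeginTransactionAsync(IsolationLevel.Snapshot);

var items = await db.Items
    .Include(i => i.Properties)
    .Include(i => i.Tags)
    .AsSplitQuery()
    .AsNoTracking()
    .ToListAsync();

await tx.CommitAsync();
```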
Update many rows with unique value per row minimizing round trips
I need to update many (read: 5,000+) records with a different new value per row. I am leaning towards option 2 but wanted to ask the community.
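Without knowing what the listed options were, one low-round-trip approach in plain EF Core is to attach stub entities and let a single SaveChangesAsync batch the UPDATE statements. A minimal sketch, assuming the new values arrive as (Id, NewValue) pairs and using hypothetical Item/Value names:

```csharp
// Attach stubs so the rows never have to be read, mark only the changed
// column, and let one SaveChangesAsync batch the UPDATEs per round trip
// (names hypothetical; "updates" is an IEnumerable<(int Id, string NewValue)>).
foreach (var (id, newValue) in updates)
{
    var stub = new Item { Id = id, Value = newValue };
    db.Items.Attach(stub);                                    // tracked as Unchanged
    db.Entry(stub).Property(i => i.Value).IsModified = true;  // send UPDATE for this column only
}

await db.SaveChangesAsync();
```

If the provider is SQL Server, the number of statements per round trip can be raised with o => o.MaxBatchSize(...) in UseSqlServer; for very large sets, a table-valued parameter joined in a single UPDATE is the usual next step.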