Thursday, 26 February 2026

Fixing Duplicate Foreign Key Issue in EF / EF Core

Sometimes when defining relationships in Entity Framework Core, the framework creates two foreign key columns for the same relationship, for example a MasterId column plus an extra shadow Master_Id column, because the relationship is ambiguous.

In this article, I’ll show why this happens and how to explicitly configure navigation properties so only one foreign key is created.

Problem Code Example

public class Master
{
    public int Id { get; set; }
    public virtual ICollection<Child> Childs { get; set; }
}

public class Child
{
    public int Id { get; set; }
    public int MasterId { get; set; }
    public Master Master { get; set; }
}

If EF Core cannot tell which property is the foreign key, it falls back to a shadow property and generates a second foreign key column in the database. To fix that, we need to define the relationship explicitly.

Solution 1 — Using Data Annotation

using System.ComponentModel.DataAnnotations.Schema;

public class Child
{
    public int Id { get; set; }
    public int MasterId { get; set; }

    [ForeignKey(nameof(MasterId))]
    public Master Master { get; set; }
}

This tells EF Core that the MasterId property is definitely the foreign key for the Master navigation property.

Solution 2 — Using Fluent API

In your DbContext class:

protected override void OnModelCreating(ModelBuilder modelBuilder)
{
    modelBuilder.Entity<Child>()
        .HasOne(c => c.Master)
        .WithMany(m => m.Childs)
        .HasForeignKey(c => c.MasterId);
}

This forces EF Core to map the relationship exactly the way you intend.
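For context, here is a minimal sketch of where this configuration lives in a complete DbContext. The AppDbContext name and the DbSet properties are illustrative assumptions, not part of the original article:

```csharp
using Microsoft.EntityFrameworkCore;

// Illustrative context name; adjust to your project.
public class AppDbContext : DbContext
{
    public DbSet<Master> Masters { get; set; }
    public DbSet<Child> Childs { get; set; }

    protected override void OnModelCreating(ModelBuilder modelBuilder)
    {
        // Explicitly map Child.MasterId as the foreign key for the
        // Child.Master <-> Master.Childs relationship, so EF Core
        // does not create a second shadow foreign key column.
        modelBuilder.Entity<Child>()
            .HasOne(c => c.Master)
            .WithMany(m => m.Childs)
            .HasForeignKey(c => c.MasterId);
    }
}
```

After this configuration, adding a migration should produce a Child table with a single MasterId column.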

Key Takeaways

  • Avoid relying solely on conventions when you have both FK fields and navigation properties.
  • Use [ForeignKey] or Fluent API to map relationships explicitly.

If you follow these steps, EF Core will generate only one foreign key column in the database.

Posted by Arulkumar Sivaraj

Monday, 9 February 2026

Rebuilding Database Table Indexes to Reduce Fragmentation

Audit tables usually grow very fast due to continuous insert operations. Over time, this can lead to index fragmentation, which negatively impacts query performance and increases CPU usage.

In this post, we will review the index maintenance performed on the [dbo].[xyztablename] table, including index rebuild, reorganize evaluation, and statistics update.

Why Index Maintenance Is Important

  • Audit tables receive high-volume inserts
  • Fragmentation increases over time
  • Queries become slower if indexes are not maintained
  • Regular maintenance improves overall database health
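Before choosing between rebuild and reorganize, fragmentation can be measured with the sys.dm_db_index_physical_stats DMV. A sketch, using the article's placeholder table name against the current database:

```sql
-- Check fragmentation for all indexes on the audit table.
SELECT i.name AS IndexName,
       ps.avg_fragmentation_in_percent,
       ps.page_count
FROM sys.dm_db_index_physical_stats(
         DB_ID(), OBJECT_ID(N'[dbo].[xyztablename]'), NULL, NULL, 'LIMITED') AS ps
JOIN sys.indexes AS i
  ON i.object_id = ps.object_id AND i.index_id = ps.index_id
ORDER BY ps.avg_fragmentation_in_percent DESC;
```

The avg_fragmentation_in_percent column maps directly to the thresholds discussed below.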

Step 1: Rebuild Index for Heavy Fragmentation

When index fragmentation exceeds recommended limits (generally above 30%), an INDEX REBUILD is the most effective solution.

ALTER INDEX ALL ON [dbo].[xyztablename] REBUILD;

Execution Details:

  • Start Time: 09-Feb-2026 12:53 PM
  • Duration: 23 minutes 19 seconds

Index rebuild removes fragmentation completely and also updates statistics automatically. However, it is resource-intensive and should be scheduled during low-traffic hours.
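If a low-traffic window is hard to find, the rebuild can be performed online on editions that support online index operations (such as Enterprise edition or Azure SQL Database). This is a sketch of that option, not part of the original maintenance run:

```sql
-- Rebuild without taking long blocking locks on the table
-- (requires an edition that supports online index operations).
ALTER INDEX ALL ON [dbo].[xyztablename]
REBUILD WITH (ONLINE = ON, SORT_IN_TEMPDB = ON);
```

SORT_IN_TEMPDB shifts the intermediate sort work into tempdb, which can reduce pressure on the user database's files during the rebuild.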

Step 2: Index Reorganize (Not Required)

Index reorganize is recommended only when fragmentation is moderate (between 5% and 30%). In this case, reorganize was not required.

ALTER INDEX ALL ON [dbo].[xyztablename] REORGANIZE;

Since the index rebuild already addressed the fragmentation, running reorganize would have been redundant.

Step 3: Update Statistics with FULLSCAN

To ensure SQL Server generates optimal execution plans, statistics were updated using a full data scan.

UPDATE STATISTICS [dbo].[xyztablename] WITH FULLSCAN;

Execution Details:

  • Start Time: 09-Feb-2026 2:15 PM
  • Scan Type: FULLSCAN

Maintenance Summary

Task                 Status
-------------------  ----------------
Index Rebuild        Completed
Index Reorganize     Not Required
Statistics Update    FULLSCAN Applied

Best Practices

  • Monitor index fragmentation regularly
  • Use REBUILD for high fragmentation
  • Use REORGANIZE only when needed
  • Keep statistics updated for large tables
  • Schedule maintenance during non-peak hours
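The 5% / 30% thresholds above can be scripted so that each index gets the appropriate action automatically. A minimal sketch that builds per-index commands (test and adapt before using in production; heaps are skipped because they have no index name):

```sql
-- Sketch: choose REBUILD or REORGANIZE per index based on
-- the 5% / 30% fragmentation thresholds.
DECLARE @sql nvarchar(max) = N'';

SELECT @sql = @sql +
    CASE
        WHEN ps.avg_fragmentation_in_percent > 30
            THEN N'ALTER INDEX ' + QUOTENAME(i.name) +
                 N' ON [dbo].[xyztablename] REBUILD;' + CHAR(13)
        WHEN ps.avg_fragmentation_in_percent > 5
            THEN N'ALTER INDEX ' + QUOTENAME(i.name) +
                 N' ON [dbo].[xyztablename] REORGANIZE;' + CHAR(13)
        ELSE N''
    END
FROM sys.dm_db_index_physical_stats(
         DB_ID(), OBJECT_ID(N'[dbo].[xyztablename]'), NULL, NULL, 'LIMITED') AS ps
JOIN sys.indexes AS i
  ON i.object_id = ps.object_id AND i.index_id = ps.index_id
WHERE i.name IS NOT NULL;  -- skip heaps

EXEC sys.sp_executesql @sql;
```

A script like this can be scheduled as a SQL Agent job during non-peak hours so the thresholds are applied consistently.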

Conclusion

Proper index and statistics maintenance ensures audit tables remain performant even as data grows. A well-planned maintenance window can prevent long-term performance issues and unexpected outages.