
Can I stream a large SQL Server result set with Dapper?

Updated: 2023-01-29 19:01:27




I have about 500K rows I need to return from my database (please don't ask why).

I will then need to save these results as XML (more URGH) and then FTP this file to somewhere magical.

I also need to transform each row in the result set.

Right now, this is what I'm doing with say .. TOP 100 results:

  • using Dapper's Query<T> method, which throws the entire result set into memory
  • I then use AutoMapper to convert the database POCO to my FileResult POCO
  • Convert to XML
  • Then save this collection to the file system
  • Then FTP
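The buffered approach above can be sketched roughly as follows. The table, column, and POCO names (`dbo.Orders`, `OrderRow`, `FileResult`) are placeholders, not from the original post, and AutoMapper is stood in for by a hand-written mapping to keep the sketch self-contained:

```csharp
using System.Collections.Generic;
using System.Data.SqlClient;
using System.Linq;
using Dapper;

public class OrderRow  { public int Id { get; set; } public string Name { get; set; } }
public class FileResult { public int Id { get; set; } public string Name { get; set; } }

public static class BufferedExport
{
    public static List<FileResult> LoadAll(string connectionString)
    {
        using (var connection = new SqlConnection(connectionString))
        {
            // Query<T> defaults to buffered: true, so every row is
            // materialized into a list before this call returns.
            var rows = connection.Query<OrderRow>("SELECT Id, Name FROM dbo.Orders");

            // Mapping into a second collection roughly doubles the
            // footprint -- with 500K rows this is where memory runs out.
            return rows.Select(r => new FileResult { Id = r.Id, Name = r.Name })
                       .ToList();
        }
    }
}
```

Both the source rows and the mapped results live in RAM at once, which is fine for 100 rows and fatal for 500K.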

This works fine for 100 rows, but I get an Out Of Memory exception with AutoMapper when trying to convert the 500K results to a new collection.

So, I was wondering if I could do this...

  • Stream data from DB using Dapper
  • For each row, run it through AutoMapper
  • Convert to XML
  • Stream result to disk
  • <repeat for each row>
  • Now ftp that file to magic-land

I'm trying to stop throwing everything into RAM. My thinking is that if I can stream stuff, it's more memory efficient, as I only work on a single row of data at a time.

using Dapper's Query<T> method, which throws the entire result set into memory

It is a good job, then, that one of the optional parameters is a bool that lets you choose whether to buffer or not ;p

Just add , buffered: false to your existing call to Query<T>.
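A minimal sketch of the streaming pipeline this enables — row names and the output path are assumptions, and the per-row mapping stands in for AutoMapper:

```csharp
using System.Data.SqlClient;
using System.Xml;
using Dapper;

public class OrderRow { public int Id { get; set; } public string Name { get; set; } }

public static class StreamingExport
{
    public static void Export(string connectionString, string xmlPath)
    {
        using (var connection = new SqlConnection(connectionString))
        using (var writer = XmlWriter.Create(xmlPath))
        {
            writer.WriteStartElement("rows");

            // buffered: false makes Query<T> return a lazy IEnumerable<T>:
            // rows are hydrated one at a time from the open data reader
            // instead of being collected into a list first.
            foreach (var row in connection.Query<OrderRow>(
                "SELECT Id, Name FROM dbo.Orders", buffered: false))
            {
                // Map and write each row, then let it be garbage collected.
                writer.WriteStartElement("row");
                writer.WriteElementString("Id", row.Id.ToString());
                writer.WriteElementString("Name", row.Name);
                writer.WriteEndElement();
            }

            writer.WriteEndElement();
        }
    }
}
```

Note that with buffered: false the connection (and its data reader) stays open for the whole enumeration, so the writing must happen inside the loop before the connection is disposed; the finished file can then be FTP'd as a separate step.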