
They didn't happen to put a last-modified timestamp on their rows, did they?

Dec 11, 2018

Checksums can and will fail eventually. If your system can accept that two different sets of data will sometimes produce the same checksum, then you are fine. For that reason, I had to move away from checksums in most of our systems...

Dec 11, 2018

LPains, can you please elaborate on your statement?

Dec 10, 2018

petrosmm I'm not sure what specifically you want me to elaborate on, but I will try. Imagine you have a table with a few hundred records; you essentially generate a single integer as a checksum, so how often is that going to collide? In my case, I was doing this with about 10 tables, each with hundreds of records, and I had at least one collision per day. Check this other answer: stackoverflow.com/questions/14450415/…
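The collision rate described above lines up with the birthday bound for a 32-bit value. A minimal sketch of the arithmetic, assuming the checksum behaves like an ideal uniform 32-bit hash (an idealization; `BINARY_CHECKSUM` is weaker than that in practice):

```python
import math

def collision_probability(n_rows: int, bits: int = 32) -> float:
    """Birthday-bound approximation: the probability that at least two
    of n_rows distinct inputs map to the same checksum, assuming the
    checksum is a uniform random function over 2**bits values."""
    space = 2.0 ** bits
    return 1.0 - math.exp(-n_rows * (n_rows - 1) / (2.0 * space))

# A few hundred rows rarely collide in any single snapshot...
print(f"{collision_probability(500):.6f}")      # ~0.000029
# ...but the probability grows quadratically with the row count.
print(f"{collision_probability(100_000):.4f}")  # ~0.6878
```

A per-snapshot probability of roughly 0.003% sounds safe, but repeated across many tables and many comparisons per day, occasional collisions become an expected event rather than a surprise.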

Dec 11, 2018

Thank you LPains!

Dec 11, 2018

That's not actually an answer; it's a "your suggestion doesn't work".

Dec 11, 2018

This can be remedied for duplicated data by using the DISTINCT keyword before the BINARY_CHECKSUM. There are a few other pitfalls discussed here, but they are not exactly common scenarios.
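To see why duplicated rows can trip up an aggregate checksum, and why de-duplicating first helps, here is a sketch in Python. It models the aggregate as an XOR fold; CHECKSUM_AGG is commonly described as XOR-based, but its exact algorithm is not spelled out in the documentation, so treat this purely as an illustration of the failure mode:

```python
from functools import reduce

def xor_agg(checksums):
    """Order-independent XOR fold, standing in for an aggregate
    checksum in the style of SQL Server's CHECKSUM_AGG (assumed
    XOR-like here for illustration only)."""
    return reduce(lambda a, b: a ^ b, checksums, 0)

row = 0x1234ABCD              # a per-row checksum value
empty = xor_agg([])           # aggregate of an empty table

# An even number of identical rows cancels out: the aggregate of
# two duplicate rows is indistinguishable from an empty table.
print(xor_agg([row, row]) == empty)        # True

# De-duplicating first (the DISTINCT remedy) keeps the row visible.
print(xor_agg({row, row}) == empty)        # False
```

This is the trade-off behind the DISTINCT remedy: it prevents duplicate pairs from cancelling each other out, at the cost of no longer noticing when an existing row is duplicated.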

Dec 11, 2018

The question is about changes in the table's data, whereas information_schema contains the table's schema (column definitions), not its contents.

Dec 11, 2018

Please provide a documented way of getting this information scoped to a single table in SQL Server.

Dec 11, 2018