damaged files
-
- Posts: 9
- Joined: Wed Mar 08, 2006 10:21 pm
- Sign-up code: 0
damaged files
Hello
As known by many, krppkr.2.nbw.emd sold by Chessbase two years ago is damaged; sorry for those who downloaded it since 4th March.
Error messages too for the following files (= damaged in Wilhelm):
krppkq.5.nbb, krppkq.3.nbb and krppkq.6.nbb
Best wishes to all
-
- Posts: 489
- Joined: Sat Jan 21, 2006 10:43 am
- Sign-up code: 10159
- Location: Reading, UK
- Contact:
Please clarify ...
What do you mean by "error messages for these files ... damaged in WILHELM"? There are no EGT-files in WILHELM - only MD5sums.
Are you saying that the cited MD5sums do not match your files?
To check whether the files are damaged and/or the MD5sums erroneous, run datacomp t filename, and see if you get the "CRC32 ..... ON" message. If you do, I believe the file is ok.
The best 'EGT-file assurance' procedure I know is to run:
fsum -md5 filename
datacomp t filename
to produce an MD5sum, and then to show that the file you produced it from was ok.
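For anyone without fsum to hand, the MD5 half of that procedure can be reproduced with a few lines; the datacomp step has no such substitute, since its test is specific to the .emd format. A minimal sketch:

```python
import hashlib

def md5sum(path, chunk_size=1 << 20):
    """Compute the MD5 of a file in chunks, so large EGT files
    never have to fit in memory at once."""
    h = hashlib.md5()
    with open(path, "rb") as f:
        while chunk := f.read(chunk_size):
            h.update(chunk)
    return h.hexdigest()

# Usage: compare the result against a published reference sum, e.g.
#   md5sum("krppkr.nbw.emd") == "<published MD5>"
```

The chunked read is the point of the sketch: EGT files run to hundreds of megabytes, so hashing them whole-in-memory is not an option.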
g
-
- Posts: 9
- Joined: Wed Mar 08, 2006 10:21 pm
- Sign-up code: 0
Thank you for your answer
I just meant that the answer of the Wilhelm program, when I ask it to check the files (krppkr from the Chessbase DVD, the three krppkq from recent downloading on eMule), is "damaged".
The second part of your answer does not help me because I first have to learn many things about computers.
kalya
damaged file check
for those interested, checking for damage in files using either md5 or datacomp is only checking for physical errors. for errors in the table itself (incorrect generation), such as the one wilhelm picked up from chessbase, the only way to pick it up is to use the .tbs file with wilhelm. this will also pick up physical errors, because the stats supplied in the .tbs cannot be reproduced by a physically damaged file. hence the preference by many for using .tbs with wilhelm.
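The verification flow against an external reference can be sketched in miniature. Here the reference is a hypothetical plain-text 'filename md5' manifest; the real .tbs carries chessic statistics rather than hashes, so this sketch only models the physical-damage half of the check, not the generation half:

```python
import hashlib
import os

def load_manifest(path):
    """Parse a hypothetical 'filename md5' reference list, one pair per line."""
    refs = {}
    with open(path) as f:
        for line in f:
            name, digest = line.split()
            refs[name] = digest
    return refs

def check_against_manifest(directory, manifest):
    """Report each listed file as 'ok', 'damaged' or 'missing'."""
    results = {}
    for name, ref in manifest.items():
        full = os.path.join(directory, name)
        if not os.path.exists(full):
            results[name] = "missing"
            continue
        h = hashlib.md5()
        with open(full, "rb") as f:
            while chunk := f.read(1 << 20):
                h.update(chunk)
        results[name] = "ok" if h.hexdigest() == ref else "damaged"
    return results
```

A file that was generated wrongly but never altered afterwards would still come back "ok" here, which is exactly the limitation described above.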
it is also possible to test the files you have against the .tbs with tbgen itself. if you have the 6-man capable version and enough memory/hd space, you can simply try to generate kpppkpp and kppppkp. you don't even have to let it go anywhere: if it starts doing anything with your hd (writing a temp file for the next-in-queue table is the first thing tbgen does after checking table availability and accuracy), your tables (all of them) are fine. this method is quicker than wilhelm for a single table, but even 6-man pawnless tables require the presence of other tables for chaining, and this method will only tell you that there's an error somewhere, not exactly which table it is in if there is more than one present. the advantage is that if there's an error, be it physical or in generation, tbgen will remove the affected tables and regenerate them, using the .tbs to verify, if you tell it to.
those who can, do
those who can't, teach
-
- Posts: 489
- Joined: Sat Jan 21, 2006 10:43 am
- Sign-up code: 10159
- Location: Reading, UK
- Contact:
EGT-file integrity
You are right to point out that even the combination of an MD5sum-check and a datacomp run _only_ checks that the file has not been corrupted since it was generated - and that this does not check the EGT-generation itself.
However MD5sum-check_plus_datacomp is sufficient to check that the file has not been corrupted since generation and that the MD5sum is the MD5sum of an ok file.
As I've been told that these work at disc-transfer speeds, I'm not looking for quicker ways of doing the test at the moment. As this check does not involve any external source, I think it's better than turning to WILHELM, despite the excellent merits of that software.
The only way to check the generation itself, i.e. the chessic integrity of the EGT, is to do a full verify of the files (with tbgen?) including a check of their compatibility with subgame EGTs.
I have partially checked Nalimov's .tbs files by:
a) checking sum of wtm/btm 1-0/=/0-1/broken == index-ranges, and
b) checking that the stats were compatible (for 3-to-5-man and 6-man P-less) with the stats for DTC(onversion)- and DTZ-metric EGTs.
In doing so, I did find one tbs-mistake - in KBPKN I think - and it's still there in error on Rob Hyatt's ftp site. Nalimov's stats-generating software also used to drop multiples of 2^32 positions if the counts were too large, but his latest software does not do that.
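Check (a) is simple enough to sketch. The field names below are hypothetical, since the .tbs layout itself is not reproduced here; the point is only that the four outcome counts must account for every index, for each side to move:

```python
def stats_consistent(stats, index_range):
    """Check (a): for each side to move, the 1-0 / draw / 0-1 / broken
    counts must sum exactly to the table's index range.
    'stats' uses hypothetical field names:
        {'wtm': (win, draw, loss, broken), 'btm': (...)}
    """
    return all(sum(stats[side]) == index_range for side in ("wtm", "btm"))
```

A stats file that drops multiples of 2^32, as described above, would fail this check as soon as the shortfall shows up in the totals.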
g
- Kirill Kryukov
- Site Admin
- Posts: 7399
- Joined: Sun Dec 18, 2005 9:58 am
- Sign-up code: 0
- Location: Mishima, Japan
- Contact:
Re: EGT-file integrity
guyhaw wrote:The only way to check the generation itself, i.e. the chessic integrity of the EGT, is to do a full verify of the files (with tbgen?) including a check of their compatibility with subgame EGTs.
I think the only reliable way to check the chessic integrity of EGTBs is to write an independent program that will loop through all positions, search one ply forward from each position, query the EGTB for the original position and for all ply-1 positions, and compare the results. This program will use the EGTB probing code by Nalimov of course, but this is correct since the engines are also using the same code.
Actually we should check the correctness of not only EGTB files alone, but of the combination EGTB files + probing code. This procedure will do just that. Someone will have to code it though.
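The loop described above can be sketched on a toy game. The real thing would call Nalimov's probing code over chess positions; here a trivial subtraction game stands in, so the whole "tablebase" fits in a dict and the self-consistency check is a few lines:

```python
def successors(n):
    """Toy game standing in for chess: from a pile of n, remove 1 or 2.
    A player who cannot move (n == 0) has lost."""
    return [n - k for k in (1, 2) if n - k >= 0]

def solve(limit):
    """Build a win/loss table: value[n] is +1 if the side to move wins,
    -1 if it loses. This plays the role of the generated EGTB."""
    value = {0: -1}
    for n in range(1, limit + 1):
        value[n] = +1 if any(value[m] == -1 for m in successors(n)) else -1
    return value

def verify(value, limit):
    """The consistency check described above: for every position, probe
    the table, probe all 1-ply successors, and confirm the stored value
    equals the minimax over the (negated) successor values."""
    for n in range(1, limit + 1):
        best = max(-value[m] for m in successors(n))
        if value[n] != best:
            return False
    return True
```

Corrupting any single entry of the table makes `verify` fail, which is the property that makes the full-loop check reliable: a wrong value is inconsistent with its neighbours regardless of how it got there.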
KCEC | EGTB Online | 3x3 Chess | 3x4 Chess | 4x4 Chess | Longest Checkmates | EGTB Test Suite | Opening Sampler | EGTB Bounty | NULP
as to verifying with tbgen, the format is (query table) tbgen -q [table], with the requirement of a complete subset of the table. that is a 100% verify... filesize, correctness, etc. it only verifies one random position within the table, however the code can be altered fairly easily for a complete check, and anyone with the inclination to do so can add md5, crc, sha, or any other checksum they care to add to the source to create a security net.
if compressed (.emd) files, additional requirement of datacomp to (first) uncompress file and subset.
chessbase released at one time (with fritz 7, chess tiger 14, junior 7, shredder 6, and perhaps a few others) a gui app that took care of this and/or generated up to 5-man bases on a machine with as little as 256M ram (using mapping code). that app could also be useful in this process (if you can find it now, and if chessbase will allow it to be shared). that same app is also capable, on modern computers, of generating a flawless complete 5-man set (how i got mine, but the cd that had the app on it since got destroyed :< ) in less time than it takes to download them. modern computers for the purpose of this argument are 3GHz+ with 1GB+ ram.
tbgen is slower than chessbase's gui app, which is based on tbgen, but as an example, on my athlon64 3200+ with 1G ram, complete 4-man tables (33MB after compression with datacomp e:8192 for each table) took a little over an hour from scratch. given that this machine is over a year old, the complete generation and compression from scratch using tbgen and datacomp should take no more than 45 minutes on the newest single-cpu machines, a rough comparison to a constant 7kB/s average download (including search and queueing time). not super-fast, but no looking involved, no waiting in queues, and no excess load on emule OR the uab site, with the immediate advantage of usable (uncompressed) tables that you know are error-free. if you have less than 590M ram, i don't suggest using tbgen, as the time taken is roughly 4x, and even a 56k modem can clear 2kB/s average easily.
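To make that comparison explicit, a back-of-envelope sketch using the rough figures from the paragraph above (the ~33 MB set size and 45-minute estimate are gambit3's own rough numbers, not measurements):

```python
# Rough figures from the post above: ~33 MB compressed 4-man set,
# ~45 minutes to generate from scratch on a newer single-CPU machine.
size_kb = 33 * 1024            # compressed set size in kB (rough)
gen_min = 45                   # claimed generation time
dl_min = size_kb / 7 / 60      # downloading at a constant 7 kB/s average
print(f"generate: {gen_min} min; download at 7 kB/s: {dl_min:.0f} min")
```

On those assumptions a 7 kB/s download takes around 80 minutes, so local generation comes out ahead even before queueing time is counted.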
those who can, do
those who can't, teach
-
- Posts: 489
- Joined: Sat Jan 21, 2006 10:43 am
- Sign-up code: 10159
- Location: Reading, UK
- Contact:
krppkr.2.nbw study
I would be interested to hear from someone who has an 'ok' krppkr.2.nbw EGT-file as well as the flawed one emanating from Chessbase as to what the differences are.
Also, if anyone can shed light on how the flawed EGT-file came to be produced, that would be interesting: was this just a flawed CD-burn, or a flawed generation? Does the flawed file pass the datacomp test or not?
Have the other occasional issues about EGT-files apparently not checking out as 'ok' been resolved, or are there still problems out there?
g
-
- Posts: 9
- Joined: Wed Mar 08, 2006 10:21 pm
- Sign-up code: 0
ahh the beauty of incomplete subsets to wreck an otherwise perfect thing.
as far as i can tell, the krppkr.2 file released flawed by chessbase has been replaced. it was physically the same size as the correct one, but had a different date stamp (how you could tell by the file on the dvd). it passed the datacomp test, suggesting incorrect generation, yet when uncompressed generated no error from tbgen if there was no complete subset present. only wilhelm picked it up initially, and at first it was thought that wilhelm was at fault. however it turned out that wilhelm was correct, and it gained popularity quite quickly as a result (even though the tbgen and datacomp combination is considerably smaller, and tbgen, being the generator, will pick up any incorrect generation if there is a complete subset, and if there is not, will proceed to make the complete subset to make the test, given enough resources and correct compilation).
thus it was only found to be incorrect when the complete subset was present, meaning considerably later than chessbase's release because they didn't release the complete subset to krppkr (which would involve almost 250G of files when considering complete 3 to 6 man subset of krppkr) on their cds.
further than that, i don't know the details, having never bothered to find out, but it is for precisely this reason that i say to everyone who asks: get pawnless before even considering pawned ending tables.
those who can, do
those who can't, teach
-
- Posts: 489
- Joined: Sat Jan 21, 2006 10:43 am
- Sign-up code: 10159
- Location: Reading, UK
- Contact:
krppkr.2.nbw continued
To clarify (?) what gambit3 is saying then:
1) the current Chessbase DVD for krppkr.2.nbw carries an ok file: good
2) the previous, flawed, krppkr.2.nbw passed datacomp tests and (somehow - not sure how) verifies as 'ok' in the absence of some of KRPPKR's subgame EGT files
3) therefore, it appears that the krppkr.2.nbw generation was flawed, perhaps because of an incorrect or flawed subgame EGT file
Chessbase have not replied yet to my enquiry about this. A comparison of the correct and flawed krppkr.2.nbw would shed some light on what positions and subgame EGTs were involved.
I suppose there is a question of whether the flawed krppkr.2.nbw emanated from Eugene Nalimov or from Chessbase.
to testing with wilhelm:
unfortunately, it seems to be a 'feature' of wilhelm that if the subset is not present then, regardless of accuracy of generation, the file will be called flawed if the random test position happens to jump to a subset table that isn't there.
to my methods of testing:
point 1, yes.
point 2, couldn't say for certain, but the file was the correct size when datacomp decompressed it. to make sure, i generated it myself and removed the old, threw away the dvd after making a list of files and copying the rest and the new generation to another dvd, so i don't even have the file myself anymore, and i bought it! :>
point 3, tbgen picked it up in test mode with the complete subset present.
my first mate in 83 came yesterday. poor blargh. thx nalimov. it is game 3 of this pgn.
- Attachments
-
- game.pgn
- (16.21 KiB) Downloaded 347 times
those who can, do
those who can't, teach
-
- Posts: 11
- Joined: Sun Feb 19, 2006 6:11 pm
- Sign-up code: 0
gambit3 wrote:as far as i can tell, the krppkr.2 file released flawed by chessbase has been replaced. it was physically the same size as the correct one, but had a different date stamp (how you could tell by the file on the dvd). it passed the datacomp test, suggesting incorrect generation.
(I have snipped your other speculations.)
It took me some time to organize the 'old' Endspielturbo of chessbase. As I expected, datacomp shows an error in krppkr.nbw.2.emd on the chessbase dvd. I don't know why you claim it passed the datacomp test - did you test it yourself? Anyway, your claim is wrong and there is no basis to speculate about an incorrect generation.
Rafael B. Andrist
-
- Posts: 11
- Joined: Sun Feb 19, 2006 6:11 pm
- Sign-up code: 0
gambit3 wrote:to testing with wilhelm:
unfortunately, it seems to be a 'feature' of wilhelm that if the subset is not present then, regardless of accuracy of generation, the file will be called flawed if the random test position happens to jump to a subset table that isn't there.
What are you talking about?? To clarify this:
The integrity check of Wilhelm is of course independent of subsets.
Rafael B. Andrist
Rafael B. Andrist wrote:What are you talking about?? To clarify this:
The integrity check of Wilhelm is of course independent of subsets.
if you don't have a complete subset of [tb], wilhelm will still check its random position, and if that position references another table that is not present, will give an incorrect result. wilhelm will then report the table as flawed. this is not speculation and non-negotiable. wilhelm only gives correct reports with complete subsets available. try it yourself with kpkp and no subset. see how far you get. (matters neither which kbkb, nor presence of ecc/checksum code.) DO try this at home, folks!
apparently, you have no idea what the original endgame turbo CD set was. pre-dvd, it was 11 cds. next! the dvd set was already a revision.
gambit3 wrote:
as far as i can tell , the krppkr.2 file released flawed by chessbase has been replaced. it was pyhsically the same size as the correct one, but had a different date stamp (how you could tell by file on dvd). it passed the datacomp tes, suggesting incorrect generation.
(I have snipped your other speculations.)
It took me some time to organize the 'old' Endspielturbo of chessbase. As I expected, datacomp shows an error in krppkr.nbw.2.emd on the chessbase dvd.
those who can, do
those who can't, teach
-
- Posts: 11
- Joined: Sun Feb 19, 2006 6:11 pm
- Sign-up code: 0
You seem not to be able to argue rationally or to explain what you mean, and continue with your baseless claims. I am not going to waste my time with this any longer. The others can draw their own conclusions.
Rafael B. Andrist
-
- Posts: 489
- Joined: Sat Jan 21, 2006 10:43 am
- Sign-up code: 10159
- Location: Reading, UK
- Contact:
krppkr.2.nbw ... in summary
I would like to thank Rafael for having the original 'Chessbase' krppkr.2.nbw checked out with datacomp for me.
While some here have been kind enough to refer to me as an "EGT expert", the plaudit is somewhat misplaced today. It's usually me turning to Rafael, Marc B, Eiko Bleicher or John Tamplin for clarification.
Rafael confirms, as I'd always thought, that the corrupt Chessbase krppkr.2.nbw does NOT pass the datacomp test.
So either gambit3 was wrong in asserting that it had passed that test, or I misunderstood what gambit3 was saying.
A comparison of the corrupt and correct krppkr.2.nbw files might indicate what happened to corrupt the file. In the Checkers domain, an incorrect CD-burn was at fault, corrupting some 100 bits of the file. It can happen, which is why the MD5sum/datacomp checks (in that order) are recommended.
My thanks again to Rafael - g
Rafael B. Andrist wrote:You seem not to be able to argue rationally or to explain what you mean, and continue with your baseless claims. I am not going to waste my time with this any longer.
why did you waste MY time this long? yes, you authored wilhelm. so? i am telling you what behaviour i have seen. oh, and perhaps there was a bad burn responsible for the dvd file being shorter, because the cd file could no longer be accessed. i don't know and don't really care enough to find out, because i no longer have the file in question, nor do i need or want it. in any case, it passed through datacomp when it came off the cd yet was reported by wilhelm. go figure.
that all aside, i do owe an apology. the dvd set, which i borrowed from my brother again, does indeed have the incorrect filesize. in that i was wrong. however, and this is perhaps what i remembered when i started this all: the original cd set, which my father owns, has a file that has the correct filesize and is identical to the dvd file up to the point at which the dvd file terminates.
again, my apologies for the misinformation regarding the incorrect krppkr file, and the flurry of ruffled feathers that resulted.
lastly, please don't ever imply i don't argue logically. i am nothing if not logical. i just don't have every piece of information at hand all the time, and my memory is also not perfect. that said, personal attacks, however disguised, imply a lack of logic, hence my absolute loss of respect now. i won't bother with this topic further.
those who can, do
those who can't, teach
-
- Posts: 489
- Joined: Sat Jan 21, 2006 10:43 am
- Sign-up code: 10159
- Location: Reading, UK
- Contact:
Rogue kqqnkn.nbb.emd file
A search shows 1 complete source of one kqqnkn.nbb.emd file - and 6 incomplete sources of another.
Any views as to which is correct? g
- Kirill Kryukov
- Site Admin
- Posts: 7399
- Joined: Sun Dec 18, 2005 9:58 am
- Sign-up code: 0
- Location: Mishima, Japan
- Contact:
Re: Rogue kqqnkn.nbb.emd file
The one which is linked from the project page:guyhaw wrote:A search shows 1 complete source of one kqqnkn.nbb.emd file - and 6 incomplete sources of another.
Any views as to which is correct? g
Code: Select all
ed2k://|file|kqqnkn.nbb.emd|606051929|709BE3FFA772B82B3B58AABBCCE07700|/
ed2k://|file|kqqnkn.nbw.emd|130675171|DB505E98D0F71A6AB36EA2448F0E1825|/
KCEC | EGTB Online | 3x3 Chess | 3x4 Chess | 4x4 Chess | Longest Checkmates | EGTB Test Suite | Opening Sampler | EGTB Bounty | NULP
-
- Posts: 489
- Joined: Sat Jan 21, 2006 10:43 am
- Sign-up code: 10159
- Location: Reading, UK
- Contact:
kqqnkn.nbb.emd - rogue file
Thanks: I should have thought to link back to the eDonkey checksums on your project page.
In fact, I'm now seeing 5 sources of the ok file (3 @ 100%) and only 1 of the rogue file.
'AL AZMI' is the source of the rogue file - g
Wrong file ID
It happened to me to compare the file list that I have to another user's list. Most of the common files were in "shared" status (red), but there was one that was not. We had the same file, same size, but a different file ID.
So I checked the md5sum of my file and it was ok. I sent a note to the other user to check that file, but that was ok too. Heh. Two identical files with different file IDs!
So I "rehashed" my file and now my file ID changed to the correct one. That was a little bit confusing, and I started to wonder if there are more of the same kind of "wrong IDs".
My wrong file ID might have been generated when my eMule once jammed and I had to quit it with the Windows task manager. Maybe eMule was hashing that file then.
Should there be a list of suspect files that should be checked? At least I would like to know if any of my files or file IDs is not ok.
Regards,
TapaniS
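The behaviour TapaniS describes makes sense once the file-ID scheme is spelled out: eMule hashes the file in fixed-size parts and then hashes the part-digests, so the ID is a pure function of the file's bytes. A sketch of that scheme follows. NOTE: real ed2k uses MD4, which is often missing from modern hashlib builds; MD5 stands in here, so this is an illustration of the chunking scheme, not a real ed2k hash:

```python
import hashlib

CHUNK = 9728000  # eDonkey part size in bytes

def chunked_id(path):
    """Illustration of eMule's chunk-wise file ID: hash each 9,728,000-byte
    part, then hash the concatenated part-digests if there is more than one.
    MD5 stands in for the real scheme's MD4 (see the note above)."""
    parts = []
    with open(path, "rb") as f:
        while chunk := f.read(CHUNK):
            parts.append(hashlib.md5(chunk).digest())
    if len(parts) == 1:
        return parts[0].hex()
    return hashlib.md5(b"".join(parts)).hexdigest()
```

Since the ID depends only on the bytes, two byte-identical files must get the same ID. A differing ID with matching MD5sums therefore points at the client's cached hash, not the file, which is exactly what rehashing repairs.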
-
- Posts: 489
- Joined: Sat Jan 21, 2006 10:43 am
- Sign-up code: 10159
- Location: Reading, UK
- Contact:
EGT Reference Checksum Data
Seems like there can be 'issues' around eDonkey signatures generated for files. I don't understand why though.
KK keeps a list of MD5sum checksums which are believed to be aok.
I have proposed reference 'Data Assurance Certificates' (DACs) which would give:
- MD5sum, eDonkey checksum and datacomp "aok" for each file
thus showing that the MD5sums etc are manifestly correct.
The DAC can then be easily used to MD5sum-verify your EGT-holding.
I have such a DAC for 3-to-5-man, and I believe others are close to being able to produce the DACs for 3-3, 4-2 and 4-2p EGTs.
g
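A DAC of the proposed shape could be emitted by a small script. A sketch: the MD5 column is computed, while the datacomp verdict is left as a placeholder (that tool has to be run separately) and the eDonkey checksum is omitted for the same reason:

```python
import hashlib
import os

def dac_line(path):
    """One 'Data Assurance Certificate' row for an EGT file: name, size,
    MD5sum, and a placeholder for the datacomp verdict ('pending' marks
    that datacomp must still be run on the file separately)."""
    h = hashlib.md5()
    with open(path, "rb") as f:
        while chunk := f.read(1 << 20):
            h.update(chunk)
    return (f"{os.path.basename(path)}\t{os.path.getsize(path)}\t"
            f"{h.hexdigest()}\tdatacomp:pending")
```

One such line per file, written while the file is known-good, gives exactly the reference against which anyone can later MD5sum-verify their holding.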