This question is a bit unclear, but I will try to give some sort of answer.

First, at the moment no one has succeeded in proving primality for a 100 million (decimal) digit number. The current record is (I believe) close to 13 million digits (and in binary this would still not be 100 million).

Second, one does not do these tests by trial division, as your description seems to suggest.

Having said this, as a thought experiment, here is a very rough and overly optimistic calculation. Suppose you have a 100 million digit number. Then you would have to test whether it is divisible by numbers of size up to its square root, that is, by 50 million digit numbers. Suppose each division takes only one processor instruction.
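To make the thought experiment concrete, here is a minimal sketch of the order-of-magnitude arithmetic. The assumptions (one instruction per division, 10^9 instructions per second) are the deliberately optimistic ones from the text above; the exact rates are placeholders, not measured figures.

```python
# Back-of-envelope cost of trial division for a 100-million-digit number N.
# Assumptions (optimistic): one division = one instruction,
# and the machine executes 10^9 instructions per second.

digits = 100_000_000        # N is roughly 10^digits
sqrt_digits = digits // 2   # sqrt(N) is roughly 10^(digits/2), a 50-million-digit bound

# Trial division needs on the order of 10^sqrt_digits divisions.
# At 10^9 divisions per second, that is about 10^(sqrt_digits - 9) seconds.
seconds_exponent = sqrt_digits - 9
print(f"roughly 10^{seconds_exponent} seconds")  # prints "roughly 10^49999991 seconds"
```

For comparison, the age of the universe is only about 10^17 seconds, so even these absurdly generous assumptions leave trial division hopeless at this scale; this is why real primality proofs use entirely different methods.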