Entropy vs predictability in PRNGs
If entropy is the measure of surprise, and a given PRNG has a uniform output distribution, then the entropy would be high. So the Mersenne Twister (MT) has high entropy.

But the MT is also predictable: from its output you can recover its past bits and predict its future bits.

So is it fair to say that entropy does not imply predictability? And that predictability is something else entirely?

Tags: random-number-generator, entropy
Ella Rose♦ (51 mins ago): If it's predictable, how can it be surprising?
asked 4 hours ago by Bastien
2 Answers
The entropy of a random string is the number of bits you need to describe the full string; there is no computational aspect to it.

But if MT is predictable, that means the full stream can be recovered from "few" bits, so it cannot have high entropy.

In fact, an efficient PRNG cannot have very "high" entropy (because the seed/secret key should not be too long), but that does not mean there is an efficient attack to recover the full string.

answered 3 hours ago by Ievgeni (new contributor)
Ella Rose♦ (49 mins ago): "The entropy of a random string is the number of bits you need to describe the full string" appears to be the definition of Kolmogorov complexity, rather than entropy.
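The answer's point that a PRNG's entropy is bounded by its seed length can be sketched with a toy example (an 8-bit-seed generator, hypothetical and only for illustration): however long the output stream is, there can be no more distinct streams than there are seeds.

```python
import random

def toy_stream(seed: int, n_bytes: int = 32) -> bytes:
    # Toy PRNG for illustration: deterministically expand an 8-bit seed
    # into n_bytes of output.
    rng = random.Random(seed)
    return bytes(rng.getrandbits(8) for _ in range(n_bytes))

# Enumerate every possible seed: at most 2^8 = 256 distinct streams exist,
# so the Shannon entropy of the output (over a uniform random seed) is at
# most log2(256) = 8 bits, even though each stream is 256 bits long.
streams = {toy_stream(s) for s in range(2 ** 8)}
print(len(streams))
```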
No, entropy can imply predictability, and sometimes the two can be used interchangeably, although not generally in cryptography.

If you reduce the notion of entropy to that of simple information, it becomes clearer. The paradigm is entropy = information, and that's the standard computer-science definition. Consider the Wikipedia entry:

Information entropy is the average rate at which information is produced by a stochastic source of data.

The sometimes confusing issue, though, is who is surprised, at what point, and what the source is. I now think of it as known entropy (high predictability) and unknown entropy (low predictability). The a priori / a posteriori experience is crucial. Three examples of sources:

- Shakespeare. He produced about 8 Mbits of information in his complete works, which accrued at approximately 1.5 bits per character. So H(Shakespeare) = 1.5 bits/byte, yet we know all his work. Simultaneously, we can therefore say that H(Shakespeare) = 0 if we were to use it for cryptography. So no surprisal.

- AES-CTR. A simple AES PRNG with a 128-bit key can produce $2^{128}$ unique streams of data. You would say that H(AES-PRNG) = 128 bits for any length of stream, yet to store that stream (naively, without the key) you would budget H(AES-PRNG) = 8 bits/byte.

- True random number generator. There may be no key of any sort, and the source is a physical device. Surprise is maximal for anyone reading the output, with H(TRNG) = 8 bits/byte. Everyone is therefore surprised at all points and there is no known entropy. It's all unpredictable.

As with Justice Potter Stewart's test for pornography, you'll know entropy when you see it. Unless you're designing TRNGs, it's common in cryptography to just assume that entropy is the key/seed, or the device that created the key/seed, and that it's a complete surprise to an attacker.

answered 49 mins ago by Paul Uszak
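The bits/byte figures above can be checked empirically with an order-0 Shannon-entropy estimate over single-byte frequencies. A sketch (note this overstates English's true entropy: the ~1.5 bits/character figure comes from modelling longer-range structure, which a byte histogram ignores):

```python
import os
from collections import Counter
from math import log2

def entropy_bits_per_byte(data: bytes) -> float:
    # Order-0 Shannon entropy of the empirical byte distribution,
    # in bits per byte: -sum(p * log2(p)) over observed byte values.
    n = len(data)
    return -sum(c / n * log2(c / n) for c in Counter(data).values())

english = b"to be or not to be that is the question " * 200
print(round(entropy_bits_per_byte(english), 2))           # well below 8 bits/byte
print(round(entropy_bits_per_byte(os.urandom(8000)), 2))  # close to 8 bits/byte
```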
Bastien (28 mins ago): So would things like MT-based PRNGs be categorized as known entropy and CSPRNGs be categorized as unknown entropy?

Paul Uszak (20 mins ago): @Bastien No, that's not what I meant. They would both fall (with a proviso) into category 2 when considering the surprisal for an attacker. Their entropies would be equal to their seeds'. The proviso: MT mustn't be used for crypto, as its output entropy linearly drops to 0 after 624 outputs.

Bastien (3 mins ago): Ok, I think I understand. Entropy is high for both, but MT-based PRNGs lose entropy over time due to lack of forward secrecy and prediction resistance, whereas a CSPRNG's entropy remains high because it does have those two properties. Do I have the right general idea?
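The proviso about 624 outputs can be made concrete. A sketch using Python's `random` module, which implements MT19937: the output tempering is an invertible bijection, so 624 consecutive 32-bit outputs reveal the entire internal state, after which every future output is predictable. (The `undo_*` helpers here are illustrative names, not library functions.)

```python
import random

def undo_right_xor(y: int, shift: int) -> int:
    # Invert y ^= y >> shift by fixed-point iteration: each pass fixes
    # the next `shift` bits from the top down.
    x = y
    for _ in range(32 // shift + 1):
        x = y ^ (x >> shift)
    return x

def undo_left_xor(y: int, shift: int, mask: int) -> int:
    # Invert y ^= (y << shift) & mask, fixing bits from the bottom up.
    x = y
    for _ in range(32 // shift + 1):
        x = y ^ ((x << shift) & mask)
    return x

def untemper(y: int) -> int:
    # Invert MT19937's output tempering, recovering one state word.
    y = undo_right_xor(y, 18)
    y = undo_left_xor(y, 15, 0xEFC60000)
    y = undo_left_xor(y, 7, 0x9D2C5680)
    return undo_right_xor(y, 11) & 0xFFFFFFFF

# Observe 624 outputs, rebuild the internal state, and clone the generator.
victim = random.Random(2024)
observed = [victim.getrandbits(32) for _ in range(624)]
clone = random.Random()
clone.setstate((3, tuple(untemper(o) for o in observed) + (624,), None))

predicted = [clone.getrandbits(32) for _ in range(5)]
actual = [victim.getrandbits(32) for _ in range(5)]
print(predicted == actual)  # the clone tracks the victim exactly
```

This is why, from an attacker's point of view, MT's output entropy drops to zero after 624 observed words, while a CSPRNG's stays at the seed's entropy.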