Score:1

Recovering the nonce in ECDSA with known shared components


This is a slight variation of the shared-nonce problem, except that here there is a lot more shared information between the two nonces.

Given a random $k$:

$$ k_1 = ka, k_2 = kb $$

I now sign two messages, which gives me $s_1, r_1, s_2, r_2$.

Based on my understanding, the base equations for ECDSA signatures are (with generator $G$ and private key $d$):

$$ r = kG, \quad s = k^{-1}(h + rd) $$

So now I have two equations, which I can use to solve for $k$:

$$ \begin{align} s_1-s_2&=k_1^{-1}(h_1+r_1d)-k_2^{-1}(h_2+r_2d) \\ s_1-s_2&=k_1^{-1}h_1+k_1^{-1}r_1d-k_2^{-1}h_2-k_2^{-1}r_2d \\ s_1-s_2&=k^{-1}a^{-1}h_1 + k^{-1}a^{-1}r_1d - k^{-1}b^{-1}h_2 - k^{-1}b^{-1}r_2d &&\text{expand $k_1,k_2$}\\ s_1-s_2&=k^{-1}a^{-1}h_1 + k^{-1}a^{-1}(kaG)d - k^{-1}b^{-1}h_2 - k^{-1}b^{-1}(kbG)d &&\text{expand $r_1,r_2$}\\ s_1-s_2&=k^{-1}a^{-1}h_1 + k^{-1}ka^{-1}aGd - k^{-1}b^{-1}h_2 - k^{-1}kb^{-1}bGd \\ s_1-s_2&=k^{-1}a^{-1}h_1 + k^{-1}ka^{-1}aGd - k^{-1}kb^{-1}bGd -k^{-1}b^{-1}h_2 \\ s_1-s_2&=k^{-1}a^{-1}h_1 + Gd - Gd -k^{-1}b^{-1}h_2 \\ s_1-s_2&=k^{-1}a^{-1}h_1 -k^{-1}b^{-1}h_2 \\ k(s_1-s_2)&=a^{-1}h_1 - b^{-1}h_2 \\ k&=(a^{-1}h_1 - b^{-1}h_2)(s_1-s_2)^{-1} \end{align} $$

I tried implementing this in Python:

import ecdsa
import random
import libnum
import hashlib
import sys
from Crypto.Util.number import bytes_to_long

G = ecdsa.ecdsa.generator_256
order = G.order()
priv1 = random.randrange(1,order)
 
Public_key = ecdsa.ecdsa.Public_key(G, G * priv1)
x1 = ecdsa.ecdsa.Private_key(Public_key, priv1)
k = random.randrange(1, 2**127)
msg1="testmessage one"
msg2="testmessage two"
    
h1 = bytes_to_long(hashlib.sha1(msg1.encode()).digest())
h2 = bytes_to_long(hashlib.sha1(msg2.encode()).digest())
a=101
b=197


sig1 = x1.sign(h1, k*a)   # sign with nonce k1 = k*a
sig2 = x1.sign(h2, k*b)   # sign with nonce k2 = k*b

r1,s1 = sig1.r,sig1.s
r2,s2 = sig2.r,sig2.s

k_recovered = (((libnum.invmod(int(a),order)*(h1))%order-(libnum.invmod(int(b),order)*(h2)%order))*libnum.invmod( (s1-s2),order))%order

print ("\nk: \t\t",k)
print ("k recovered \t",k_recovered)

This gives me the wrong $k$:

k:       11380758029406828810642876408403002369
k recovered      92413802760778512715100399489368323379693045694765104020036290177818159224142

I've been staring at this way too long and could use a second pair of eyes. Am I missing something in the base math or did I mess up the implementation?

(PS: this is somewhat similar to the problems discussed here.)

kelalaka: Welcome to [cryptography.se]. What is the origin of this question? There is always such a relation between any two nonces as long as $k$ is from a multiplicative group.
kazamatzuri: @kelalaka I have the two plaintexts $m_1, m_2$ and know $a, b$, so I need to recover the original $k$. (The original is from a past CTF; I'm trying to learn and get better at them.)
kelalaka: If you know the messages, $a$, and $b$, then you are almost there: find the $h$s.
kazamatzuri: @kelalaka, I have the $h$s given as well. I'm asking what I am missing to determine the original underlying $k$ (see the last line in the equations I wrote).
Score:3
Lev

You are mixing up the group operation of the curve and field arithmetic operations. Multiplying the $x$-coordinate of a point by a scalar does not yield the $x$-coordinate of the point multiplied by that scalar. So in general it is not true that $$aG_x = (aG)_x,$$ where I denote the $x$-coordinate of a point $P$ as $P_x$. On the left-hand side we are multiplying two field elements (the $x$-coordinate and $a$); on the right-hand side we are doing scalar point multiplication.
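A quick numerical check of this, assuming the same ecdsa package as in the question (the names lhs/rhs are just illustrative):

import ecdsa

G = ecdsa.ecdsa.generator_256
p = ecdsa.ecdsa.curve_256.p()   # prime of the underlying field

a = 101
lhs = (a * G.x()) % p   # multiply the x-coordinate as a field element
rhs = (a * G).x()       # x-coordinate of the scalar multiple aG
print(lhs == rhs)       # False, except by astronomical coincidence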

So in your lines 3-5, where $r_i = (k_iG)_x$, it is not necessarily true that the terms simplify: $$dk_i^{-1}r_i = dk_i^{-1}(k_iG)_x \neq (dG)_x$$

Something else is needed here to obtain your final $k$.

EDIT: To solve this, since you know $s_1, s_2, r_1, r_2, h_1, h_2, a, b$, you need to solve the following system:

$$s_1 = k^{-1}a^{-1}(h_1 + r_1d)$$ $$s_2 = k^{-1}b^{-1}(h_2 + r_2d)$$
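Eliminating $k$ by equating the two resulting expressions for it gives, for example (with $n$ the group order),

$$ d = \big((s_1a)^{-1}h_1 - (s_2b)^{-1}h_2\big)\big((s_2b)^{-1}r_2 - (s_1a)^{-1}r_1\big)^{-1} \bmod n, $$

which is what the code below computes.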

EDIT 2: Here is some code

import ecdsa
import random
import libnum
import hashlib
import sys
import math
from Crypto.Util.number import bytes_to_long 

G = ecdsa.ecdsa.generator_256
order = G.order()
priv1 = random.randrange(1,order)
 
Public_key = ecdsa.ecdsa.Public_key(G, G * priv1)
x1 = ecdsa.ecdsa.Private_key(Public_key, priv1)
k = random.randrange(1, 2**127)
msg1="testmessage one"
msg2="testmessage two"
    
h1 = bytes_to_long(hashlib.sha1(msg1.encode()).digest())
h2 = bytes_to_long(hashlib.sha1(msg2.encode()).digest())
a=101
b=197


sig1 = x1.sign(h1, k*a)
sig2 = x1.sign(h2, k*b)

r1,s1 = sig1.r,sig1.s
r2,s2 = sig2.r,sig2.s

def inv(x):
    return libnum.invmod(x, order)

# Eliminate k between  s1 = (k*a)^-1 * (h1 + r1*d)  and  s2 = (k*b)^-1 * (h2 + r2*d),
# then solve for d (all arithmetic mod the group order)
d_recovered = (inv(s1*a)*h1 - inv(s2*b)*h2)*inv(inv(s2*b)*r2 - inv(s1*a)*r1) % order
print ("d recovered \t",d_recovered)


print ("\nd: \t\t",priv1)
Score:0

Your equations will be simpler to solve if you avoid having lots of inverses in them. Also, I find equations much more readable when inverses are written as a division.

So, these are the two equations you start with:

$$ s_1 a k = h_1 + r_1 d \\ s_2 b k = h_2 + r_2 d \\ $$

Multiply the first by $r_2$ and the second by $r_1$:

$$ r_2 s_1 a k = r_2 h_1 + r_2 r_1 d \\ r_1 s_2 b k = r_1 h_2 + r_1 r_2 d \\ $$

Then subtract:

$$ r_2 s_1 a k - r_1 s_2 b k = r_2 h_1 - r_1 h_2 + r_2 r_1 d - r_1 r_2 d $$

Collect terms and note that the $d$ term disappears.

$$ (r_2 s_1 a - r_1 s_2 b ) k = r_2 h_1 - r_1 h_2 $$

So: $$ k = {{h_1 r_2 - h_2 r_1} \over {s_1 a r_2 - s_2 b r_1}} $$

Similarly, you can derive this equation for $d$:

$$ d = {{ s_2 b h_1 - s_1 a h_2 } \over {s_1 a r_2 - s_2 b r_1}} $$
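Both formulas can be checked numerically against the question's snippet; a minimal sketch, reusing the variables s1, s2, r1, r2, h1, h2, a, b, k, priv1, and order defined there:

import libnum

def inv(x):
    return libnum.invmod(x % order, order)

# common denominator of the two closed-form solutions above
den = (s1 * a * r2 - s2 * b * r1) % order
k_recovered = (h1 * r2 - h2 * r1) * inv(den) % order
d_recovered = (s2 * b * h1 - s1 * a * h2) * inv(den) % order

print("k recovered:", k_recovered)   # matches the original k
print("d recovered:", d_recovered)   # matches priv1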
