In an era where digital security is paramount, Shamir's Secret Sharing (SSS) stands out as a powerful technique for protecting sensitive data. This cryptographic method, developed by Adi Shamir, ensures that a secret is not merely hidden but split and shared securely.
In this article, we delve into the mechanics of SSS, exploring its foundations in polynomial mathematics and its practical applications in securing data, from corporate secrets to personal credentials. In our case, we will protect the credentials of a user account, namely the mnemonic phrase.
This article series will be divided into several digestible parts. Ultimately, we will build one service for working with SSS, composed of the following microservices:
Auth: exchanges a JWT token for an internal identifier
Generator: gives the user the addresses where they can get their parts of the key
Shares: responsible for storing shares
Metadata: responsible for storing other information necessary for the service to work
Disclaimer: While creating this service, I was heavily inspired by these guys: https://web3auth.io/safeauth.html . For production development, I would recommend them first and suggest using a self-written service only if a security audit has analyzed every step you take. Our company uses a self-written solution for several reasons; one is that it is better to have your own solution in today's uncertain conditions. We were audited by the Hallward company, and I hope we will soon be able to open-source our implementation.
In this first part (part 0), I want to introduce SSS at a basic level and show the core formulas for working with polynomial mathematics, using JavaScript as usual.
Part 1 will cover the auth and generator microservices. We will exchange the user's JWT token for our own token, check the credentials required to use our service, and choose where the user's shares will be stored.
Part 2. In this chapter, I'm going to dive into the storage part of our service:
First and most important: safely storing the user's share in the blockchain.
Second: storing the rest of the user's data in the metadata microservice, which is also backed by the blockchain.
Shamir's Secret Sharing (SSS), developed by Adi Shamir in 1979, is a cryptographic method that divides a secret into multiple parts, known as "shares." The fundamental idea behind SSS is that the secret can only be reconstructed when a sufficient number of these shares are combined. Here's a breakdown of how it works: the secret becomes the free term of a random polynomial of degree t - 1, each share is a point on that polynomial, and any t points are enough to recover the polynomial (and therefore the secret) via Lagrange interpolation.
Key properties of SSS:
Threshold: any t of the n distributed shares reconstruct the secret.
Secrecy: any set of fewer than t shares reveals nothing about the secret.
Flexibility: shares can be handed to different people, devices, or storage locations.
SSS is beneficial when security and reliability are paramount, such as securing cryptographic keys, safeguarding access to critical digital assets, or any other case you can imagine yourself.
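To make the threshold idea concrete, here is a tiny, purely illustrative 2-of-3 split over a small prime field. The prime 2089 and all the helper names below are mine, chosen only for readability; the real implementation later in this article works over the secp256k1 group order with bn.js.

// Toy 2-of-3 Shamir split over the prime field GF(2089) -- illustration only.
const P = 2089;                        // small prime, for demonstration
const mod = (n: number) => ((n % P) + P) % P;

// The secret is the free term a0; one random coefficient gives degree 1 (threshold 2).
const a0 = 1234;                       // the "secret"
const a1 = 166;                        // random coefficient
const f = (x: number) => mod(a0 + a1 * x);

// Three shares: the points (1, f(1)), (2, f(2)), (3, f(3)).
const shares = [1, 2, 3].map(x => [x, f(x)] as const);

// Any two shares reconstruct a0 by Lagrange interpolation at x = 0.
// For two points (x1, y1), (x2, y2):  a0 = y1 * x2/(x2 - x1) + y2 * x1/(x1 - x2)  (mod P)
const modInv = (a: number) => {
  // brute-force modular inverse, fine for a toy prime
  for (let i = 1; i < P; i += 1) if (mod(a * i) === 1) return i;
  throw new Error('no inverse');
};
const restore = (
  [x1, y1]: readonly [number, number],
  [x2, y2]: readonly [number, number],
) => mod(y1 * mod(x2 * modInv(mod(x2 - x1))) + y2 * mod(x1 * modInv(mod(x1 - x2))));

console.log(restore(shares[0], shares[1])); // 1234
console.log(restore(shares[1], shares[2])); // 1234

A single share, on the other hand, is consistent with every possible secret, which is exactly the secrecy property described above.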
About the numbers in a scheme: we can use different threshold values, but keep in mind that the more parts of the key we create, the more likely it is that an attacker will obtain the required number of parts and assemble the original key.
In our service, we will use the following schemas for storing data:
On the user's side, we will use a 3-2 scheme. This means that the private key is divided into 3 different parts, each with its own index, and to restore the original private key we need at least 2 of them.
For further convenience, I suggest using the following names for our 3 parts: the "user share", the "cloud share", and the "social share".
As mentioned, the user needs only 2 of the 3 shares to work with the private key. For example, if a user has the "user share" and the "cloud share", they don't need to fetch the "social share" at all; if they are restoring on a new device or browser, they need the "social share" and the "cloud share".
On the backend, for the "social share" we will use a 5-3 scheme. This means we divide the social share into 5 parts, and to restore it we need only 3 of them. All of them will be saved in different storage locations.
As has been said earlier, our service is built from 4 microservices: Auth, Generator, Shares, and Metadata.
The Shares service can be hosted by different participants of our service. In a perfect world, it should have different providers for deploying to different storage backends, such as PostgreSQL, an Ethereum smart contract, and others (in our solution, we will use only smart contracts), but no one is limited in choosing their own storage.
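To illustrate that idea, here is a minimal sketch of what such a pluggable provider could look like; the interface and all names are hypothetical, not part of our actual codebase.

// Hypothetical sketch of a pluggable share-storage provider for the Shares service.
interface StoredShare {
  userId: string;       // internal identifier from the Auth service
  shareIndex: string;   // hex index (x coordinate) of the share
  share: string;        // hex value (y coordinate) of the share
}

interface ShareStorageProvider {
  save(share: StoredShare): Promise<void>;
  load(userId: string): Promise<StoredShare | null>;
}

// One participant could back this with PostgreSQL, another with an Ethereum
// smart contract; the Shares service itself only talks to the interface.
class InMemoryShareStorage implements ShareStorageProvider {
  private store = new Map<string, StoredShare>();
  async save(share: StoredShare): Promise<void> {
    this.store.set(share.userId, share);
  }
  async load(userId: string): Promise<StoredShare | null> {
    return this.store.get(userId) ?? null;
  }
}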
First, users need a JWT token; for example, it can be obtained from the Google or Apple auth services.
After we get the JWT, we exchange it for an internal JWT token.
Why so complicated? Good question. One of this service's central ideas is decentralization, which applies both to the blockchain and to the service for storing private key shares. However, we are not going to share the user's credentials with external services, and for this reason, we replace the real identifiers with internal ones. We also check the credentials so that only the allowed OAuth providers can be used.
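A minimal sketch of that exchange might look like the following; the allow-list, the helper, and all names are assumptions for illustration, not our actual API.

// Hypothetical sketch of the Auth service exchanging an external OAuth JWT
// for an internal one. Verifying the external token's signature (e.g. against
// Google/Apple JWKS) is omitted for brevity.
import jwt from 'jsonwebtoken';

const ALLOWED_ISSUERS = ['https://accounts.google.com', 'https://appleid.apple.com'];
const INTERNAL_SECRET = process.env.INTERNAL_JWT_SECRET as string;

// Assumed helper: maps (issuer, sub) to an internal identifier.
declare function lookupOrCreateInternalId(issuer: string, sub: string): string;

function exchangeToken(externalToken: string): string {
  const payload = jwt.decode(externalToken) as { iss?: string; sub?: string } | null;
  if (!payload?.iss || !payload.sub || !ALLOWED_ISSUERS.includes(payload.iss)) {
    throw new Error('OAuth provider is not allowed');
  }
  // Swap the external identifier for an internal one so the user's real
  // credentials are never exposed to the rest of the system.
  const internalId = lookupOrCreateInternalId(payload.iss, payload.sub);
  return jwt.sign({ sub: internalId }, INTERNAL_SECRET, { expiresIn: '1h' });
}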
Now we check whether there is already a saved key; if not, we suggest creating one. After creation, we divide one of the parts into five according to the same principle, distribute them between the services, and save the share index; we save another part locally; and we offer the third part to the user to keep somewhere (there are a lot of options for how this can be done).
If we see that the user has already saved their private key, we go to the Shares service and get the list of services where the user's shares and their indexes are stored.
We fetch those shares. After Lagrange interpolation, we have reconstructed the first part of the key (the social share), but we don't know its index.
Here the Metadata service comes to our aid: based on the share we just reconstructed, it returns the index of that share.
The next step is to find the locally stored part or ask the user to provide it.
Once we have the required number of key parts, we restore the original private key.
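Putting the restore flow together, a rough sketch might look like this. It uses the Polynomial class and Share type shown in the code below; the service clients (sharesService, metadataService) and their methods are hypothetical placeholders for the real microservice calls.

// Sketch of the restore flow, assuming the Polynomial class defined below
// and hypothetical clients for the Shares and Metadata microservices.
import BN from 'bn.js';

declare const sharesService: {
  // returns at least 3 of the 5 backend parts of the social share
  getBackendShares(userId: string): Promise<Share[]>;
};
declare const metadataService: {
  // returns the index (x coordinate) the social share has in the user's 2-of-3 scheme
  getShareIndex(userId: string, share: string): Promise<string>;
};

async function restorePrivateKey(userId: string, localShare: Share): Promise<BN> {
  // 1. Reconstruct the social share from any 3 of its 5 backend parts.
  const backendShares = await sharesService.getBackendShares(userId);
  const socialValue = Polynomial.fromShares(backendShares).getPrivateKey();

  // 2. Ask the Metadata service which index this share had in the user's scheme.
  const socialIndex = await metadataService.getShareIndex(
    userId,
    socialValue.toString('hex'),
  );
  const socialShare: Share = {
    share: socialValue.toString('hex').padStart(64, '0'),
    shareIndex: socialIndex,
    polynomialID: localShare.polynomialID,
  };

  // 3. Combine it with the locally stored (or user-provided) share.
  return Polynomial.fromShares([socialShare, localShare]).getPrivateKey();
}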
Initially, I did not plan to include code in this part of the article, but I decided that the essential functions for Lagrange interpolation would fit nicely here, since it underlies the entire project.
import BN from 'bn.js';
import { randomBytes } from 'crypto';

// Assumptions about the surrounding project (not shown in the article):
// - keccak256 is a project helper that hashes its string arguments into an id
// - curveN is the group order we reduce by; the secp256k1 order is used as an example
declare function keccak256(...inputs: string[]): string;
const curveN = new BN(
  'fffffffffffffffffffffffffffffffebaaedce6af48a03bbfd25e8cd0364141',
  16,
);

// Types inferred from how they are used below.
interface PolynomialPoint {
  x: BN;
  y: BN;
}

interface Share {
  share: string;
  shareIndex: string;
  polynomialID: string;
}

export class Polynomial {
  // Polynomial coefficients: shares[0] is the secret (the free term),
  // shares[1..threshold-1] are random coefficients.
  shares: BN[];
  polynomialId: string;

  // Builds a random polynomial of degree threshold - 1 whose free term is the
  // private key; a random 32-byte key is generated if none is provided.
  static initialize(
    privateKey: BN | Buffer | null,
    threshold: number,
  ) {
    const pk = privateKey ? privateKey : randomBytes(32);
    const shares = [new BN(pk)];
    for (let i = 1; i < threshold; i += 1) {
      const share = randomBytes(32);
      shares.push(new BN(share));
    }
    const polynomialId = keccak256(...shares.map(bn => bn.toString()));
    return new Polynomial(shares, polynomialId);
  }
  // Reconstructs the full polynomial (all of its coefficients) from a set of
  // shares using Lagrange interpolation over curveN.
  static fromShares(shares: Share[]) {
    const unsortedPoints = shares.map<PolynomialPoint>(s => ({
      x: new BN(s.shareIndex, 'hex'),
      y: new BN(s.share, 'hex'),
    }));
    const sortedPoints = pointSort(unsortedPoints);
    const polynomial = generateEmptyBNArray(sortedPoints.length);
    for (let i = 0; i < sortedPoints.length; i += 1) {
      // Coefficients of the i-th Lagrange basis polynomial, scaled by y_i and summed.
      const coefficients = interpolationPoly(i, sortedPoints);
      for (let k = 0; k < sortedPoints.length; k += 1) {
        let tmp = new BN(sortedPoints[i].y);
        tmp = tmp.mul(coefficients[k]);
        polynomial[k] = polynomial[k].add(tmp).umod(curveN);
      }
    }
    const polynomialId = keccak256(...polynomial.map(bn => bn.toString()));
    return new Polynomial(polynomial, polynomialId);
  }
  constructor(shares: BN[], polynomialId: string = '') {
    this.shares = shares;
    this.polynomialId = polynomialId;
  }

  // The free term of the polynomial is the secret itself.
  getPrivateKey(): BN {
    return this.shares[0];
  }
  // Evaluates the polynomial at the point x and returns that point as a share.
  getShare(x: string | BN): Share {
    const tmpX = new BN(x, 'hex');
    let xi = new BN(tmpX);
    let sum = new BN(this.shares[0]);
    for (let i = 1; i < this.shares.length; i += 1) {
      sum = sum.add(xi.mul(this.shares[i]));
      xi = xi.mul(tmpX);
    }
    return {
      share: sum.umod(curveN).toString('hex').padStart(64, '0'),
      shareIndex: tmpX.toString('hex'),
      polynomialID: this.polynomialId,
    };
  }
}
const pointSort = (innerPoints: PolynomialPoint[]): PolynomialPoint[] => {
  const pointArrClone = [...innerPoints];
  pointArrClone.sort((a, b) => a.x.cmp(b.x));
  return pointArrClone;
};

const generateEmptyBNArray = (length: number): BN[] =>
  Array.from({length}, () => new BN(0));
// Denominator of the i-th Lagrange basis polynomial:
// the product of (x_i - x_j) for all j != i.
const denominator = (i: number, innerPoints: PolynomialPoint[]) => {
  let result = new BN(1);
  const xi = innerPoints[i].x;
  for (let j = innerPoints.length - 1; j >= 0; j -= 1) {
    if (i !== j) {
      let tmp = new BN(xi);
      tmp = tmp.sub(innerPoints[j].x).umod(curveN);
      result = result.mul(tmp).umod(curveN);
    }
  }
  return result;
};
// Coefficients of the i-th Lagrange basis polynomial
// L_i(x) = prod_{k != i} (x - x_k) / (x_i - x_k), built up one factor at a time.
const interpolationPoly = (i: number, innerPoints: PolynomialPoint[]): BN[] => {
  let coefficients = generateEmptyBNArray(innerPoints.length);
  const d = denominator(i, innerPoints);
  if (d.cmp(new BN(0)) === 0) {
    throw new Error('Denominator for interpolationPoly is 0');
  }
  // Start with the constant 1 / prod(x_i - x_k), then multiply by each (x - x_k).
  coefficients[0] = d.invm(curveN);
  for (let k = 0; k < innerPoints.length; k += 1) {
    const newCoefficients = generateEmptyBNArray(innerPoints.length);
    if (k !== i) {
      // j starts at the highest non-zero coefficient accumulated so far.
      let j: number;
      if (k < i) {
        j = k + 1;
      } else {
        j = k;
      }
      j -= 1;
      for (; j >= 0; j -= 1) {
        // Multiply the accumulated polynomial by (x - x_k).
        newCoefficients[j + 1] = newCoefficients[j + 1]
          .add(coefficients[j])
          .umod(curveN);
        let tmp = new BN(innerPoints[k].x);
        tmp = tmp.mul(coefficients[j]).umod(curveN);
        newCoefficients[j] = newCoefficients[j].sub(tmp).umod(curveN);
      }
      coefficients = newCoefficients;
    }
  }
  return coefficients;
};
The main class is Polynomial, which has 2 static methods:
initialize(privateKey, threshold): creates a new polynomial with the specified threshold number of shares. If privateKey is not provided, it generates a random one. Each share is a random 32-byte number, and the polynomialId is computed as the keccak256 hash of the shares.
fromShares(shares): constructs a polynomial from the given shares. It sorts these shares, performs polynomial interpolation, and computes a polynomialId.
In the current scheme, polynomialId is used to create a unique identifier for our shares.
And 2 instance methods:
getPrivateKey(): returns the first share, which is the private key (the free term of the polynomial).
getShare(x): computes and returns a specific share of the polynomial for a given input x.
Helper functions:
pointSort(innerPoints): sorts an array of PolynomialPoint objects based on their x values.
generateEmptyBNArray(length): generates an array of BN objects initialized to zero, of the specified length.
denominator(i, innerPoints): helper for calculating the denominator in the Lagrange polynomial interpolation.
interpolationPoly(i, innerPoints): computes the coefficients of the Lagrange interpolation polynomial for a given point.
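To see the class in action, here is a small usage sketch that walks through the 2-of-3 user scheme and the 3-of-5 backend scheme described above; the variable names are mine, and it assumes the imports and types from the listing.

// Round trip: split a key 2-of-3, split the social share 3-of-5, then restore.
const userPoly = Polynomial.initialize(null, 2);      // random private key, threshold 2
const privateKey = userPoly.getPrivateKey();

const userShare   = userPoly.getShare(new BN(1));     // kept by the user
const cloudShare  = userPoly.getShare(new BN(2));     // stored locally / in the cloud
const socialShare = userPoly.getShare(new BN(3));     // handed to the backend

// The backend splits the social share again: 5 parts, threshold 3.
const socialPoly = Polynomial.initialize(new BN(socialShare.share, 'hex'), 3);
const backendShares = [1, 2, 3, 4, 5].map(i => socialPoly.getShare(new BN(i)));

// Any 3 backend parts are enough to rebuild the social share's value...
const restoredSocialValue = Polynomial.fromShares(backendShares.slice(0, 3)).getPrivateKey();
console.log(restoredSocialValue.eq(new BN(socialShare.share, 'hex'))); // true

// ...and the social share plus any other user-side share rebuilds the private key.
const restoredKey = Polynomial.fromShares([socialShare, cloudShare]).getPrivateKey();
console.log(restoredKey.eq(privateKey.umod(curveN)));                  // true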
I understand that so far it all looks rather condensed and chaotic, but it's at least a start =)
For a better understanding of the theory, I recommend taking a look at Wikipedia (or another favorite source of information).
In this article, we took a high-level look at how the future service will operate and outlined the scope of work for the following articles (tentatively, there will be two more).
See you next time =)