Scalable Smart Contracts for Linear Regression Algorithm
Syed Badruddoja, Ram Dantu, Yanyan He, Abiola Salau, and Kritagya Upadhyay
University of North Texas, Denton, Texas - 76207, USA
Abstract. Linear regression algorithms capture information from previous experiences and build a cognitive model to forecast the future. The information and the cognitive model that drive future predictions must be reliable so that the predicted results can be trusted. Furthermore, the algorithms must be explainable and traceable, making the learning process meaningful and trackable. Blockchain smart contracts boost information integrity, providing the trust and provenance of distributed ledger transactions that support such requirements. Smart contracts are traditionally developed to perform simple transactions with integer operations. However, developing learning algorithms such as linear regression with smart contracts demands complex computation involving floating-point operations, which smart contracts do not support. Moreover, smart contract transactions are expensive and time-consuming. In this work, we propose a novel implementation of smart contracts for linear regression algorithms with fraction-based computation that can train and predict on the Ethereum blockchain. Our smart contract-based training and prediction technique, written in the Solidity programming language, produced a mean square error similar to that of the scikit-learn-based prediction model. Moreover, our design strategy saves training costs for linear regression algorithms through off-chain computation with an optimistic roll-up solution. The off-chain training and on-chain prediction strategy demonstrated in our work will help academic and industry researchers develop cost-effective distributed AI applications in the future.
Keywords: Blockchain, dApp, Smart Contract, Artificial Intelligence, Multiple Linear Regression, Arbitrum, Ethereum
1 Introduction & Motivation
Fabricated Forecast: Artificial intelligence provides methods to make intelligent decisions for various applications [1]. The models are prepared with well-known algorithms proven to yield high accuracy across many modes of learning. However, one of the crucial problems in recent development involves trust in the data and the model [2]. Data poisoning attacks wreak havoc in applications demanding predictive intelligence, where the input data, the training model, and the output data can all be called into question [3]. Training on tampered data fabricates the learning model, and the manipulated model forecasts unreliable results. Moreover, the trained model and the prediction results are themselves targets of attack. For example, if the training of a linear regression model at a weather forecast center is flawed because of a tampered dataset, the forecast would produce a fake prediction. Therefore, a trusted machine learning model is mandatory to build confidence in the prediction system [5].
Explainable And Transparent AI: Machine learning models predominantly suffer from unclear training methods that make the learning process inexplicable [5]. For instance, in healthcare systems, assessing the severity of a disease (a regression problem) requires investigating multiple symptoms, which demands explainable features for a complete comprehension of the underlying illness [6]. However, AI applications do not explain the learning process. Moreover, the models also lack provenance to provide proof of learning [7]. Consequently, users shy away from these applications due to low trust and confidence in AI. AI applications also raise ethical concerns about models biased on race, gender, ethnicity, or any other feature present in the dataset. A biased model can predict the wrong regression value and create more discrimination among application users [15].
Smart Contracts Scalability: Blockchain addresses the trust, provenance, and explainability of AI applications [12] through its immutable distributed ledger and integrity features. It works as a confidence machine, producing consensus-based transactions that are secure and intact. However, smart contracts on blockchain suffer from programming and scalability issues. The Solidity programming language [24] of the Ethereum blockchain (one of the popular languages for developing DApps) is a static language that does not support floating-point computation. Therefore, cognitive smart contracts cannot produce accurate predictions. Moreover, the Ethereum blockchain also has scalability issues, including the transaction block limit, high computation cost, and delay in block finality [9]. For instance, training a linear regression model with iterative optimization [10] requires thousands of iterations and function updates to optimize a model. The transaction delay of the blockchain network makes such training unreasonable, and the high cost of computation makes it unaffordable.
2 Problem Definition
A tampered linear regression model falsifies predictions, dissuading users from trusting AI applications. Hence, a trustable model is mandatory for reliable forecasts. A trustworthy model requires untampered data, transparent learning, and explainable predictions. Blockchain smart contracts provide immutable, consensus-based, and tamper-proof transactions that can secure linear regression models. Moreover, the blockchain distributed ledger provides data provenance, which helps keep the system transparent. However, smart contract languages restrict such learning capabilities due to the lack of floating-point computation. Consequently, learning a multiple linear regression model on-chain is unreliable and does not produce the intended regression accuracy for prediction. Moreover, the scalability of blockchain smart contracts raises concerns about developing such models, as training tends to be very expensive and time-consuming.
3 Our Contribution
– Despite the limitations of the Solidity programming language, we have trained a linear regression model with an iterative optimization method in a smart contract. See Section 5.
– We have proposed a novel architecture that trains the linear regression model on a layer-two blockchain and predicts on a layer-one blockchain. See Section 5.
– We produced training accuracy on blockchain comparable to scikit-learn (Python machine learning library) based training. See Section 5.
– Our prediction results confirm that smart contracts can predict with high efficacy compared to scikit-learn prediction. See Table 1 and Figures 3 and 4.
– We have reduced the cost of training multiple linear regression on a blockchain network by a factor of 100 through a layer-two blockchain scalability solution. See Table 2.
4 Literature Review
Blockchain For AI: Blockchain provides enhanced data security for storing sensitive information in diskless environments [13]. Data on the blockchain is digitally signed, ensuring the security of data for AI and enhancing trust. In healthcare systems, for instance, blockchain helps AI with cryptographic security protocols that protect patient data and build a graph database of patient healthcare records [14]. Ethical concerns and the privacy of patient data are two of the main problems when an artificially intelligent application analyzes patient data [15]. Blockchain secures the privacy of patient data with private and public key combinations [16] that do not reveal the patient's identity. In addition, blockchain provides automation features that are missing in machine learning applications, eventually improving performance [17]. Such applications are used for fraud detection in financial transactions. Whenever data is handed to a private authority, it is at risk of exposure depending on the organization's interests. Blockchain helps machine learning applications build a privacy-preserving model for prediction [18]. However, protecting data for integrity and privacy does not guarantee trust in the training and prediction of machine learning models.
Blockchain Integrated AI Applications: DeepBrainChain [19] is one of the first frameworks in the industry to run artificial intelligence platforms with blockchain technology. The project reduces the cost of AI tasks with the help of distributed resources and shares the computing load through decentralization but fails to protect AI applications. CortexAI [20] is a decentralized AI platform that trains machine learning models offline and predicts online to incentivize the developers and providers of the service. However, the online prediction does not use smart contracts and hence lacks trust. Algorithmia [21] developed the Danku project, which allows anyone to post a dataset and an ML model for evaluation and incentivizes the model owners. When the model is trained outside the blockchain, the data and model become vulnerable to threats and may not be trusted, making the platform susceptible to poisoning attacks. Liu et al. [23] discuss the advantages of collaboration between ML and blockchain technology for network and communication systems. In their work, blockchain facilitates training data and model sharing for decentralized intelligence. ML applications can utilize blockchain in communication and networking systems to provide security, scalability, and privacy in intelligent smart contracts. Such promising integration requires the blockchain application to run machine learning algorithms for prediction or classification on distributed ledger platforms. However, this attempt to secure machine learning with blockchain remains largely unexplored.
Smart Contract Limitations: Solidity lacks floating-point arithmetic; it does not allow float division, signed exponents, or other float operations. Fixidity [25], ABDK [26], PRBMath [34], and DecimalMath [35] are libraries that try to provide fixed-point equivalents, but they increase the transaction cost of float operations and introduce integer overflow problems, which makes them unreasonable for a training model.
Fig. 1: Design of the smart contract-based blockchain application for training the multiple linear regression algorithm and optimizing learning parameters for prediction. A smart contract on a layer-two blockchain trains the data with the regression algorithm, and a smart contract on a layer-one blockchain predicts with the trained regression model. The input layer holds the labeled training data and unlabeled test data, the DApp layer holds the smart contracts, and the consensus layer holds the blockchain nodes that reach consensus on transactions, providing training integrity, trusted prediction, an explainable model, and affordable training cost.
5 Methodology
Design Approach: We developed a fraction-based numerical computation to train a linear regression model with smart contracts and thereby assure integrity, provenance, and trust. We train the model with an optimistic roll-up approach (a layer-two scalability method) and predict using the developed model on the layer-one blockchain. The iterative optimization of the multiple linear regression method uses a gradient descent-based learning approach that learns the parameters with a constant learning rate. A model trained with a smart contract on the blockchain network produces consensus-based transactions that are highly trusted. The blockchain network and a distributed file system together provide the capability to trace the provenance of the model. Moreover, this approach also provides explainability of the decisions made by the machine learning model, which can be updated with any required correction. Figure 1 shows a high-level design of our proposed work. The input layer consists of the dataset. The DApp layer consists of smart contracts deployed on the blockchain for training and prediction. The consensus layer computes transaction outputs with verified results.
Layer Two Blockchain for Training: Blockchain provides scalability solutions such as optimistic roll-ups, zero-knowledge roll-ups, sharding, and sidechains [8]. Sharding and sidechains are layer-one scalability solutions whose transaction delay is similar to that of the Ethereum network, although they provide cheaper transactions. A zero-knowledge roll-up is a layer-two scalability solution that produces complex cryptographic proofs, making the computation highly complex and the resulting AI inexplicable. On the other hand, an optimistic roll-up (also a layer-two scalability solution) assumes that the miners are honest and produces cheaper transactions with faster outputs, which is ideal for training machine learning algorithms. We chose an optimistic roll-up as the blockchain network for training the linear regression algorithm and compared its performance with the Ethereum test network.
Optimistic Roll-up for Off-chain Training: Optimistic roll-ups execute transactions in parallel with the Ethereum main chain. After all the transactions are complete, the final state change is stored on the main chain [8]. This increases transaction speed by 10 to 100 times. "Optimistic" refers to storing only the bare minimum of information without proof, on the assumption that no fraud is committed; a proof is provided only when fraud occurs. Optimism and Arbitrum are two of the platforms that implement optimistic roll-ups as layer-two blockchain solutions.
Multiple Linear Regression: Multiple linear regression involves learning multiple parameters to form the equation of a line that best fits the model [10]. Equation 1 shows the prediction formula for linear regression, where we have to learn and optimize the weights $W_1, W_2, \ldots, W_n$ and the bias $c$. The learning of the parameters is performed through an iterative optimization method. For training purposes the predicted value is denoted $\hat{y}$ (y-hat), and $\hat{y}$ is computed repeatedly with the updated weights and biases. The implementation of multiple linear regression is detailed in [10]. The next section details how we implemented iterative optimization with smart contracts.

$$\hat{y} = W_1 x_1 + W_2 x_2 + W_3 x_3 + \dots + W_n x_n + c \tag{1}$$
Event Flow: Figure 2 shows the event flow of our proposed model, where an AI application developer accesses our smart contracts to train linear regression models on the blockchain network. Later, an AI user accesses the prediction smart contract to predict the desired outcome through the blockchain network.
Iterative Optimization: The iterative optimization model is implemented with fraction-based computation so that the Solidity smart contract can execute these functions on the Ethereum platform. Iterative optimization involves three steps. Step 1 computes $\hat{y}$ with random weight and bias parameters; from it we obtain the mean square error between $\hat{y}$ and the actual $y$ value. Step 2 computes the derivatives of the weights and biases, annotated as delta_weight and delta_bias, with respect to the mean square error. Lastly, step 3 updates the weights and biases with learning rate $\alpha$ [10].
Fig. 2: Event flow of regression model development through smart contracts, where training a model and prediction are computed on blockchain networks. The AI application developer accesses the training smart contract to train the dataset on the blockchain network and receives the immutable trained model; the AI user then requests a prediction through the prediction smart contract, which predicts on the blockchain network and returns a trusted prediction result.
$$get\_y\_hat = \frac{y\_num}{y\_den} = \frac{weight\_num}{weight\_den} \cdot \frac{x\_num}{x\_den} + \frac{bias\_num}{bias\_den} \tag{2}$$

$$get\_y\_diff = \frac{y\_num}{y\_den} - \frac{yi\_num}{yi\_den} \tag{3}$$

$$get\_delta\_weights = \frac{\delta w\_num}{\delta w\_den} = \frac{1}{N}\sum_{i=0}^{n-1} 2 \cdot \frac{xi\_num}{xi\_den} \cdot (get\_y\_diff) \tag{4}$$

$$get\_delta\_bias = \frac{\delta b\_num}{\delta b\_den} = \frac{1}{N}\sum_{i=0}^{n} 2 \cdot (get\_y\_diff) \tag{5}$$

$$\frac{weight\_num}{weight\_den} = \frac{weight\_num}{weight\_den} - \frac{\alpha\_num}{\alpha\_den} \cdot \frac{\delta w\_num}{\delta w\_den} \tag{6}$$

$$\frac{bias\_num}{bias\_den} = \frac{bias\_num}{bias\_den} - \frac{\alpha\_num}{\alpha\_den} \cdot \frac{\delta b\_num}{\delta b\_den} \tag{7}$$
Fraction Transformation for Multiple Linear Regression: We derived a fraction-based computational method from the standard iterative optimization method that transforms decimal numbers into fractions so that iterative optimization can be performed in the Solidity smart contract. Equation 2 computes the $\hat{y}$ values by multiplying the weights with the features and adding the biases; $\hat{y}$ is represented as y_num/y_den (numerator/denominator). Equation 3 subtracts the true $y$ value from $\hat{y}$. Equation 4 computes the gradient-descent derivative of the weights with respect to the difference between the true values of the training data and the $\hat{y}$ values computed with Equation 2. Similarly, Equation 5 provides the derivative of the bias. After all the derivative computations are complete, the new parameters are updated with Equations 6 and 7. All the parameters are computed as fractions with numerator and denominator terms.
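To make the fraction arithmetic concrete, the following is a minimal off-chain Python sketch of one training iteration that mirrors Equations 2-7 using exact numerator/denominator pairs (Python's fractions.Fraction). It illustrates the logic only; the on-chain version is written in Solidity with explicit numerator and denominator integers, and the dataset, initial parameters, and learning rate below are placeholder values.

from fractions import Fraction

def train_step(X, y, weights, bias, alpha):
    """One gradient-descent step of multiple linear regression,
    computed entirely with fractions (mirrors Equations 2-7)."""
    n = len(X)            # number of samples
    m = len(weights)      # number of features

    # Equation 2: y_hat for every sample, kept as an exact fraction.
    y_hat = [sum(weights[j] * X[i][j] for j in range(m)) + bias
             for i in range(n)]

    # Equation 3: difference between predicted and true values.
    y_diff = [y_hat[i] - y[i] for i in range(n)]

    # Equations 4 and 5: gradients of the weights and the bias.
    delta_w = [Fraction(1, n) * sum(2 * X[i][j] * y_diff[i] for i in range(n))
               for j in range(m)]
    delta_b = Fraction(1, n) * sum(2 * d for d in y_diff)

    # Equations 6 and 7: parameter updates with learning rate alpha.
    new_weights = [weights[j] - alpha * delta_w[j] for j in range(m)]
    new_bias = bias - alpha * delta_b
    return new_weights, new_bias

# Tiny placeholder dataset: two samples, two features.
X = [[Fraction(1), Fraction(2)], [Fraction(2), Fraction(3)]]
y = [Fraction(5), Fraction(8)]
w, b = [Fraction(0), Fraction(0)], Fraction(0)
for _ in range(1000):
    w, b = train_step(X, y, w, b, alpha=Fraction(1, 100))
print([float(v) for v in w], float(b))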
6 Experimental Setup
We have considered two datasets for testing our hypothesis: "Diabetes Progression" [37] and "Real Estate Valuation" [38], with 442 and 414 samples, respectively. The "Diabetes Progression" dataset provides a quantitative measure of diabetes progression as a function of age, sex, body mass index, average blood pressure, and six blood serum measurements for 442 diabetes patients. The "Real Estate Valuation" dataset provides price estimates of real estate as a function of transaction date, house age, distance to the nearest MRT station, number of nearby convenience stores, and latitude and longitude geographical coordinates. These datasets have categorical and continuous variables. The data are pre-processed with label encoders to convert categorical values to continuous variables for smart contract inputs. Moreover, we deployed the training smart contract on a layer-two blockchain network (Arbitrum) [36], and the prediction smart contract on the Ethereum Ropsten test network. To build a comparable analysis, we deployed a linear regression model with the scikit-learn library to record baseline performance accuracy. Scikit-learn [37] provides a set of standard Python libraries for various AI algorithms.
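As a minimal sketch of how the scikit-learn baseline can be reproduced for the Diabetes Progression dataset (the exact train/test split and preprocessing used in the paper may differ; the split below is an assumption chosen to leave 353 training samples, as in Table 2):

from sklearn.datasets import load_diabetes
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import mean_squared_error

# Load the Diabetes Progression dataset (442 samples, 10 features).
X, y = load_diabetes(return_X_y=True)

# Hold out 89 test samples so that 353 samples remain for training.
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=89, random_state=0)

# Train the library baseline and record the mean square error.
model = LinearRegression().fit(X_train, y_train)
mse = mean_squared_error(y_test, model.predict(X_test))
print(f"scikit-learn baseline MSE: {mse:.2f}")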
7 Performance Evaluation
Prediction Accuracy: Table 1 provides the mean square errors of the predictions on both datasets for smart contract-based and Python-based prediction. The prediction errors for diabetes progression and real estate cost are close for the smart contract and Python deployments, which supports the reliability of training with smart contracts.
Dataset              | MSE in Smart Contract | MSE in Python
Diabetes Progression | 2865                  | 2900
Real Estate Cost     | 72.67                 | 1.00

Table 1: Mean square error comparison between smart contract-based and Python scikit-learn-based prediction.
Figures 3 and 4 compare the ground truth, library prediction, and smart contract prediction values for the test sets of the diabetes progression and real estate cost datasets. The graph in Figure 3 shows that the scikit-learn-based and smart contract-based predictions converge; the prediction accuracy is approximately 95% of that of the Python-library-based function. Furthermore, the graph in Figure 4 provides another convincing prediction result close to the ground truth, confirming that the mean square error is low.
Cost of Smart Contract Functions: The smart contract transaction cost for the Ethereum Ropsten test network follows the formula $transaction\_cost\,(Ether) = gas\_used \cdot (gas\_price + base\_fee)/10^{9}$, where the per-gas fees are expressed in Gwei [8]. We have plotted the cost of the get_y_hat function (the function computing $\hat{y}$) in Figure 5, with the y-axis showing the cost in Gwei and the x-axis the number of features. It is clear from the graph that the cost of computing $\hat{y}$ has a linear relationship with the number of features and is therefore predictable. The cost computations of the remaining functions are plotted in Figure 6. The transaction costs of get_delta_w and get_y_diff similarly form linear relationships with a rising number of samples, as shown in Figure 6.
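As a small illustration of the cost formula above (the gas and fee figures below are placeholders, not measurements from this work):

def transaction_cost_eth(gas_used, gas_price_gwei, base_fee_gwei):
    # Transaction cost in ether: gas_used * (gas_price + base_fee) / 1e9,
    # with the per-gas fees expressed in Gwei.
    return gas_used * (gas_price_gwei + base_fee_gwei) / 1e9

# Placeholder example: 2,500,000 gas with a 1 Gwei gas price and a 10 Gwei base fee.
print(transaction_cost_eth(2_500_000, 1, 10))  # ~0.0275 ether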
Fig. 3: Comparison of true sorted diabetes progression with the predicted progression from the scikit-learn library and from blockchain smart contracts (x-axis: sample data; y-axis: diabetes progression). The scikit-learn and smart contract models produced similar progression estimates.
Fig. 4: Comparison of true sorted real estate valuation (price) with the predicted valuation from the scikit-learn library and from blockchain smart contracts (x-axis: sample data; y-axis: price of unit area in dollars). The scikit-learn and smart contract models produced similar price estimates.
The slope of get_delta_w is greater than that of get_y_diff due to the higher number of computations involved in calculating the weight parameters.
Table 2 provides a comparative analysis of the cost of training a single iteration over 353 samples on the Ropsten and Arbitrum networks. The cost of the get_y_hat computation is the highest among all the functions on the Ethereum Ropsten network, as it computes the $\hat{y}$ values for all the samples; the number of function executions equals the number of examples in the training dataset. Linear regression requires more than 1000 iterations to optimize the weights with iterative optimization methods, and the cost of the $\hat{y}$ function then reaches 2587 ether, equivalent to 50,590,801 US dollars, which is drastically high. Conversely, the Arbitrum network reduces the price by more than a factor of 100 and makes the cost of training a model on a blockchain platform far more affordable.
Fig. 5: Cost of computing the get_y_hat function on the Ropsten test network for a rising number of features (x-axis: number of features; y-axis: gas used in Gwei; fitted line: y = 37503x + 104514). The relationship between the axes is linear, with a slope of 37503.
Fig. 6: Cost of computing the get_y_diff, get_delta_weights, and get_delta_bias functions on the Ropsten test network (x-axis: number of samples; y-axis: gas used in Gwei; fitted lines: get_y_diff: y = 50759x + 97720, get_delta_w: y = 37611x + 109578). The slope of get_delta_w is greater than that of get_y_diff due to the higher number of computations involved in calculating the weight parameters.
Function Name   | Ethereum Cost (Ether) | Ethereum Time (Seconds) | Arbitrum Cost (Ether) | Arbitrum Time (Seconds)
get_y_hat       | 2.587768              | 4500-6000               | 0.0066046             | 300
get_y_diff      | 0.31607               | 3-25                    | 0.00080997            | 1-2
get_delta_w     | 0.17934               | 2-30                    | 0.0033095             | 1-2
get_delta_b     | 0.00879               | 3-25                    | 0.00016627            | 1-2
get_new_weights | 0.00451               | 3-26                    | 0.00000573            | 1-2
get_new_bias    | 0.00072               | 2-34                    | 0.00000571            | 1-2

Table 2: Training cost of a single iteration of multiple linear regression for the Diabetes Progression dataset with 353 training samples on the Ropsten Ethereum network and the Arbitrum network.
The cost of training the model with 1000 iterations would be approximately 10.016 Arbitrum ether, equivalent to 0.0033 USD (1 Arbitrum ether = 0.0002933 USD).
Function     | Decimal Flops | Fraction Flops
get_y_hat    | 2nm           | 6nm
get_delta_w  | 2nm           | 6nm
get_delta_b  | n             | 4n

Table 3: Rise in the number of operations for fraction-based computations in smart contract functions compared with non-fraction-based computations.
8 Computational Analysis
The number of operations in blockchain smart contracts is crucial to the computational cost of the functions defined in them. We have analyzed the difference in the number of computations between fraction-based and non-fraction-based calculations, following Watkins's book "Fundamentals of Matrix Computations" [30] for the operation counts involved in our application. Considering a training dataset given as an n × m matrix (with n rows and m columns), Table 3 shows that the number of computations increases by approximately 3-4 times when the calculations are performed with fractions rather than decimals. This rise in operation count impacts the cost of the smart contract functions.
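As a worked example of these counts (using the 353 training samples from Table 2 and the 10 features of the Diabetes Progression dataset), one iteration of get_y_hat requires roughly 2 × 353 × 10 = 7,060 operations in decimal form but 6 × 353 × 10 = 21,180 operations in fraction form, a threefold increase that is paid for directly in gas.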
9 Limitation and Challenges
Block Gas Limit: The block gas limit of the Ethereum Ropsten network was reached when performing the large number of computations needed to develop a linear regression model with a smart contract. For Ropsten, the default block gas limit is a hard limit of 4,712,388 gas [31] at the time of writing. Due to the number of iterations involved in computing the entire set of functions, the gas consumption could not be kept within this capacity. The main Ethereum network has a default block gas limit of 15,000,000 gas [32], which can be increased to 30,000,000. We assume that the Ethereum main network will allow a higher number of computations due to its higher block gas limit.
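As a minimal sketch, the block gas limit of a target network can be checked with web3.py before submitting training transactions (the RPC endpoint below is a placeholder, not one used in this work):

from web3 import Web3

# Placeholder JSON-RPC endpoint; replace with the node you actually use.
w3 = Web3(Web3.HTTPProvider("https://example-node.invalid"))

# Inspect the latest block to see the gas limit and how much of it was used.
latest = w3.eth.get_block("latest")
print("Block gas limit:", latest["gasLimit"])
print("Gas used in latest block:", latest["gasUsed"])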
10 Conclusion
Smart contracts do not allow the floating-point computations needed by linear regression algorithms, which hinders the development of AI models on blockchain networks. We have proposed a novel approach to develop a trustable machine learning model with the help of blockchain technology and to make artificially intelligent applications more secure. Our work also shows that static smart contracts can be transformed into learning smart contracts by running machine learning algorithms inside the blockchain network. We deployed a smart contract with a multiple linear regression mechanism to train our models on blockchain and achieved excellent training accuracy in terms of mean square error. We also achieved good prediction accuracy for the model learned on-chain. Moreover, our solution minimizes the cost of training linear regression algorithms using an optimistic roll-up (layer-two blockchain). We analyzed the cost of training a machine learning model and showed that the optimistic roll-up reduces the training cost by more than a factor of 100. In the future, we aim to develop more AI algorithms using smart contracts with further investigation and analysis.
References
1. Bangbit Technologies, "Introduction to Artificial Intelligence (AI): A Deep Dive into Machine Learning Deep Learning", https://medium.com/@BangBitTech/introduction-to-artificial-intelligence-ai-a-deep-dive-into-machine-learning-deep-learning-4763e6985344, August 2019.
2. A.C. Bantis, "Is your ML model Secure", https://medium.com/slalom-technology/is-your-ml-model-secure-fe10b8589b71, Accessed September 2021.
3. N. Pitropakis et al., "A Taxonomy and Survey of attacks against machine learning", Computer Science Review, Volume 34, 2019, 100199, ISSN 1574-0137, https://doi.org/10.1016/j.cosrev.2019.100199.
4. Liao et al., (2021, May). Introduction to explainable AI. In Extended Abstracts of
the 2021 CHI Conference on Human Factors in Computing Systems (pp. 1-3).
5. Kale et al. (2022). Provenance documentation to enable explainable and trustwor-
thy AI: A literature review. Data Intelligence, 1-41.
6. U. Pawar, D. O’Shea, S. Rea and R. O’Reilly, "Explainable AI in Healthcare,"
2020 International Conference on Cyber Situational Awareness, Data Analytics and
Assessment (CyberSA), 2020, pp. 1-2, doi: 10.1109/CyberSA49311.2020.9139655.
7. C. Kastner, "Versioning, Provenance, and Reproducibility in Production Machine Learning", https://ckaestne.medium.com/versioning-provenance-and-reproducibility-in-production-machine-learning-355c48665005, February 2021.
8. Ethereum, W. (2014). Ethereum Whitepaper. Ethereum. URL: https://ethereum.org [accessed 2020-07-07].
9. Parizi et al., (2018, June). Smart contract programming languages on blockchains:
An empirical evaluation of usability and security. In International Conference on
Blockchain (pp. 75-91). Springer, Cham.
10. J. Neto, "Multiple Linear Regression from Scratch using Python", https://medium.com/analytics-vidhya/multiple-linear-regression-from-scratch-using-python-db9368859f, August 2021.
11. A.B. Shafiq, "Which methods should be used for solving linear regression?", https://www.kdnuggets.com/2020/09/solving-linear-regression.html.
12. K. Salah et al., "Blockchain for AI: Review and Open Research Challenges," in
IEEE Access, vol. 7, pp. 10127-10149, 2019, DOI: 10.1109/ACCESS.2018.2890507.
13. Y. Liu et al., "Blockchain and Machine Learning for Communications and Net-
working Systems," in IEEE Communications Surveys Tutorials, vol. 22, no. 2, pp.
1392-1431, Second quarter 2020, DOI: 10.1109/COMST.2020.2975911.
14. D. Campbell, "Combining AI and Blockchain to push Frontiers in Healthcare", https://www.macadamian.com/learn/combining-ai-and-blockchain-in-healthcare/, Accessed.
15. B. Ivana, "AI in healthcare: Ethical and privacy challenges", in Conference on Artificial Intelligence in Medicine in Europe, Springer, 2019, pp. 7-10.
16. R. Kumar, R. Tripathi, "Secure Healthcare Framework Using Blockchain and Public Key Cryptography", 2020.
17. T. Wang, "A Unified Analytical Framework for Trustable Machine Learning and
Automation Running with Blockchain," 2018 IEEE International Conference on
Big Data (Big Data), 2018, pp. 4974-4983, DOI: 10.1109/BigData.2018.8622262.
18. H. Kim, S. Kim, J. Y. Hwang and C. Seo, "Efficient Privacy-Preserving Machine Learning for Blockchain Network," in IEEE Access, vol. 7, pp. 136481-136495, 2019, DOI: 10.1109/ACCESS.2019.2940052.
19. J. Zou et al., "DeepBrainChain: Artificial Intelligence Computing Platform Driven By Blockchain", White paper, https://cryptorating.eu/whitepapers/DeepBrain-Chain/DeepBrainChainWhitepaper.pdf, Accessed November 2021.
20. Z. Chen, W. Wang, X. Yan, J. Tian, "Cortex-AI on Blockchain - The Decentralized AI Autonomous System", White paper, https://cryptorating.eu/whitepapers/Cortex/Cortex_AI_on_Blockchain_EN.pdf, Accessed November 2021.
21. A.B. Kurtulmus, K. Daniel, "Trustless Machine learning Contracts; Evaluating and Exchanging machine learning Models on Ethereum Blockchain", arXiv:1802.10185v1, https://arxiv.org/pdf/1802.10185.pdf.
22. J. D. Harris and B. Waggoner, "Decentralized and Collaborative AI on
Blockchain," 2019 IEEE International Conference on Blockchain (Blockchain),
2019, pp. 368-375, DOI: 10.1109/Blockchain.2019.00057.
23. Y. Liu et al., "Blockchain and Machine Learning for Communications and Net-
working Systems," in IEEE Communications Surveys Tutorials, vol. 22, no. 2, pp.
1392-1431, Second quarter 2020, DOI: 10.1109/COMST.2020.2975911.
24. "Solidity Programming guide",https://docs.soliditylang.org/en/v0.8.9/ ,Accessed
September 2021
25. Fixidity fixed point library for Solidity, https://github.com/CementDAO/Fixidity,
Accessed November 2021
26. ABDK library for Solidity, https://github.com/abdk-consulting/abdk-libraries-
solidity/blob/master/ABDKMath64x64.sol, Accessed November 2021.
27. Ethereum White Paper, "Scaling", https://ethereum.org/en/developers/docs/scaling/, April 2022.
28. Kalodner, H., Goldfeder, S., Chen, X., Weinberg, S. M., Felten, E. W. (2018). Arbitrum: Scalable, private smart contracts. In 27th USENIX Security Symposium (USENIX Security 18) (pp. 1353-1370).
29. Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O.,
... Duchesnay, E. (2011). Scikit-learn: Machine learning in Python. the Journal of
machine Learning research, 12, 2825-2830.
30. Watkins, Fundamentals of Matrix Computations, https://davidtabora.files.wordpress.com/2015/01/david_s-_watkins_fundamentals_of_matrix_computat.pdf.
31. H. Moriya, How to get Ethereum Block Gas Limit, https://piyopiyo.medium.com/how-to-get-ethereum-block-gas-limit-eba2c8f32ce, Accessed December 2021.
32. D. Notik, Ethereum , https://ethereum.org/en/developers/docs/gas/, Accessed
December 2021.
33. Project Implementation, "Github Source", https://github.com/syber2020/LR-KNN-6950-FA21/tree/master/LR-Python-Web3/MLR.
34. "PRBMath library" , https://github.com/paulrberg/prb-math, Accessed July
2022
35. "Decimalmath",https://github.com/alcueca/DecimalMath, Accessed July 2022
36. Kalodner, H., Goldfeder, S., Chen, X., Weinberg, S. M., Felten, E. W. (2018).
Arbitrum: Scalable, private smart contracts. In 27th USENIX Security Symposium
(USENIX Security 18) (pp. 1353-1370).
37. Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O.,
... Duchesnay, E. (2011). Scikit-learn: Machine learning in Python. the Journal of
machine Learning research, 12, 2825-2830.
38. Yeh, I. C., Hsu, T. K. (2018). Building real estate valuation models with compara-
tive approach through case-based reasoning. Applied Soft Computing, 65, 260-271.