Write "geth" router and bash scripts to download and install extremely large tarballs

Closed · Posted 2 years ago · Paid on delivery

This project consists of 2 major components:

TL;DR:

1. Write a simple Node.js router script that routes Ethereum's JSON-RPC protocol to different backends based on the request.

2. Write a bash script that installs, configures, and starts up individual `geth` instances from data stored in S3; these are the backends that #1 will route to.

Details:

1.

I have an archive node that is split into slices. Each slice holds a portion of the Binance Smart Chain (which is Ethereum-compatible) and lives on S3. For example:

[login to view URL] might have blocks 0 to 389784

[login to view URL] might have blocks 389681 to 500000

A custom websocket router is needed to route a single endpoint to one of these backends based on which block is being requested. The JSON-RPC request contains the block id itself. If no block is being requested, the router should fall back to the instance with the highest available block number.
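The routing decision described above can be sketched as a small Node.js function. This is only an illustration under stated assumptions: the backend URLs and block ranges are invented placeholders (mirroring the overlapping ranges in the example above), and the set of JSON-RPC methods that carry a block parameter is a minimal sample, not a complete list.

```javascript
// Hypothetical backend table; in the real deployment these would be built
// from the tarball ranges on S3. Note the ranges may overlap, as in the spec.
const backends = [
  { url: "ws://slice-a:8546", from: 0, to: 389784 },
  { url: "ws://slice-b:8546", from: 389681, to: 500000 },
];

// Extract the block number from common JSON-RPC methods; return null when the
// request carries no explicit block (e.g. "latest", or a method without one).
// The method -> parameter-position map here is deliberately incomplete.
function blockFromRequest(req) {
  const positions = { eth_getBlockByNumber: 0, eth_getBalance: 1, eth_call: 1 };
  const pos = positions[req.method];
  if (pos === undefined) return null;
  const param = (req.params || [])[pos];
  if (typeof param !== "string" || !param.startsWith("0x")) return null;
  return parseInt(param, 16);
}

// Pick the backend holding the requested block; with no explicit block,
// fall back to the slice with the highest available block number.
function pickBackend(req) {
  const block = blockFromRequest(req);
  if (block === null) {
    return backends.reduce((a, b) => (b.to > a.to ? b : a)).url;
  }
  const hit = backends.find((b) => block >= b.from && block <= b.to);
  return hit ? hit.url : null; // null: no slice covers this block
}
```

A production router would wrap this in a websocket proxy (e.g. the `ws` package), forwarding each frame to the chosen backend, but the slice-selection logic is the core of component #1.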

2.

S3 has the giant tarballs that contain the archive data. The layout might look like:

s3://some-bucket/[login to view URL] - Might have blocks 1 to 40000

s3://some-bucket/[login to view URL] - Might have blocks 39904 to 80000

... etc.

A configurable script needs to be created that reads the AWS EC2 instance's attached tags on startup and, based on the tag values, downloads, sets up, and starts a `geth` service. The tarballs are quite large: about 8TB in total, mostly split into ~750GB tars. Each tarball is independent and should stay that way; each one becomes a service that is made available to #1.

There may be more than one `geth` service running on a single instance at a time, each serving a different part of the blockchain. This is important for how #1 will be tied in.
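The tag-driven installer described above could be sketched roughly as follows. This is a minimal outline, not a finished spec: the tag name (`GethSlices`), the bucket path, the data directory, and the port scheme are all assumptions made for illustration.

```shell
#!/usr/bin/env bash
# Sketch of component #2: read this EC2 instance's tags, then download and
# start one geth service per slice named in the tag. Names are assumptions.
set -euo pipefail

# Fetch the value of a tag attached to this instance (IMDSv2 + AWS CLI).
get_tag() {
  local token instance_id
  token=$(curl -s -X PUT "http://169.254.169.254/latest/api/token" \
    -H "X-aws-ec2-metadata-token-ttl-seconds: 300")
  instance_id=$(curl -s -H "X-aws-ec2-metadata-token: $token" \
    "http://169.254.169.254/latest/meta-data/instance-id")
  aws ec2 describe-tags \
    --filters "Name=resource-id,Values=$instance_id" "Name=key,Values=$1" \
    --query "Tags[0].Value" --output text
}

# Deterministic port per slice index, so several geth services can coexist
# on one instance (the base port 8546 is an assumption).
slice_port() { echo $((8546 + $1)); }

# Download one tarball and start a geth service for it. Streaming the ~750GB
# tar straight from S3 into tar avoids keeping a second copy on disk.
install_slice() {
  local name="$1" port="$2" datadir="/data/$1"
  mkdir -p "$datadir"
  aws s3 cp "s3://some-bucket/$name.tar" - | tar -x -C "$datadir"
  geth --datadir "$datadir" --ws --ws.addr 0.0.0.0 --ws.port "$port" &
}

main() {
  local slices name i=0
  slices=$(get_tag GethSlices)          # e.g. "slice-a,slice-b" (assumed format)
  IFS=',' read -ra names <<<"$slices"
  for name in "${names[@]}"; do
    install_slice "$name" "$(slice_port "$i")"
    i=$((i + 1))
  done
  wait
}

# main "$@"   # uncomment when running on the instance
```

Running each slice on its own port is what lets the router in #1 address every slice on an instance independently.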

I do not want to hand-hold you through this project, so you MUST know how to work with the following technologies:

* nodejs

* websocket

* bash

* AWS S3

* AWS CLI

* AWS EC2

I REQUIRE EXTREMELY HIGH QUALITY CODE. I will be reviewing all the code with extreme scrutiny. Each code review will often take 2-5 iterations before I approve it. I will not accept any commit to the repository that is greater than 800 lines. If you are unable to make small commits that work towards the larger project, do not apply.

If you apply for this, please provide a GitHub repo or another location where I can look at some of the code you have written (extra points if it's an open-source project). I will not accept offers from people who will not share their public GitHub user, or whose public GitHub account is newer than 2 years old.

Linux Node.js Bash Scripting Ethereum Blockchain

Project ID: #30548519

About the project

5 proposals · Remote project · Active 2 years ago

5 freelancers are bidding an average of $1250 for this job

ArkssTech

Hey Manager!! We checked your project and are interested in it. Project: Write "geth" router and bash scripts to download and install extremely large tarballs. I am a skilled full-stack developer with skills in More

$1375 USD in 19 days
(11 reviews)
5.8
markverenich103

★★★★★ You would succeed!!! ★★★★★ I really want to contribute to making your vision come true, and I have great ability and proficiency. I think you want to build a highly qualified, eye-catching website based on More

$1125 USD in 7 days
(0 reviews)
0.0
kosticdev116

Dear client, this job is serious and in my opinion it should not be awarded to just anyone. As a professional, I have the experience and ability to do this job perfectly, on time, and at the right price. I would like to t More

$1500 USD in 5 days
(0 reviews)
0.0