Hive/Splinterlands Scaling Solutions and Project Blank


One doesn't have to be an expert in the Hive blockchain to understand that the recent surge in activity, driven mainly by Splinterlands, has posed some unexpected scaling issues.

Quick Scaling Fixes and More Long Term Plans

Some of these issues have largely been tackled already (source 1: Blocktrades post, source 2: CryptoManiacs Interview with Blocktrades), with more optimizations for the "write" operations (broadcasting transactions to the blockchain) likely to be included in the hard fork 26 package (source 2).

Since dApps - and especially Keychain - have been updated to use asynchronous transaction broadcasting instead of synchronous (i.e. non-blocking instead of blocking), Hive has shifted into a higher gear. Have you noticed it? And I don't mean compared to the time of the bottlenecks, I mean compared to before. Just go ahead and vote on something on PeakD or wherever. See how quick it is now! :) And I haven't seen any issues starting matches on Splinterlands since then either.
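
To make the blocking/non-blocking distinction concrete, here is a minimal TypeScript sketch of the two broadcast modes at the JSON-RPC level. The node URL is just one public endpoint, and `signedTx` is a placeholder for an already-signed transaction - real dApps normally go through a library like dhive, or through Keychain, rather than raw calls like these.

```typescript
// Minimal sketch: the two Hive broadcast modes at the JSON-RPC level.
// `signedTx` is a placeholder for an already-signed transaction object.
const NODE = "https://api.hive.blog"; // any public Hive API node

async function rpc(method: string, params: unknown[]): Promise<unknown> {
  const res = await fetch(NODE, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ jsonrpc: "2.0", id: 1, method, params }),
  });
  return (await res.json()).result;
}

// Non-blocking: the node validates and accepts the transaction, then
// returns immediately; the UI can move on right away.
async function broadcastAsync(signedTx: object) {
  return rpc("condenser_api.broadcast_transaction", [signedTx]);
}

// Blocking: the node holds the call open until the transaction is actually
// included in a block, which can take seconds under heavy load.
async function broadcastSync(signedTx: object) {
  return rpc("condenser_api.broadcast_transaction_synchronous", [signedTx]);
}
```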

[Image: crowd1584115_1280.jpg - Source]

The Hive Application Framework (HAF) will be released very soon by the Blocktrades team. It includes a number of optimizations and may help us avoid, in the future, some of the issues we experienced during the past few weeks, because dApp developers who use it will likely not have to worry about accidentally relying on legacy or non-performant options when coding.

Not Everything Should Be On the Core Level Blockchain

Still... with all the optimizations at the core level, anything that is written to or read from a blockchain takes time.

That was the reason why Hivemind was created in the first place: to act as an intermediary between the blockchain and the various dApps that use it, being far more responsive than the blockchain itself could be.

That is why the Leofinance team created LeoInfra on the backend, to act as a caching mechanism that allows a smoother experience for the user even when the blockchain lags. LeoInfra queues operations that need to be broadcast to the blockchain and retries them until they succeed or definitively fail, while the user can go on and do something else in the meantime.
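
LeoInfra itself isn't open source, so take the following only as an illustrative TypeScript sketch of the general pattern being described - queue the operation, return control to the user immediately, and keep retrying the broadcast in the background - not as LeoFinance's actual code.

```typescript
// Illustrative only: a naive "queue and retry in the background" pattern,
// not LeoInfra's actual implementation.
type BroadcastFn = () => Promise<void>; // e.g. wraps a signed vote or custom_json

const queue: BroadcastFn[] = [];

// Called from the UI: returns instantly, so the interface never blocks.
function enqueue(op: BroadcastFn) {
  queue.push(op);
}

// Background worker: drains the queue, retrying each operation with backoff.
async function worker() {
  while (true) {
    const op = queue.shift();
    if (!op) {
      await new Promise((r) => setTimeout(r, 250)); // idle, poll again
      continue;
    }
    for (let attempt = 1; attempt <= 5; attempt++) {
      try {
        await op(); // try to broadcast to the chain
        break;      // success: move on to the next operation
      } catch {
        // node busy or temporarily failing: wait and retry
        await new Promise((r) => setTimeout(r, 1000 * attempt));
      }
    }
  }
}

worker(); // runs alongside the UI; the user keeps browsing while it catches up
```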

I see that Matt from Splinterlands, after the bad experience with the scaling issues they had to battle while the game's popularity grew (exponentially, at that), wants to take a more drastic approach to prepare for even more traffic: stop broadcasting some types of transactions to the blockchain altogether and have them go directly to the Splinterlands website instead.

The approach is correct, in my opinion. What I don't know is whether any of the custom_jsons chosen to be taken off-chain is critical in determining the result of a match. If one is, that's a problem that needs addressing.
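
For context, Splinterlands game actions hit the chain as custom_json operations. The shape looks roughly like the TypeScript object below; the account name, id and payload are made up for illustration and are not the specific operations being considered for removal.

```typescript
// The general shape of a Hive custom_json operation, as used by games like
// Splinterlands. Account, id and payload below are purely illustrative.
const op = [
  "custom_json",
  {
    required_auths: [],                      // no active-key authority needed
    required_posting_auths: ["some-player"], // signed with the posting key
    id: "sm_example_action",                 // app-specific identifier
    json: JSON.stringify({ match_id: "abc123", action: "example" }),
  },
];
```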

Thinking Ahead

But when we think a few multiples of growth ahead, at some point, even with all the optimizations, the blockchain will not be able to handle that much simultaneous activity if non-essential information keeps being written to the base layer. This is the same reason why Blocktrades wants smart contracts on the second layer and not on the first (one of the reasons, anyway - watch the interview, there's a nice metaphor there with your computer and its operating system).

So why did I mention Project Blank in the title? Well, simply because that's another dApp that will have a significant impact on blockchain activity, especially if it becomes popular outside Hive too.

It's likely Khal will use his LeoInfra backend for the microblogging app as well, but in the end all the short-form messages will still end up on the chain.

I suppose it depends a lot on how he organizes the new data structure he said he designed for Project Blank. It's one thing to write short comments to the chain individually; it's another to write a larger data structure all at once. The latter, on the other hand, can end up using more processing power/memory to assemble the pieces and break them apart again. I'm just throwing ideas around here; I don't really know the direction the project is heading.
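
Purely to illustrate that trade-off (this is hypothetical, not Project Blank's actual design), the two options might look something like this:

```typescript
// Hypothetical illustration of the trade-off above, not Project Blank's design.
interface ShortPost { author: string; body: string; ts: number }

// Option A: one operation per short message -> many small writes to the chain.
function toIndividualOps(posts: ShortPost[]) {
  return posts.map((p) => [
    "custom_json",
    {
      required_auths: [],
      required_posting_auths: [p.author],
      id: "blank_post",           // made-up identifier
      json: JSON.stringify(p),
    },
  ]);
}

// Option B: batch many messages into one larger operation -> far fewer writes,
// but something has to assemble the batch and split it apart again on read.
function toBatchedOp(posts: ShortPost[]) {
  return [
    "custom_json",
    {
      required_auths: [],
      required_posting_auths: ["blank.batcher"], // made-up batching account
      id: "blank_batch",                         // made-up identifier
      json: JSON.stringify({ posts }),
    },
  ];
}
```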

Final Words

Hive's scaling capability should not be taken for granted and should not be abused. There isn't a blockchain in the world that can scale indefinitely, and little shocks like the ones we had these past few weeks should serve as a reminder.

Posted Using LeoFinance Beta



15 comments

Scaling is very important. It is something that always needs to be thought about. As we discussed with Blocktrades, unfortunately nobody knows how well something scales until it is tested with real traffic.

What was a slight hiccup has turned into a major step forward, as evidenced by the speed of PeakD.

Posted Using LeoFinance Beta


What was a slight hiccup has turned into a major step forward, as evidenced by the speed of PeakD.

I agree - this real-traffic stress test led to the API call being updated across the dApps, and that means a significant boost in performance, because I believe most of them were using the blocking version.

Posted Using LeoFinance Beta


Yeah, Blocktrades said that most of them converted over. The only ones still out there, he thought, were some bots and things of that nature.

Most major applications on Hive made the transition I would guess.

Posted Using LeoFinance Beta


He said something else as well: at least on their side, they created a separate flow to handle the traffic that still uses the blocking API calls, so that these calls don't create bottlenecks for the rest of the traffic. Other witnesses will likely add this configuration as well, if it's still needed (i.e. if there's still a significant amount of blocking traffic).
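
Something like the toy TypeScript sketch below, I imagine - this only illustrates the routing idea, not how the actual API node stack is configured: inspect the JSON-RPC method and send the blocking broadcasts to their own upstream so they can't starve everything else.

```typescript
// Toy illustration of the routing idea only, not the real node configuration.
import http from "node:http";

const FAST_UPSTREAM = "https://fast.example";     // hypothetical endpoint
const BLOCKING_UPSTREAM = "https://slow.example"; // hypothetical endpoint

http.createServer(async (req, res) => {
  let body = "";
  for await (const chunk of req) body += chunk;
  const { method } = JSON.parse(body || "{}");

  // Isolate the slow, blocking broadcast calls on their own backend.
  const upstream =
    method === "condenser_api.broadcast_transaction_synchronous"
      ? BLOCKING_UPSTREAM
      : FAST_UPSTREAM;

  const upstreamRes = await fetch(upstream, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body,
  });
  res.end(await upstreamRes.text());
}).listen(8080);
```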

Posted Using LeoFinance Beta


Yeah, that is true. They did break it up so that traffic can flow smoothly. Also, the volume on the old calls is much smaller now, so it's less of an issue.

Posted Using LeoFinance Beta


Interesting how these hiccups only turn up when there's a new event the system is unfamiliar with. One thing I've learnt here is that any system can fail when it encounters totally new problems. Just like in this case - it's good there was a quick fix.

Posted Using LeoFinance Beta


That's why nothing can 100% substitute for the information provided by real systems at work, no matter how well they are thought out and tested.

Posted Using LeoFinance Beta


Now after reading it, I realized that upvoting is insanely fast.

Posted Using LeoFinance Beta


Yep, I didn't realize it either until I heard Blocktrades say that dApp devs reported improved speeds because of this change. Then I saw it for myself.

Posted Using LeoFinance Beta


I had not realized how much faster PeakD is now, but you are right!

It's good to know about all the development that's been taking place.

Posted Using LeoFinance Beta


That's a cool side effect of facing some bottlenecks: you are forced to improve! And sometimes you don't even realize you have something to improve until a hiccup reveals it to you.

Posted Using LeoFinance Beta


I agree that not every action has to take place on chain. Layer 2 is always an option.


L2 is definitely better for scaling if we're not talking about the core functionality of the chain.

Posted Using LeoFinance Beta


Right. Core functions stay on chain and it's all good.
