How can I convert a byte array (raw bytes) to its decimal representation? I've been studying how this can be done, and I found the double dabble algorithm, which uses BCD (binary-coded decimal). A sketch of my current understanding is below.
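
For reference, here is a minimal sketch in C of what I understand double dabble to do: shift the input bits into a BCD register, adding 3 to any BCD digit that is 5 or more before each shift. The function name `bytes_to_decimal`, the buffer sizes, and the test value are my own illustrative assumptions, not from any particular library.

```c
#include <stdio.h>
#include <stdint.h>
#include <string.h>

/* Sketch of double dabble: convert a big-endian byte array into a decimal
 * string by shifting bits into a BCD register (one digit 0-9 per array slot). */
void bytes_to_decimal(const uint8_t *bytes, size_t len, char *out, size_t out_len)
{
    /* Each byte contributes at most ~2.41 decimal digits; 3 per byte is safe. */
    size_t max_digits = len * 3 + 1;
    uint8_t bcd[max_digits];           /* bcd[0] is the least significant digit */
    memset(bcd, 0, max_digits);

    for (size_t i = 0; i < len; i++) {
        for (int bit = 7; bit >= 0; bit--) {
            /* "Dabble": add 3 to every digit that is 5 or more, so the
             * following left shift carries correctly into the next digit. */
            for (size_t d = 0; d < max_digits; d++) {
                if (bcd[d] >= 5)
                    bcd[d] += 3;
            }
            /* "Double": shift the whole BCD register left by one bit,
             * pulling in the next bit of the input byte. */
            for (size_t d = max_digits; d-- > 1; )
                bcd[d] = ((bcd[d] << 1) | (bcd[d - 1] >> 3)) & 0x0F;
            bcd[0] = ((bcd[0] << 1) | ((bytes[i] >> bit) & 1)) & 0x0F;
        }
    }

    /* Emit digits most-significant first, skipping leading zeros. */
    size_t start = max_digits;
    while (start > 1 && bcd[start - 1] == 0)
        start--;
    size_t pos = 0;
    for (size_t d = start; d-- > 0 && pos + 1 < out_len; )
        out[pos++] = '0' + bcd[d];
    out[pos] = '\0';
}

int main(void)
{
    uint8_t data[] = { 0x01, 0x02 };   /* 0x0102 == 258 */
    char dec[64];
    bytes_to_decimal(data, sizeof data, dec, sizeof dec);
    printf("%s\n", dec);               /* prints "258" */
    return 0;
}
```

Is this roughly the right approach, or is there a more idiomatic way to turn a byte array into a decimal string?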