
Tags: Offroaders123/NBTify


2.2.0

VarInt Fixes 🚀

Bringing the project's config up to date.

Tried to add support for isolatedDeclarations, but it didn't change much in practice, and my tests aren't too friendly with it yet, so I'm not going to do it for this release.

Come to think of it, I should probably have a separate tsconfig for the tests folder; that would work well.

#53

2.1.0

CDN Bundle Docs + Strictless Output

Updated the dependencies overall, and I'm going to publish these changes to npm, mainly to be able to try out the new strictless handling.

The improved CDN docs are a much-needed addition too; I hadn't pushed those just yet.

archive/exp-adjacent-reading

Adjacent NBT Reading

Ziltoid the Omniscient!!!

#31
#51

https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/Array/fromAsync

This is a demo of the feature. I'm not sure this is exactly how I will implement it, but it does seem to be working fairly well.
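
Roughly, the consuming side could look something like the sketch below. The `readAdjacent` generator is just a hypothetical placeholder, not the actual helper from this branch; the point is that `Array.fromAsync` can drain an async iterator of NBT entries into a plain array.

```js
// @ts-check

import * as NBT from "nbtify";

/**
 * Hypothetical placeholder for the adjacent reader: it should yield one
 * NBTData object per back-to-back root in the buffer. A real implementation
 * would track how many bytes each root consumed and keep reading from that
 * offset; as a stand-in, this one just reads a single root.
 *
 * @param {Uint8Array} data
 */
async function* readAdjacent(data) {
  yield await NBT.read(data);
}

// Array.fromAsync drains the async generator into a plain array of entries.
const bytes = await NBT.write({ greeting: "hello" });
const entries = await Array.fromAsync(readAdjacent(bytes));
console.log(entries.length); // 1 with the single-root stand-in above
```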

The `chunk91` file is from Dexrn, from the project to parse CDB entries from New 3DS Edition. It seems to be roughly the same format as the equivalent section in Bedrock's LevelDB implementation. That's where my `BlockEntity` file is from too, via the Bedrock-LevelDB project.

2.0.0


archive/exp-varint

Exp-Varint Support!

It's working!!! It took a few different sessions; having new eyes on the changes each time I came back really did help. I got a bit more help from GPT this time too, now that that's a resource we have nowadays; I didn't try that when I looked into implementing this just a few months ago. Now I can add another feature to NBTify's catalog in comparison to Prismarine-NBT!

#34

🤯
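
Since this release is about VarInts, here's a generic unsigned VarInt (LEB128) decoder as a minimal illustration of the encoding: each byte carries 7 bits of the value, and the high bit says whether another byte follows. (As I understand it, the Bedrock network flavor of NBT uses these for lengths, with ZigZag encoding layered on top for signed values.) This is just an illustrative sketch, not NBTify's actual internal code.

```js
// @ts-check

/**
 * Decodes an unsigned VarInt (LEB128) starting at `offset`.
 *
 * @param {Uint8Array} data
 * @param {number} offset
 * @returns {{ value: number; byteLength: number }}
 */
function readVarInt(data, offset) {
  let value = 0;
  let shift = 0;
  let byteLength = 0;
  while (true) {
    const byte = data[offset + byteLength];
    byteLength += 1;
    value |= (byte & 0x7f) << shift; // low 7 bits contribute to the value
    if ((byte & 0x80) === 0) break;  // high bit clear: this was the last byte
    shift += 7;
  }
  return { value, byteLength };
}

console.log(readVarInt(new Uint8Array([0xac, 0x02]), 0)); // { value: 300, byteLength: 2 }
```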

archive/exp-smooth-api

Smooth API Demo

Looking into restructuring the layout of the public API, both in how you import from it and in how the exported functionality is named. When I check out how people are using the library, it seems like they don't read the documentation too closely (understandable), so they end up using extra API calls that don't have to be there. It seems like people want to use `NBT.parse()` more than `NBT.read()`, even though `parse()` is for SNBT files and `read()` is for binary files. I can see how they might think `read()` is for reading from the file system, and that `parse()` is instead for parsing arbitrary data. NBTify is meant to be used specifically outside of the file system though, so neither API feature deals with that. Not all libraries are like that, so I can see why people assume it.

#47

This is a demo I made the other day of how it might make sense for the public API to look. This commit experiments with that idea.

```js
// @ts-check

import { Buffer } from "node:buffer";
import * as NBT from "nbtify";

const buffer = Buffer.alloc(25);
const data = await NBT.readBinary(buffer);

// Other proposed exports, referenced here just to sketch out the naming.
NBT.writeBinary; NBT.readString; NBT.writeString;
```

archive/exp-pnbt-converter


1.90.1


1.90.0

1.90 - Various Upgrades

Lots of changes! Here's a general overview:
- Added a `--space` flag to the CLI.
- Removed the `--pipe` flag in favor of `--nbt` and `--snbt`.
- CLI logging now displays the whole NBT tree, rather than the default Node behavior of abbreviating deeply nested objects.
- SNBT output now ends with a trailing newline, to match the behavior of other terminal tools.

- Piping into the `nbtify` CLI now works! `cat ./test/nbt/house3.mcstructure | tsx ./src/bin/index.ts --snbt --space=0`, for example.

- SNBT, Windows CRLF Handling

- `Blob` objects can now be opened directly with NBTify functions! No need to convert to `ArrayBuffer` or `Uint8Array` first; it will do that for you (see the `Blob` sketch after this list).

- Parameter validation functions! The validation logic inside the `NBTData` constructor is eventually going to be extracted into its own module, and these functions will be used on their own across the library, rather than writing the same repetitive logic in the same places across the codebase. This is all about validating the values that are passed in to NBTify API calls.

- NBT error parse info: now you can keep track of where a parse error occurred from outside of NBTify's internals. This can be helpful when you want to test opening a given file and see what the result is, even if parsing errored and didn't complete. (Come to think of it, should I make a way of using `NBTData` like how you use `fetch().ok`, rather than using errors for that? Could be a neat concept; then you wouldn't have to use catch statements with `strict: true` and do all of that jazz.)

- The BedrockLevel header/flag is now a boolean, and its value is validated against the `StorageVersion` tag in the root of the NBT, now that I have worked out that they very much seem to be the same value! So now you have to provide a `StorageVersion` tag, use little endian, and set `--bedrock-level` to `true` (see the `level.dat` sketch after this list). There's less data duplication in doing that, and it more accurately represents the use of the BedrockLevel header: as far as I know it only gets used by Bedrock, so it should only be used in combination with Bedrock's level format, and not with big endian or things like that.

- Decompressed CLI option: now you can pass `null` or `"null"` to `--compression=`.

- SNBT exponential/scientific notation: resolved! It turns out the random sparse errors I was having when parsing various files as SNBT were because the SNBT parser didn't know how to handle `e+` and `e-` number values, and it would try to treat them as strings even though they were numbers (see the SNBT sketch after this list).

- JSON file support (CLI)! Now you can easily open JSON files and save them back and forth like you can any other NBT or SNBT file. This is inherently not meant to be a lossless conversion step, but rather a simple helper for easier data visualization, or for viewing a given format as another kind. For the tag types that are the same across the formats (as laid out in `compatibility.md`), the conversion is lossless: if your data structure only uses values that are universally supported, say only numbers, strings, objects, and arrays in your JSON, then it can go back and forth between NBT, SNBT, and JSON without losing any information.
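
Here's a quick round-trip sketch of the `Blob` support: write a small compound tag to bytes, wrap those bytes in a `Blob`, and hand the `Blob` straight to `NBT.read()`. A minimal sketch, assuming the library's default write options.

```js
// @ts-check

import * as NBT from "nbtify";

// Write a tiny compound tag to bytes, then wrap the bytes in a Blob.
const bytes = await NBT.write({ greeting: "hello" });
const blob = new Blob([bytes]);

// read() accepts the Blob directly; no manual ArrayBuffer/Uint8Array conversion.
const data = await NBT.read(blob);
console.log(data.data); // { greeting: "hello" }
```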
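
And here's what writing a `level.dat`-style file looks like under the new rules: a `StorageVersion` int tag in the root, little-endian output, and `bedrockLevel` as a plain boolean. A minimal sketch, assuming the write options are named `endian`, `compression`, and `bedrockLevel`, and that `Int32` is the int tag wrapper.

```js
// @ts-check

import * as NBT from "nbtify";

// The StorageVersion int tag in the root is what the BedrockLevel header
// value is derived from now; 10 here is just an example version number.
const root = {
  StorageVersion: new NBT.Int32(10),
  LevelName: "My World"
};

// Bedrock's level format is uncompressed little-endian NBT with the header enabled.
const bytes = await NBT.write(root, {
  endian: "little",
  compression: null,
  bedrockLevel: true
});
console.log(bytes.byteLength);
```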
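
Finally, a minimal check of the scientific-notation fix (`parse()` being the SNBT entry point, per the Smooth API notes above): exponent values now come back as numbers instead of tripping up the parser.

```js
// @ts-check

import * as NBT from "nbtify";

// `e+`/`e-` values inside SNBT now parse as numbers.
const value = NBT.parse("{ distance: 1.5e+3, tiny: 2e-4 }");
console.log(value); // { distance: 1500, tiny: 0.0002 }
```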

archive/exp-pkg

Logging Depth / Space / Pipe Flags (merged)

Merge branch 'main' into exp-pkg

I haven't actually looked into what merge commits are or how they work before now. I think they are a bigger thing that could be very useful to me. It sounds like a merge commit sets a new starting point for a branch's diffs, in terms of where future commits are based. That is absolutely very nice, and it's how I'd hoped it would work. With my limited insight into what merge commits were, I thought merging a branch meant it was the end of that branch, and that no more work could be done on it. Thankfully, not!

https://stackoverflow.com/questions/57002121/how-to-merge-two-branchesmaster-and-demo-branch-code-in-git
https://www.biteinteractive.com/understanding-git-merge/

"My Philosophy of Discovery"

Listening to my projects in their full lossless form from my local file copies, rather than on Bandcamp through its player, and they are definitely, absolutely much better! I was kind of getting bummed that it wasn't as clear as it seemed when I was making it, but the source files are plenty clear, and I'm still happy with how those sound. And "Cuatro de la Manana" seemed like it was missing something for not having a bass track when listening to it on Bandcamp, but it still sounds great when listening to the full-quality version. I think I didn't end up recording one for it because I didn't want to crowd the mix with the already bigger overdub overlap, and now I think again that it was the right decision for that song.

I think my next goal is to properly finish the artwork for my other albums, so I can get them to YouTube, and also to download all of the final copies of the songs, so I can properly add them to my own music library. I'm glad I went with the limited song list for "so, many, bad, decisions" as well; it really brings up the song strength of the album. I think this and "Flatlands" are probably my two best albums to date. I'm really happy with how far things have come, and I only want to continue raising the bar on the quality of my work, while still trying new things. The next album has the hypothetical concept of being "jazzy death metal", but I don't want to force things out just to accomplish that. I want it to be however it happens.

Watched this the other day as well; a great one again! There are always things to learn from the Hevy Devy podcasts. The things brought up always tend to be similar to what I'm dealing with as well, and they come at the perfect time to relate to my own scenarios.

https://www.youtube.com/watch?v=1SHtIbmzsBw

On the "jazzy death metal" part, I am also going to try starting to write my own lyrics for things, proper lyrics as well, since all of my other attempts at it thus far have mostly been off the cuff ones, and it gets a bit messy. That's all part of the fun, I really like doing that. It is a new avenue to write something full and complete going into it though, I think lyrics may be a nice way at helping work on that. I struggle with going through on completing a whole idea that I had at the start of a new concept. Usually it turns into something new by the time I'm done with it. That's okay as well, and that's probably what's going to happen with the lyrics to be honest haha.

"Rotten Swine Ligament"