ant build file for compressing JSON output

Let us know if you have a nice R.U.B.E script to share!
tescott
Posts: 68
Joined: Wed Feb 06, 2013 6:32 pm

ant build file for compressing JSON output

Post by tescott » Wed Jul 17, 2013 2:00 am

The RUBE JSON output is still a bit too big for me, even with the latest changes from spaces to tabs. I figured the following build.xml / ant script might help someone else out there. You'll need to grab jq from https://github.com/stedolan/jq.

Example:
TutHazards.json (using hex floats)
original size: 221k
jq compact size: 119k

You obviously will need to update some of the settings to match your file list and installation locations.

--tim

Code: Select all

<project name="levelcrunch" default="all" basedir=".">
    <property environment="env" />
    <property name="jqloc" value="d:/projects/tools/jq"/>
    <property name="jq" value="${jqloc}/jq"/>
    
    <target name="compact">
        <apply executable="${env.ComSpec}" relative="true">
            <arg value="/c" />
            <arg line="${jq} --compact-output &quot;.&quot; &lt;"/>
            <srcfile />
            <arg line=">" />
            <targetfile />
            <fileset dir=".">
                <patternset>
                    <!-- List any levels for conversion here! -->
                    <include name="Tut*.json" />
                </patternset>
            </fileset>
            <mapper type="glob" from="*.json" to="../../baublebird-android/assets/data/levels/*.json" />
        </apply>
    </target>
    
    <target name="all" depends="compact" />
</project>
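For anyone who doesn't want to pull in jq, the same compaction can be sketched in plain Python (the function and file names here are just placeholders, not part of the Ant setup above):

```python
import json

def compact_json(src, dst):
    """Re-serialize a JSON file with all optional whitespace
    removed -- roughly what jq --compact-output "." does."""
    with open(src) as f:
        data = json.load(f)
    with open(dst, 'w') as f:
        # separators=(',', ':') drops the spaces json.dump
        # would otherwise emit after commas and colons
        json.dump(data, f, separators=(',', ':'))
```

Exact byte counts may differ slightly from jq's output, since the two serializers can format numbers differently.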

tescott
Posts: 68
Joined: Wed Feb 06, 2013 6:32 pm

Re: ant build file for compressing JSON output

Post by tescott » Wed Jul 17, 2013 11:29 pm

Well.... RUBE v1.4.0 allows you to completely pack things, so jq is not needed at all. Just go to Scene > Scene Settings... and uncheck the "Use indentation" option. Cool!!!

--tim

iforce2d
Site Admin
Posts: 860
Joined: Sat Dec 22, 2012 7:20 pm

Re: ant build file for compressing JSON output

Post by iforce2d » Thu Jul 18, 2013 4:17 am

Right. Actually seeing this thread tipped the balance for me... I began to think, it won't take so long to do that, and it really should have been an option to begin with.

tescott
Posts: 68
Joined: Wed Feb 06, 2013 6:32 pm

Re: ant build file for compressing JSON output

Post by tescott » Sun Jul 21, 2013 7:21 am

I'm probably getting pedantic here, but I was curious to see how the various size-reducing methods stacked up, and wanted to post those results here. Additionally, I've come up with another method for reducing .json size that gets things even smaller... at the expense of a guaranteed precision loss.

My test file is one of the levels I've developed for the game I'm working on. My scene stats script reports:

01:52:45: Scene statistics
01:52:45: ------------------
01:52:45: Body count: 3
01:52:45: - static: 3
01:52:45: - kinematic: 0
01:52:45: - dynamic: 0
01:52:45: Fixture count: 15
01:52:45: - Sensor: 1
01:52:45: Joint count: 0
01:52:45: Image count: 8

Doesn't seem like much, but I've got polygon radius set to 1 for all the fixtures which yields 391 fixtures at run-time.

My new size reduction method is somewhat similar to the compact method. Since my platformer-style game doesn't really require all that much precision, I created a perl script that simply rounds floats to 3 decimal places.

Example: 0.2000000029802322 translates into 0.200

For my test input, I use the .json produced with the human readable and compact options set.

Here's the perl script invocation:

Code: Select all

perl -pi -e 's/([-+]?[0-9]*\.[0-9]+)/sprintf("%.3f",$1)/ge' TutHowToFlyTallhumc.json
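For anyone who prefers Python over perl, here's a rough equivalent of that one-liner (like sprintf's %.3f, it rounds rather than truncates):

```python
import re

# Matches decimal float literals like 0.2000000029802322 or -1.5
FLOAT_RE = re.compile(r'[-+]?[0-9]*\.[0-9]+')

def round_floats(text, places=3):
    """Round every decimal float literal in the text to the
    given number of decimal places."""
    return FLOAT_RE.sub(lambda m: f"{float(m.group(0)):.{places}f}", text)

print(round_floats('"y" : 0.2000000029802322'))  # "y" : 0.200
```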
Given that, here are the results I'm seeing:

Original rube file: 37457
Human readable, compact off: 132719
Human readable, compact on: 126410
Hex floats: 115176
Human readable, compact on, perl processed: 104435

I don't have any solid numbers as far as load times go. My mobile deployment currently uses hex floats, which take a few extra operations per float to decode. I'm curious to learn if there's any noticeable improvement using the perl-processed file. My gut tells me it would only become apparent for extremely large files. At a guess, the current hex-float load for this file takes roughly 3 seconds.
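Out of curiosity about where those extra per-float operations go: if RUBE's hex floats are the raw 32-bit IEEE-754 bit pattern written as an 8-character hex string (that's an assumption on my part; the b2dJson source is the authority), the decode looks like this in Python:

```python
import struct

def hex_to_float(s):
    """Decode an 8-character hex string as the raw bit pattern
    of a big-endian 32-bit IEEE-754 float. NOTE: this encoding
    is an assumption about RUBE's hex-float format -- check
    b2dJson's source to confirm."""
    return struct.unpack('>f', bytes.fromhex(s))[0]

print(hex_to_float('3f800000'))  # 1.0
```

A plain decimal literal, by contrast, parses in one string-to-float conversion, which is presumably where any speed difference would come from.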

Thought others might be interested.

If / when I take some timing measurements, I'll follow-up here.

--tim
