Small Software Project -> Logging!
#1
Elite Member
Thread Starter
iTrader: (3)
Join Date: Aug 2006
Location: San Diego, CA
Posts: 3,047
Total Cats: 12
Small Software Project -> Logging!
cjernigan, I have been boosting, gently. I haven't yet installed EBC, and I've got my MBC bypassed, so I'm running on base can pressure, about 5 to 6 PSI. I haven't really romped on it much since the VE table still needs a good dyno session, but I have been boosting mildly. I'm really impressed by how totally seamless the transition into boost is with this setup relative to the EMU: no sudden jump in AFR, just a nice, smooth curve.
Any ideas? They are just .xls, and you have to do a small amount of tweaking (remove/renumber the manual marks/resets, and maybe give the timestamps a touch-up).
#4
Elite Member
iTrader: (1)
Join Date: Jun 2006
Location: Warrington/Birmingham
Posts: 2,642
Total Cats: 42
There is a benefit to longer files, but only to a point.
#12
Elite Member
Thread Starter
iTrader: (3)
Join Date: Aug 2006
Location: San Diego, CA
Posts: 3,047
Total Cats: 12
Joe, we can't just be expected to go clicking things at random... I was a big autotune fan, then I just somehow never used it. :-) The point is I have a lot of logs already in existence that I want to use, covering a lot of ranges. Also, have you seen the weird things auto-tune/auto-analyze will do sometimes?
Nonetheless, there are a few reasons to do this. Also, it makes a huge log file so I can look for certain 'events of interest'. So in an ideal world, it would bring the marks and resets with it.
Does it? It didn't for me! There really isn't much header, but skipping that line, it seems to get mad about the RESETs and MARKs not being sequential.
Oh, I'm sure there's a point to longer files: more averaging. Also, I hate to see only 15% of my cells change.
And, yeah, my logs, even hour-long drives, take a few seconds to run (on a decent quad-core machine), but even on work's crappy laptop a typical 20-minute log is still a matter of a couple tens of seconds at most. You need to figure out why yours is running slow; something is wrong.
Oh! There's an answer I like. I couldn't find it on ten-seconds-at-google, but I'll keep digging.
Interesting: DIY says it's "outdated, use MLV now".
#13
Boost Pope
iTrader: (8)
Join Date: Sep 2005
Location: Chicago. (The less-murder part.)
Posts: 33,457
Total Cats: 6,874
I just stitched two logs together in Excel and MegaLog digested the result with no complaints.
First, I made sure that there was only one header, by copying and pasting only row 3 and down from each subsequent log onto the end of the first log.
Next, I searched for "mark" and deleted all rows containing one.
Last, I found the point at which each new log segment began (I'd previously highlighted the last line in each segment before pasting the next segment in) and adjusted the time (column 1) for each log segment by adding the stated time to a constant which was the last stated time of the previous log. (I didn't touch the 0-255 counter, only the realtime.)
Then I saved the result as text (tab delimited), changed the extension to .xls, and loaded it into MLV.
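For anyone who'd rather script Joe's recipe than do the copy/paste in Excel, here's a rough Python sketch of the same steps. It assumes tab-delimited logs with a two-line header and the realtime clock in column 1, as described above; the function name is mine, not anything from MLV.

```python
def combine_logs(logs):
    """Stitch tab-delimited MS logs: one header, no MARK rows, running clock."""
    out = []
    offset = 0.0   # added to every timestamp in the current segment
    last = 0.0     # last timestamp written out
    for i, text in enumerate(logs):
        for j, line in enumerate(text.splitlines()):
            if j < 2:              # two-line header: keep the first log's only
                if i == 0:
                    out.append(line)
                continue
            if "MARK" in line or not line.strip():
                continue           # drop mark rows and blank lines
            fields = line.split("\t")
            last = float(fields[0]) + offset
            fields[0] = f"{last:g}"
            out.append("\t".join(fields))
        offset = last              # next segment continues from here
    return "\n".join(out)
```

Feed it the logs oldest-first, then save the returned text as tab-delimited with an .xls extension, same as Joe does by hand.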
#15
Elite Member
Thread Starter
iTrader: (3)
Join Date: Aug 2006
Location: San Diego, CA
Posts: 3,047
Total Cats: 12
I asked the guy who writes MLV and he said he gets the request a lot. Mine's the newest, generally, since it auto-updates. I'd like to make a script/macro to do it, though my Excel skills aren't all that. Ideally it would re-enumerate the marks too, but that's likely too much to ask for. It's just kind of a chore to do it by hand!
-Abe.
#17
Elite Member
iTrader: (11)
Join Date: Jun 2007
Location: Overland Park, Kansas
Posts: 5,360
Total Cats: 43
It's a nice idea, but I did a datalog of 30-ish miles (just from home to work) and then ran the analysis on my desktop PC (2.5 GHz dual core, 2 GB RAM, etc.), and it properly chugged along; it took about 20 minutes to analyze the file.
There is a benefit to longer files, but only to a point.
#18
Boost Pope
iTrader: (8)
Join Date: Sep 2005
Location: Chicago. (The less-murder part.)
Posts: 33,457
Total Cats: 6,874
Say what you will, but the Vaio TX and TZ-class machines are just about the awesomest notebook since the Tandy 100. Sure they ain't the fastest thing in the universe, but they're smaller than hustler's dick and yet they still have built-in optical drives.
#20
Takes a list of log files on the command line. Strips out marks. Adds an offset to the timestamp on later logs. Dumps the output to the console, so you have to pipe it into a new file.
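Assuming the executable keeps the name of the zip (my guess, not confirmed), running it would look something like this; the redirect is the "pipe it into a new file" part:

```shell
# combine two logs, oldest first; output goes to stdout, so redirect it
MegaLogCombine.exe drive1.xls drive2.xls > combined.xls
```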
Requires .NET 2.0 Runtime
Zip:
MegaLogCombine.zip
-Mike

Code:
using System;
using System.Collections.Generic;
using System.Text;
using System.IO;

namespace MegaLogCombine
{
    class Program
    {
        static void Main( string[] args )
        {
            double timeStampOffset = 0;
            double currentTimeStamp = 0;
            int iLog = 0;
            int fieldCount = 0;

            foreach ( string arg in args )
            {
                string[] lines = File.ReadAllLines( arg, Encoding.ASCII );
                for ( int iLine = 0; iLine < lines.Length; ++iLine )
                {
                    string line = lines[ iLine ];

                    // strip out the first two lines of all logs besides the first
                    if ( iLine < 2 )
                    {
                        if ( iLog == 0 )
                            Console.WriteLine( line );
                        continue;
                    }

                    // Don't want mark lines
                    if ( line.Contains( "\"MARK" ) )
                        continue;

                    // don't do empty lines
                    if ( line.Trim().Length == 0 )
                        continue;

                    string[] fields = line.Split( '\t' );

                    if ( fieldCount == 0 )
                        fieldCount = fields.Length;
                    else if ( fieldCount != fields.Length )
                        throw new System.FormatException( String.Format(
                            "Found {0} fields, expected {1} fields, on line: {2}",
                            fields.Length, fieldCount, line ) );

                    double timeStamp;
                    if ( !double.TryParse( fields[ 0 ], out timeStamp ) )
                        throw new System.FormatException( String.Format(
                            "Failed to parse timestamp from line {0}", line ) );

                    currentTimeStamp = timeStamp + timeStampOffset;
                    fields[ 0 ] = currentTimeStamp.ToString();

                    StringBuilder builder = new StringBuilder();
                    for ( int i = 0; i < fields.Length; ++i )
                    {
                        if ( i > 0 )
                            builder.Append( '\t' );
                        builder.Append( fields[ i ] );
                    }
                    Console.WriteLine( builder.ToString() );
                }

                // later logs continue from where this one left off
                timeStampOffset = currentTimeStamp;
                ++iLog;
            }
        }
    }
}