
Reading large data chunks into a table


  • Reading large data chunks into a table

    I'm using AMS to read 110,000 items of data from a text file into a table, then looping through the table to return specific items from that list.

    The 110,000 items are URL strings, which makes for a 6.6 MB text file that AMS has to first read into a table and then loop through in order to return the desired result.

    Currently, it takes AMS 25.6 seconds to read in the data and perform the table loop. Obviously, this is far too long. I need to get the return time under 5 seconds, and the data cannot be embedded as a table beforehand; it needs to remain remote.

    What are my best options for getting AMS to perform this task in an 'acceptable' time?
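
    For reference, the approach described above boils down to something like this minimal Lua sketch (the function names, file name, and search term are placeholders, not from the original post):

```lua
-- Hypothetical sketch of "read everything into a table, then loop".
-- load_urls reads every line of the file into a Lua table.
local function load_urls(path)
    local urls = {}
    for line in io.lines(path) do
        urls[#urls + 1] = line
    end
    return urls
end

-- find_matching then does a linear scan over that table, using a
-- plain (non-pattern) substring search for each entry.
local function find_matching(urls, needle)
    local matches = {}
    for _, url in ipairs(urls) do
        if string.find(url, needle, 1, true) then
            matches[#matches + 1] = url
        end
    end
    return matches
end
```

    With 110,000 lines, both the file read and the full-table scan are paid on every lookup, which is presumably where the 25.6 seconds go.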

  • #2
    Instead of using a text file, I would use an SQLite database, which can also be downloaded at runtime from a remote server, yet can be queried immediately after the transfer, without any delay for parsing and ordering.

    Ulrich

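    A minimal standalone sketch of this suggestion, assuming the lsqlite3 Lua binding as a stand-in for whatever SQLite access your AMS setup provides (the table and column names here are made up for illustration):

```lua
-- Sketch only: lsqlite3 stands in for the SQLite access layer;
-- inside AMS you would use its own SQLite facilities instead.
local sqlite3 = require("lsqlite3")

-- In-memory database for the demo; a real app would open the
-- downloaded file instead, e.g. sqlite3.open("urls.db").
local db = sqlite3.open_memory()
db:exec("CREATE TABLE urls (url TEXT)")

-- Load a few sample rows (stand-ins for the 110,000 URL strings,
-- which would already be inside the .db file you download).
local stmt = db:prepare("INSERT INTO urls VALUES (?)")
for _, u in ipairs({ "http://a.example.com", "http://b.other.org" }) do
    stmt:bind(1, u)
    stmt:step()
    stmt:reset()
end
stmt:finalize()

-- The lookup is then a single query: no file parsing and no
-- table loop at runtime.
local hits = {}
for row in db:nrows("SELECT url FROM urls WHERE url LIKE '%example%'") do
    hits[#hits + 1] = row.url
end
db:close()
```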


    • #3
      I was going to suggest changing the format to INI or XML and reading the data only when needed, but yeah, the SQLite option is the one I keep forgetting about.

      I posted once before, but it didn't go through.
      Plugins or Sources MokoX
      BunnyHop 2021 Here



      • #4
        Okay, thanks. I had a feeling SQLite would be the best route but was secretly hoping for a different option because I was having such a hard time wrapping my head around it.

        That's okay, it's all good learning, isn't it? I'll go back to those tutorial links for using SQLite which you suggested a few weeks ago, Ulrich. And work from there. Many thanks to both of you.



        • #5
          Originally posted by BioHazard:
          Okay, thanks. I had a feeling SQLite would be the best route but was secretly hoping for a different option because I was having such a hard time wrapping my head around it.

          That's okay, it's all good learning, isn't it? I'll go back to those tutorial links for using SQLite which you suggested a few weeks ago, Ulrich. And work from there. Many thanks to both of you.
          I have been using AMS for a **** of a long time now, and I am still learning a lot about what it can and can't do. For an everyday little app, I think it still has SO much more to offer.
          Plugins or Sources MokoX
          BunnyHop 2021 Here



          • #6
            This might be of interest to anyone else trying to work with very large data files:

            This is a pattern-matching example that goes through a data file of 43 megabytes. It took 3283 seconds to execute, using this code:
            Code:
            fh = io.open("ac_short")        -- the 43 MB log file
            running = true
            
            mcount = 0                      -- lines that matched the pattern
            count = 0                       -- total lines read
            
            while running do
                    line = fh:read()
                    if line ~= nil then
                            count = count + 1
                            -- one large pattern per line: capture the client
                            -- field (%S+) and the Google query string in a
                            -- single expensive match
                            n,m,ip,query = string.find(line,'(%S+).*"http.*%.google%..*[&?]q=(.-)[&"]')
                            if n ~= nil then
                                    mcount = mcount + 1
                                    if query ~= nil then
                                            print (count,mcount,ip,query)
                                    end
                            end
                    else
                            running = false
                    end
            end
            fh:close()
            taken = os.clock()
            print ("time taken in seconds - ",taken)
            ... and just 3.25 seconds (i.e. about 1,000 times faster!) using this code:

            Code:
            fh = io.open("ac_short")        -- the same 43 MB log file
            running = true
            
            mcount = 0                      -- lines that matched the pattern
            count = 0                       -- total lines read
            
            while running do
                    line = fh:read()
                    if line ~= nil then
                            count = count + 1
                            -- cheap pre-filter first: does the line
                            -- mention ".google." at all?
                            n,m = string.find(line,'%.google%.')
                            if n ~= nil then
                                    -- only then run the two expensive captures
                                    n,m,ip = string.find(line,'(%S+)')
                                    n,m,query = string.find(line,'"http.-[&?]q=(.-)[&"]')
                                    if query ~= nil then
                                            mcount = mcount + 1
                                            print (count,mcount,ip,query)
                                    end
                            end
                    else
                            running = false
                    end
            end
            fh:close()
            taken = os.clock()
            print ("time taken in seconds - ",taken)
            Quite remarkable, I think.

            The source article that elaborates on this code (very interesting reading), and from which this example came, can be found here.
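
            The same idea transfers directly to the URL-table problem at the top of the thread: run a cheap plain-text find first, and apply the expensive pattern capture only to lines that survive it. A small illustrative helper (the function name and patterns are mine, not from the article):

```lua
-- scan() guards an expensive pattern capture behind a cheap plain
-- find, mirroring the structure of the second (fast) example above.
local function scan(lines, guard, pattern)
    local results = {}
    for _, line in ipairs(lines) do
        -- plain=true makes string.find a raw substring search,
        -- which is much cheaper than pattern matching
        if string.find(line, guard, 1, true) then
            local capture = string.match(line, pattern)
            if capture then
                results[#results + 1] = capture
            end
        end
    end
    return results
end
```

            For a log line such as `GET "http://www.google.com/search?q=lua" HTTP`, calling `scan(lines, ".google.", '[&?]q=([^&%s"]+)')` extracts the query term, and only lines that actually mention Google pay for the pattern match.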
