
Thread: [python 3] Web scraping GE prices [Bs4]

  1. #1
    Join Date
    Jan 2013
    Posts
    146
    Mentioned
    0 Post(s)
    Quoted
    56 Post(s)

    Default [python 3] Web scraping GE prices [Bs4]

    I saw some code Garret wrote for a merchant aid script and thought it was really cool.

    Thought it would be a good project to work on in Python to learn web scraping.

    So far my script pulls the list of the top 100 items from RuneScape, then goes through that list and pulls the GE price of each item from runescape.wikia.

    The trouble I'm running into at the moment is that I'm not able to get the GE price as an integer; instead I get it as a string with [] brackets around it.

    So instead of:
    Code:
    Spirit shards
    23
    Coal
    431
    I'm getting:
    Code:
    Spirit shards
    [23]
    Coal
    [431]
    Fire rune
    [71]
    Air rune
    [24]
    python Code:
    from bs4 import BeautifulSoup
    import requests, time, openpyxl, lxml
    from openpyxl.cell import get_column_letter

    top100 = 'http://services.runescape.com/m=itemdb_rs/top100'
    wikiroot = 'http://runescape.wikia.com/wiki/'

    # Class ="content"
    # <tbody>
    #<a class="table-item-link">

    class Portfolio():
        def __init__(self):
            self.file = ('Portfolio.xlsx')
            self.wb = openpyxl.load_workbook(self.file)
            self.sheet = self.wb.get_sheet_by_name('Sheet1')

    class SoupObject():
        def __init__(self, url):
            self.url = url
        def setup(self):
            self.res = requests.get(self.url)
            self.page = BeautifulSoup(self.res.text, 'lxml')
        def search_top100(self):
            self.items = self.page.select('a.table-item-link')

    class Item():
        def __init__(self, name):
            self.name = name
            self.price = 0

        # builds hyper link for wikia page  
        def build_link(self):
            space = ' '
            wikiend = (ltr if ltr not in space else '_' for ltr in self.name)
            self.wikilink = wikiroot + ''.join(wikiend)
           
           
    rstop100 = SoupObject(top100)
    rstop100.setup()
    rstop100.search_top100()
    tempitem = Item('Temp')


    def grab_prices(iten):
        for item in rstop100.items:
            iten.name = item.get('title')
            iten.build_link()
            res = requests.get(iten.wikilink)
            page = BeautifulSoup(res.text, 'lxml')
            dirty_price = page.find_all(class_="GEItem")   # find_all returns a list of tags
            dp = BeautifulSoup(str(dirty_price), 'lxml')   # str() of a list keeps the [] brackets
            price = dp.text
            print(iten.name)
            print(price)
           
            time.sleep(.4)

    grab_prices(tempitem)


  2. #2
    Join Date
    Apr 2013
    Posts
    680
    Mentioned
    13 Post(s)
    Quoted
    341 Post(s)

    Default

    I have not written in Python in a long time, but I think I have resolved an issue like this before.

    Convert the string to a float first, then an integer?

    Code:
    a = '71'
    float(a)
    71.0
    int(float(a))
    71
    This might be out of my league, but I will be following in your footsteps next week when I make my merchant script for OSRS.




  3. #3
    Join Date
    Jan 2013
    Posts
    146
    Mentioned
    0 Post(s)
    Quoted
    56 Post(s)

    Default

    Quote Originally Posted by AFools View Post
    I have not written in Python in a long time, but I think I have resolved an issue like this before.

    Convert the string to a float first, then an integer?

    Code:
    a = '71'
    float(a)
    71.0
    int(float(a))
    71
    This might be out of my league, but I will be following in your footsteps next week when I make my merchant script for OSRS.
    Thanks, lol it might be out of my league but I'm gonna try anyway.
    I tried your solution just in case; it didn't work. What it's reading would be a = '[71]', like the brackets are part of the string, so I can't convert it into a float or integer.



    I was thinking I could do a list comprehension like I did to build the links to the wiki pages, but I'm not very good with them, so I couldn't figure out how to make it work.


    I think I figured it out with this:
    python Code:
    def grab_prices(iten):
        for item in rstop100.items:
            iten.name = item.get('title')
            iten.build_link()
            res = requests.get(iten.wikilink)
            page = BeautifulSoup(res.text, 'lxml')
            dirty_price = page.find_all(class_="GEItem")
            dp = BeautifulSoup(str(dirty_price), 'lxml')
            still_dirty = dp.text
            price = (i for i in list(still_dirty) if i not in '[]')   # drop the bracket characters
            price = ''.join(price)
            print(iten.name)
            print(price)

    I'm not really happy with it this way... any ideas on how I can make this a way cleaner process?


  4. #4
    Join Date
    Apr 2013
    Posts
    680
    Mentioned
    13 Post(s)
    Quoted
    341 Post(s)

    Default

    I'm sure one of the bigwigs will come by and offer a solution, making me look like an 'ant' =P

    But in the meantime, I like to explore these problems...

    http://stackoverflow.com/questions/4...-from-a-string

    Notably the part where:

    Code:
    s = "h3110 23 cat 444.4 rabbit 11 2 dog"
    [int(x) for x in s.split() if x.isdigit()]
    [23, 11, 2]




  5. #5
    Join Date
    Jan 2013
    Posts
    146
    Mentioned
    0 Post(s)
    Quoted
    56 Post(s)

    Default

    Quote Originally Posted by AFools View Post
    I'm sure one of the bigwigs will come by and offer a solution, making me look like an 'ant' =P

    But in the meantime, I like to explore these problems...

    http://stackoverflow.com/questions/4...-from-a-string

    Notably the part where:

    Code:
    s = "h3110 23 cat 444.4 rabbit 11 2 dog"
    [int(x) for x in s.split() if x.isdigit()]
    [23, 11, 2]
    Good find, that's definitely gonna come in handy.

    It wouldn't split (I think because there are no spaces?), but I think I figured it out for the moment by turning it into a list and going through it. It seems like way too many steps for something so simple.
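
    For instance, there's nothing for split() to split on, and the brackets make isdigit() reject the whole token:
    python Code:
    >>> '[71]'.split()
    ['[71]']
    >>> '[71]'.isdigit()
    False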


    Another hurdle in the process I still haven't figured out is how to get prices from days in the past. At the moment I've only been able to figure out how to grab the current price on the GE.


  6. #6
    Join Date
    Apr 2013
    Posts
    680
    Mentioned
    13 Post(s)
    Quoted
    341 Post(s)

    Default

    Well, for OSRS I would more than likely use the OSBuddy API if possible. OSBuddy or something similar for RS3?

    "isdigit" should grab all integers; I would check whether it returns floats, as some of the prices you will grab will be 374.XX M.

    As for storage? Just an idea: make your own history. Have the data output to an Excel file (or similar); after a month you will have your data set.
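
    Something like this, maybe (a minimal sketch of the logging idea using openpyxl; the file name and column layout are just placeholders):
    python Code:
    # Hypothetical daily price logger - appends one (date, item, price) row per call
    import datetime
    import openpyxl

    def log_price(path, item_name, price):
        try:
            wb = openpyxl.load_workbook(path)
        except FileNotFoundError:
            wb = openpyxl.Workbook()   # first run: start a fresh workbook
        sheet = wb.active
        sheet.append([datetime.date.today().isoformat(), item_name, price])
        wb.save(path)

    log_price('Portfolio.xlsx', 'Coal', 431)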

    This is a pain, but it offers greater ability to manipulate the data set for variance and volume. Volume is the key to trading financial markets, and the same premise carries over to RuneScape's GE.

    https://villavu.com/forum/showthread...35#post1335135


    To put it back into real life perspective?

    https://www.mql5.com/en/code/mt4/experts




  7. #7
    Join Date
    Jan 2013
    Posts
    146
    Mentioned
    0 Post(s)
    Quoted
    56 Post(s)

    Default

    Lol, it looks like your inspiration is similar to mine.
    From what I've seen on the RS wiki it looks like all the prices are exact GE prices and not rounded, so I don't think floats will be a problem.

    That's what I was thinking, hence the Portfolio class I started. I just thought it would be nice if I could grab it all at once; that way you could just pick it up and go, and it would be useful in case you miss a day of information.


  8. #8
    Join Date
    Apr 2013
    Posts
    680
    Mentioned
    13 Post(s)
    Quoted
    341 Post(s)

    Default

    Quote Originally Posted by Saint//+ View Post
    Lol, it looks like your inspiration is similar to mine.
    From what I've seen on the RS wiki it looks like all the prices are exact GE prices and not rounded, so I don't think floats will be a problem.

    That's what I was thinking, hence the Portfolio class I started. I just thought it would be nice if I could grab it all at once; that way you could just pick it up and go, and it would be useful in case you miss a day of information.
    Did you look through the source code for the page?

    view-source:http://services.runescape.com/m=itemdb_rs/Purple_partyhat/viewitem?obj=1046

    line 275 = average30.push([new Date('2016/01/02'), 2109839339, 2136800938]); - we already have a discrepancy (which means nothing to me atm, as I don't know the exact price to compare against).
    Could this be the average buy/sell?

    Could someone who plays RS3 provide the exact price of a Purple PHAT through GE?




  9. #9
    Join Date
    Feb 2012
    Location
    Norway
    Posts
    995
    Mentioned
    145 Post(s)
    Quoted
    596 Post(s)

    Default

    I think this should work fine:
    python Code:
    #!python
    from bs4 import BeautifulSoup
    import requests, time

    top100 = 'http://services.runescape.com/m=itemdb_rs/top100'
    wikiroot = 'http://runescape.wikia.com/wiki/'

    class SoupObject():
        def __init__(self, url):
            self.url = url
        def setup(self):
            self.res = requests.get(self.url)
            self.page = BeautifulSoup(self.res.text, 'html.parser')   # stdlib parser, since lxml isn't installed
        def search_top100(self):
            self.items = self.page.select('a.table-item-link')

    class Item():
        def __init__(self, name):
            self.name = name
            self.price = 0

        # builds hyper link for wikia page  
        def build_link(self):
            wikiend = self.name.replace(' ', '_')
            self.wikilink = wikiroot + wikiend
           
           
    rstop100 = SoupObject(top100)
    rstop100.setup()
    rstop100.search_top100()
    tempitem = Item('Temp')


    def grab_prices(item):
        for node in rstop100.items:
            item.name = node.get('title')
            item.build_link()
            res = requests.get(item.wikilink)
            page = BeautifulSoup(res.text, 'html.parser')
            price = page.find(class_="GEItem").find(text=True)   # first text node - no list, so no brackets
           
            print(item.name)
            print(price)     #int(price) to convert it to integer now..
           
            time.sleep(.4)

    grab_prices(tempitem)

    I removed some (unused) pieces of code - I don't have lxml & openpyxl installed.

  10. #10
    Join Date
    Jan 2013
    Posts
    146
    Mentioned
    0 Post(s)
    Quoted
    56 Post(s)

    Default

    Nice, definitely a lot cleaner that way. Thanks @slacky!


    @AFools, I've seen those too. I'm not sure if I'm currently capable of figuring out the math I'd need to find the prices with those; definitely something I'm looking into.


  11. #11
    Join Date
    Dec 2007
    Posts
    289
    Mentioned
    4 Post(s)
    Quoted
    86 Post(s)

    Default

    Is there a reason you're not using the GE API?
    http://services.runescape.com/m=rswi..._Exchange_APIs

    Example:
    http://services.runescape.com/m=item...json?item=4798
    Code:
    {
      "item": {
        "icon": "http://services.runescape.com/m=itemdb_rs/5060_obj_sprite.gif?id=4798",
        "icon_large": "http://services.runescape.com/m=itemdb_rs/5060_obj_big.gif?id=4798",
        "id": 4798,
        "type": "Ammo",
        "typeIcon": "http://www.runescape.com/img/categories/Ammo",
        "name": "Adamant brutal",
        "description": "Blunt adamantite arrow...ouch",
        "current": {"trend": "neutral", "price": 319},
        "today": {"trend": "neutral", "price": 0},
        "members": "true",
        "day30": {"trend": "positive", "change": "+2.0%"},
        "day90": {"trend": "positive", "change": "+7.0%"},
        "day180": {"trend": "positive", "change": "+22.0%"}
      }
    }
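
    Pulling the current price out of that response is then trivial (a sketch; requests' .json() hands you a plain dict):
    python Code:
    import requests

    # GE item detail endpoint - 4798 is the Adamant brutal from the example above
    url = 'http://services.runescape.com/m=itemdb_rs/api/catalogue/detail.json'
    data = requests.get(url, params={'item': 4798}).json()
    print(data['item']['name'])               # Adamant brutal
    print(data['item']['current']['price'])   # 319 at the time of this example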

  12. #12
    Join Date
    Jan 2013
    Posts
    146
    Mentioned
    0 Post(s)
    Quoted
    56 Post(s)

    Default

    Quote Originally Posted by honeyhoney View Post
    Is there a reason you're not using the GE API?
    http://services.runescape.com/m=rswi..._Exchange_APIs

    Example:
    http://services.runescape.com/m=item...json?item=4798
    Code:
    {
      "item": {
        "icon": "http://services.runescape.com/m=itemdb_rs/5060_obj_sprite.gif?id=4798",
        "icon_large": "http://services.runescape.com/m=itemdb_rs/5060_obj_big.gif?id=4798",
        "id": 4798,
        "type": "Ammo",
        "typeIcon": "http://www.runescape.com/img/categories/Ammo",
        "name": "Adamant brutal",
        "description": "Blunt adamantite arrow...ouch",
        "current": {"trend": "neutral", "price": 319},
        "today": {"trend": "neutral", "price": 0},
        "members": "true",
        "day30": {"trend": "positive", "change": "+2.0%"},
        "day90": {"trend": "positive", "change": "+7.0%"},
        "day180": {"trend": "positive", "change": "+22.0%"}
      }
    }
    I didn't even know it existed.

    edit:

    This seems like a weird way to keep track of the date:
    Code:
    {"daily": {"Datecode": Price, "Datecode2": Price2, ...}}
    where Datecode is the number of milliseconds elapsed since January 1st, 1970 and Price is the item's market price for that day.

    edit2:

    I found where the information I want is in the source of the item page; I'm just not sure how to select only the information I want (average180).

    Code:
    <script>
                    function scaleLabelFilter(number) {
                        var decPlaces = Math.pow(10,1), abbrev = ["k", "m", "b", "t"];
                        for (var i=abbrev.length-1; i>=0; i--) {
                            var size = Math.pow(10,(i+1)*3);
                            if(size <= number) {
                                number = Math.round(number*decPlaces/size)/decPlaces;
                                if((number == 1000) && (i < abbrev.length - 1)) {
                                    number = 1;
                                    i++;
                                }
                                number += abbrev[i];
                                break;
                            }
                        }

                        return number;
                    }
                    var average30 = [['Date','Daily','Average']], average90 = [['Date','Daily','Average']], average180 = [['Date','Daily','Average']], trade30 =[['Date','Total']], trade90 =[['Date','Total']], trade180 =[['Date','Total']];
                    average30.push([new Date('2015/12/08'), 779, 813]);
                    average30.push([new Date('2015/12/09'), 779, 810]);
                    // ... one average30.push per day, trimmed for length ...
                    average30.push([new Date('2016/01/06'), 794, 783]);
                    average90.push([new Date('2015/10/09'), 842, 858]);
                    // ... one average90.push per day, trimmed for length ...
                    average90.push([new Date('2016/01/06'), 794, 783]);
                    average180.push([new Date('2015/07/11'), 913, 971]);
                    // ... one average180.push per day, trimmed for length ...
                    average180.push([new Date('2016/01/06'), 794, 783]);
    </script>


  13. #13
    Join Date
    Dec 2007
    Posts
    289
    Mentioned
    4 Post(s)
    Quoted
    86 Post(s)

    Default

    Quote Originally Posted by Saint//+ View Post
    This seems like a weird way to keep track of the date:
    Code:
    {"daily": {"Datecode": Price, "Datecode2": Price2, ...}}
    where Datecode is the number of milliseconds elapsed since January 1st, 1970 and Price is the item's market price for that day.
    That's the graphing data API. It's quite common to store a value keyed against a timestamp.
    Here's an example from the first value here: http://services.runescape.com/m=item...graph/444.json
    "1436572800000":176 means that 1436572800000 milliseconds after Jan 1st 1970 (commonly referred to as the "Epoch") Gold ore (item id 444) was valued at 176. If we convert this timestamp to a human-readable format, it turns out that 1436572800000ms since epoch is Saturday, 11-Jul-15 00:00:00 UTC.
    If you take a look at the next value in the JSON you'll find that 1436659200000ms since epoch is Sunday, 12-Jul-15 00:00:00 UTC and the item was valued at 172. Neat huh?

    The timestamps used by Jagex are in milliseconds; typically these timestamps are in seconds, so you may have to snip off the last three 0's to use the widely available functions.
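
    In Python that conversion is a one-liner (using the value from above):
    python Code:
    import datetime

    ts = 1436572800000                                    # ms since epoch, from the graph JSON
    print(datetime.datetime.utcfromtimestamp(ts / 1000))  # 2015-07-11 00:00:00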


    Quote Originally Posted by Saint//+ View Post
    I found where the information I want is in the source of the item page; I'm just not sure how to select only the information I want (average180).
    You can use the graphing API to retrieve the average price per day for the last 180 days and calculate it from there.
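
    Something along these lines (a sketch; assumes the graph endpoint is m=itemdb_rs/api/graph/<id>.json and that its "daily" map covers roughly the last 180 days):
    python Code:
    import requests

    # Fetch the price graph for Gold ore (444) and average the daily series
    data = requests.get('http://services.runescape.com/m=itemdb_rs/api/graph/444.json').json()
    daily = data['daily']                             # {"ms-timestamp": price, ...}
    print(round(sum(daily.values()) / len(daily)))    # ~180-day average price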

  14. #14
    Join Date
    Feb 2012
    Location
    Norway
    Posts
    995
    Mentioned
    145 Post(s)
    Quoted
    596 Post(s)

    Default

    Quote Originally Posted by Saint//+ View Post
    This seems like a weird way to keep track of the date:
    Code:
    {"daily": {"Datecode": Price, "Datecode2": Price2, ...}}
    where Datecode is the number of milliseconds elapsed since January 1st, 1970 and Price is the item's market price for that day.
    This is known as time since the Unix epoch (01/Jan/1970 00:00:00):
    Code:
    //python
    import time
    print(time.time())
    
    //php
    print time();
    
    //C
    int main () {
      time_t t = time(NULL);
      printf("%ld\n", t);
      return 0;
    }
    
    ... etc
    Most C-inspired/derived languages support this, and also use it as their standard way to measure time.

    Pascal deviates from this and uses "TDateTime", which stores time since 30/Dec/1899; Pascal-derived languages tend to do their own little thing.

  15. #15
    Join Date
    Nov 2011
    Location
    England
    Posts
    3,072
    Mentioned
    296 Post(s)
    Quoted
    1094 Post(s)

    Default

    Quote Originally Posted by slacky View Post
    Very normal to use; known as time since the Unix epoch (01/Jan/1970 00:00:00):
    Code:
    //python
    from time import time
    print time()
    
    //php
    print time()
    
    //C
    int main () {
      time_t t = time(NULL);
      printf("%ld\n", t);
      return 0;
    }
    
    ... etc
    As I said, very normal.
    2038 will come sooner rather than later!

  16. #16
    Join Date
    Feb 2012
    Location
    Norway
    Posts
    995
    Mentioned
    145 Post(s)
    Quoted
    596 Post(s)

    Default

    Quote Originally Posted by Olly View Post
    2038 will come sooner rather than later!
    Assuming time is stored as an Int32. Python's time() returns a double, so I am good =)
    But surely by that time most existing systems will have been upgraded.
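
    For the curious, you can compute the rollover date yourself:
    python Code:
    import datetime

    # Largest signed 32-bit value, read as seconds since the epoch
    print(datetime.datetime.utcfromtimestamp(2**31 - 1))   # 2038-01-19 03:14:07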

  17. #17
    Join Date
    Jan 2013
    Posts
    146
    Mentioned
    0 Post(s)
    Quoted
    56 Post(s)

    Default

    Quote Originally Posted by honeyhoney View Post
    You can use the graphing API to retrieve the average price per day for the last 180 days and calculate it from there.
    I'm going to have to do some research and learn more about JSON and how to use the API.
    As a temporary patch, so I can get my script moving forward and start working on the math part, I took that chunk of JavaScript code as a string and sliced and diced until I was able to take each piece and compile it into a dictionary.
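
    Roughly like this (a sketch of that idea, with a regex doing the slicing instead of manual string surgery):
    python Code:
    import re

    # Matches lines like: average180.push([new Date('2015/07/11'), 913, 971]);
    pattern = re.compile(r"average180\.push\(\[new Date\('([\d/]+)'\), (\d+), (\d+)\]\);")

    def parse_average180(page_source):
        prices = {}
        for date, daily, average in pattern.findall(page_source):
            prices[date] = {'daily': int(daily), 'average': int(average)}
        return prices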

    Quote Originally Posted by slacky View Post
    This is known as time since the Unix epoch (01/Jan/1970 00:00:00):
    Code:
    //python
    import time
    print(time.time())
    
    //php
    print time();
    
    //C
    int main () {
      time_t t = time(NULL);
      printf("%ld\n", t);
      return 0;
    }
    
    ... etc
    Most C-inspired/derived languages support this, and also use it as their standard way to measure time.

    Pascal deviates from this and uses "TDateTime", which stores time since 30/Dec/1899; Pascal-derived languages tend to do their own little thing.

    Interesting, never knew that.

    I guess Pascal is for the rebellious.


  18. #18
    Join Date
    Dec 2007
    Posts
    174
    Mentioned
    0 Post(s)
    Quoted
    43 Post(s)

    Default

    Not sure if this will be of any use, but here is what I use to build my item list for my Discord bot.
    Code:
    import requests
    import time
    
    # Set the base URL for the Grand Exchange API
    base_url = "http://services.runescape.com/m=itemdb_rs/api/catalogue/detail.json?"
    
    # Define a function to query the API for information about a specific item
    def get_item_info(item_id):
        # Build the API request URL
        request_url = base_url + "item=" + str(item_id)
    
        # Send the request and retrieve the response
        response = requests.get(request_url)
    
        # Check if the request was successful
        if response.status_code == 200:
            # If the request was successful, return the data from the response as a Python dictionary
            return response.json()
        else:
            # If the request was unsuccessful, return None
            return None
    
    # Initialize an empty list to store the responses
    responses = []
    
    # Iterate over a range of item IDs
    for item_id in range(1, 55000):
        # Add a 4-second delay between requests to avoid exceeding the rate limit
        time.sleep(4)
    
        # Print the item ID being queried
        print(f"Querying item ID {item_id}...")
    
        # Query the API for information about the current item
        item_info = get_item_info(item_id)
    
        # Check if the item information was successfully retrieved
        if item_info is not None:
            # If the item information was successfully retrieved, check if the item has a name
            if "name" in item_info["item"]:
                # If the item has a name, add the item ID and name to the list of responses
                responses.append({
                    "item_id": item_id,
                    "name": item_info["item"]["name"]
                })
                #append the item id and name to ItemList.txt
                with open("ItemList.txt", "a") as f:
                    f.write(str(item_id) + " " + item_info["item"]["name"] + "\n")
            else:
                # If the item does not have a name, print an error message
                print(f"Error: No name found for item ID {item_id}")
        else:
            # If the item information was not successfully retrieved, print an error message
            print(f"Error: Failed to retrieve information for item ID {item_id}")
    
    
    
    # Print the list of responses
    print(responses)
    And here is the GitHub link where you can grab the item list as well.

    https://github.com/westkevin12/Runescape-3-Item-ID-List
