EVE Technology Lab

 
Need some help: PHP JSON to MySQL

MI Alar
Cloaked Goof
Goonswarm Federation
#1 - 2016-07-06 00:25:07 UTC
I'm downloading the 9 pages with:

Quote:
copy('https://crest-tq.eveonline.com/market/10000002/orders/all/','big1.file');
copy('rest-tq.eveonline.com/market/10000002/orders/all/?page=2','big2.file');
copy('rest-tq.eveonline.com/market/10000002/orders/all/?page=3','big3.file');
copy('rest-tq.eveonline.com/market/10000002/orders/all/?page=4','big4.file');
copy('rest-tq.eveonline.com/market/10000002/orders/all/?page=5','big5.file');
copy('rest-tq.eveonline.com/market/10000002/orders/all/?page=6','big6.file');
copy('est-tq.eveonline.com/market/10000002/orders/all/?page=7','big7.file');
copy('est-tq.eveonline.com/market/10000002/orders/all/?page=8','big8.file');
copy('rest-tq.eveonline.com/market/10000002/orders/all/?page=9','big9.file');
?>


That is working.
This part I can't get to work:

Quote:
ini_set('memory_limit', '100M'); // due to memory error
//connect to mysql db
$con = mysql_connect("*****","****","****") or die('Could not connect: ' . mysql_error());
//select the database
mysql_select_db("*****", $con);

//read the json file contents
$jsondata = file_get_contents('big1.file');

//convert json object to php associative array
$data = json_decode($jsondata, true);
foreach ($data as $row)

//get the items details
$Buy = $row['items']['buy'];
$Issued = $row['items']['issued'];
$Price = $row['items']['price'];
$Duration = $row['items']['duration'];
$ID = $row['items']['id'];
$Minvolume = $row['items']['minvolume'];
$Volume_Entered = $row['items']['volumeEntered'];
$RangeR = $row['items']['range'];
$Station = $row['items']['station'];
$Type = $row['items']['type'];


//insert into mysql table
$sql = "INSERT INTO `Items`(`Buy`, `Issued`, `Price`, `Duration`, `ID`, `Minvolume`, `Volume Entered`, `RangeR`, `Station`, `Type`)
VALUES('$Buy', '$Issued', '$Price', '$Duration', '$ID', '$Minvolume', '$Volume_Entered', '$RangeR', '$Station', '$Type')";
echo $sql;

if(!mysql_query($sql,$con))
{
die('Error : ' . mysql_error());
}
?>


If I try to echo $Price it is empty. Any ideas?
Blacksmoke16
Imperial Academy
#2 - 2016-07-06 01:35:30 UTC  |  Edited by: Blacksmoke16
First thought that comes to mind: you know you have

rest-tq.eveonline.com/market/10000002/orders/all/

which doesn't actually work; you're missing the https://c of crest. It should be:

https://crest-tq.eveonline.com/market/10000002/orders/all/
MI Alar
Cloaked Goof
Goonswarm Federation
#3 - 2016-07-06 01:46:40 UTC
Blacksmoke16 wrote:
First thought that comes to mind: you know you have

rest-tq.eveonline.com/market/10000002/orders/all/

which doesn't actually work; you're missing the https://c of crest. It should be:

https://crest-tq.eveonline.com/market/10000002/orders/all/

The forums gave an error about http with the full URL.
Blacksmoke16
Imperial Academy
#4 - 2016-07-06 02:09:11 UTC
OK, I fixed it:

Your

foreach ($data as $row)

should be

foreach ($data['items'] as $row)

That says: do this for each object in the items array, not for each element of the whole response itself.

Also, you can just store the files as JSON files, like page1.json etc., since the CREST endpoint returns JSON.
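A rough, untested sketch of the fixed loop (field names copied from your snippet, so double-check them against the actual CREST payload; note the per-field lookups also drop the extra ['items'] level, since each $row is already a single order):

$jsondata = file_get_contents('page1.json'); // one saved CREST page
$data = json_decode($jsondata, true);

foreach ($data['items'] as $row) { // one $row per market order
    $Buy    = $row['buy'];
    $Issued = $row['issued'];
    $Price  = $row['price'];
    // ...the remaining fields the same way, then your INSERT...
    echo $Price . "\n"; // should no longer be empty
}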
Steve Ronuken
Fuzzwork Enterprises
Vote Steve Ronuken for CSM
#5 - 2016-07-06 12:37:05 UTC
Something to be aware of: this is quite a lot of data you're pulling, and databases tend to react badly to loading a lot of rows and then deleting them. (If you're truncating, that's another matter; see the one-liner below.)
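For reference, truncating with the mysql_* functions would be something like this (assumes the `Items` table and $con connection from the original post):

mysql_query("TRUNCATE TABLE `Items`", $con) or die('Error: ' . mysql_error());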

Aside from that, if you're not wanting to retain the data long term, why not just grab the files, keep them in memory, process them, then toss them? Yes, it'll increase your memory requirements a bit, but it's entirely doable. Just use something like curl to get the contents (it handles things like redirects better; copy isn't ideal for web stuff) and dump straight to a variable.
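A minimal, untested sketch of that approach (assumes the php-curl extension is enabled):

for ($page = 1; $page <= 9; $page++) {
    $ch = curl_init("https://crest-tq.eveonline.com/market/10000002/orders/all/?page=$page");
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true); // return the body as a string
    curl_setopt($ch, CURLOPT_FOLLOWLOCATION, true); // follow redirects, unlike copy()
    $jsondata = curl_exec($ch);
    curl_close($ch);

    $data = json_decode($jsondata, true);
    foreach ($data['items'] as $row) {
        // ...process / insert each order here...
    }
    // $jsondata and $data are simply overwritten on the next iteration
}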

Woo! CSM XI!

Fuzzwork Enterprises

Twitter: @fuzzysteve on Twitter