I have to download, daily and over HTTP, a .gz archive that contains a single XML file, in order to use that file in my PHP application. So far I have tried several approaches, both reading directly from the URL and downloading the archive to the server first and then extracting it locally, but none of them works:
Note that I didn't put the real URL.
1.
file_put_contents("test.xml",
    simplexml_load_file("compress.zlib://5699.xml.gz"));
It creates an XML file with about 44k rows (the original XML has 420k rows), but all the rows are empty.
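My guess (an assumption, not something I have confirmed in the docs) is that file_put_contents() casts the SimpleXMLElement returned by simplexml_load_file() to a string, and that cast keeps only the root element's own text nodes, dropping all the markup — which would explain the empty rows. A minimal sketch of that behaviour, using a tiny inline document instead of the real feed:

```php
<?php
// Sketch (my assumption): casting a SimpleXMLElement to string returns
// only the direct text nodes of the element — the child tags are gone,
// leaving just the whitespace between them.
$xml = simplexml_load_string("<root>\n  <item>hello</item>\n</root>");

var_dump((string)$xml);   // only the whitespace between the tags
var_dump($xml->asXML());  // the full document, markup included
```

If that is right, writing $xml->asXML() instead of the bare object would keep the markup.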
2.
$stringCompressed = file_get_contents("http://feeds.website.uk/5699.xml.gz");
echo $stringCompressed;
$xmlString = gzuncompress($stringCompressed);
file_put_contents("test.xml", simplexml_load_string($xmlString));
Here the echo prints out something like this (raw binary data):

‹£:Z/tmp/feed/final/matchinjobs/jobs_IT_5699.xmlìýÙn$YšŠÞë)¶2p:mÕÕ°1Ò»8•;#R•7‚ÑÝHZ…»ËÍQôEûæûRº 8- ±qî„#ào’/°_áüÃZfËœîœ2ÄlQC%ƒ´aÙ2[ßúÇïûÝßþu6ÕnŠE]Vó¿ù×F_ÿ×Z1W“r~ù7ÿúÃY¶çÿkíoÿ¯~÷çê¼þý¿¢ÿþþ_iÚïÆù²¸¬·¿?[äóúºZ,µ}mZ]–õ²×ðóç|Q\U«ºøÝ~s […binary output truncated…]
But then gzuncompress($stringCompressed); doesn't work: it fails, and the resulting test.xml file is empty (it contains only one row).
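One possible explanation (an assumption on my part): a .gz file is in gzip format (RFC 1952), while gzuncompress() expects the zlib format (RFC 1950), so it returns false on gzip data. gzdecode() (PHP >= 5.4) understands gzip, so the fetch step could be sketched like this, with the placeholder feed URL from above:

```php
<?php
// Sketch, assuming the failure is gzip vs. zlib format:
// gzdecode() handles gzip (.gz) data, unlike gzuncompress().
function fetchFeedXml(string $url): string
{
    $compressed = file_get_contents($url);  // download the .gz archive
    $xml = gzdecode($compressed);           // gzip-aware decompression
    if ($xml === false) {
        throw new RuntimeException("Could not decode $url");
    }
    return $xml;
}

// Usage (placeholder URL, not the real feed):
// file_put_contents("test.xml", fetchFeedXml("http://feeds.website.uk/5699.xml.gz"));
```

Note the decoded string is written to test.xml directly; passing the result of simplexml_load_string() to file_put_contents() would run into the string-cast problem from attempt 1. Saving the archive locally and reading it through the compress.zlib:// stream wrapper should work too.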