Streaming a file to AWS S3 using Go

I want to stream a multipart/form-data (large) file upload directly to AWS S3 with as little memory and file disk footprint as possible. How can I achieve this? Resources online only explain how to upload a file and store it locally on the server.

5 Answers

You can do this using minio-go:

n, err := s3Client.PutObject("bucket-name", "objectName", object, size, "application/octet-stream")

PutObject() automatically does multipart upload internally. Example

dsf4s5787
dsf4s5787 I don't think this is the correct answer, because here we have no control over the parts, whereas the AWS API lets us upload each part separately and send initiate/complete/abort upload commands.
Replied 12 months ago

You can use the upload manager to stream the file and upload it. You can read the comments in the source code, and you can also configure parameters to set the part size, concurrency, and max upload parts. Below is sample code for reference.

package main

import (
    "fmt"
    "os"

    "github.com/aws/aws-sdk-go/aws/credentials"

    "github.com/aws/aws-sdk-go/aws"
    "github.com/aws/aws-sdk-go/aws/session"
    "github.com/aws/aws-sdk-go/service/s3/s3manager"
)

var filename = "file_name.zip"
var myBucket = "myBucket"
var myKey = "file_name.zip"
var accessKey = ""
var accessSecret = ""

func main() {
    var awsConfig *aws.Config
    if accessKey == "" || accessSecret == "" {
        //load default credentials
        awsConfig = &aws.Config{
            Region: aws.String("us-west-2"),
        }
    } else {
        awsConfig = &aws.Config{
            Region:      aws.String("us-west-2"),
            Credentials: credentials.NewStaticCredentials(accessKey, accessSecret, ""),
        }
    }

    // The session the S3 Uploader will use
    sess := session.Must(session.NewSession(awsConfig))

    // Create an uploader with the session and default options
    //uploader := s3manager.NewUploader(sess)

    // Create an uploader with the session and custom options
    uploader := s3manager.NewUploader(sess, func(u *s3manager.Uploader) {
        u.PartSize = 5 * 1024 * 1024 // The minimum/default allowed part size is 5MB
        u.Concurrency = 2            // default is 5
    })

    //open the file
    f, err := os.Open(filename)
    if err != nil {
        fmt.Printf("failed to open file %q, %v", filename, err)
        return
    }
    defer f.Close()

    // Upload the file to S3.
    result, err := uploader.Upload(&s3manager.UploadInput{
        Bucket: aws.String(myBucket),
        Key:    aws.String(myKey),
        Body:   f,
    })

    //in case it fails to upload
    if err != nil {
        fmt.Printf("failed to upload file, %v", err)
        return
    }
    fmt.Printf("file uploaded to %s\n", result.Location)
}
duanfan8699
duanfan8699 Yes, it will be uploaded in parts. And correct, the file will only appear after it has been fully uploaded.
Replied over a year ago
doushi1964
doushi1964 Thanks for your answer. If my file is smaller than 5 MB, will it still be streamed to S3? And as I understand it, the file will only show up on S3 after it has been fully uploaded?
Replied over a year ago



Another option is to mount the S3 bucket with goofys and then stream your writes to the mountpoint. goofys does not buffer the content locally so it will work fine with large files.



I didn't try it, but if I were you I'd try the multipart upload option.

You can read the docs on multipart upload.

Here is a Go example for multipart upload and multipart upload abort.

doubi4491
doubi4491 Hmm, it seems I can only use a ReadSeeker for the Body, which implies direct streaming is not possible.
Replied over 4 years ago



Amazon has an official Go package for uploading files to S3.

http://docs.aws.amazon.com/sdk-for-go/api/service/s3/s3manager/Uploader.html

They also have an example of uploading a file while zipping it on the fly.

https://github.com/aws/aws-sdk-go/wiki/common-examples#upload-an-arbitrarily-sized-stream-with-amazon-s3-upload-manager

Not sure if that'll help. Your question was kinda vague.

dongxi7704
dongxi7704 Outdated; the links are broken.
Replied about a year ago

C/C++学习指南全套教程

C/C++学习的全套教程,从基本语法,基本原理,到界面开发、网络开发、Linux开发、安全算法,应用尽用。由毕业于清华大学的业内人士执课,为C/C++编程爱好者的教程。

微信公众平台开发入门

本套课程的设计完全是为初学者量身打造,课程内容由浅入深,课程讲解通俗易懂,代码实现简洁清晰。通过本课程的学习,学员能够入门微信公众平台开发,能够胜任企业级的订阅号、服务号、企业号的应用开发工作。 通过本课程的学习,学员能够对微信公众平台有一个清晰的、系统性的认识。例如,公众号是什么,它有什么特点,它能做什么,怎么开发公众号。 其次,通过本课程的学习,学员能够掌握微信公众平台开发的方法、技术和应用实现。例如,开发者文档怎么看,开发环境怎么搭建,基本的消息交互如何实现,常用的方法技巧有哪些,真实应用怎么开发。

三个项目玩转深度学习(附1G源码)

从事大数据与人工智能开发与实践约十年,钱老师亲自见证了大数据行业的发展与人工智能的从冷到热。事实证明,计算机技术的发展,算力突破,海量数据,机器人技术等,开启了第四次工业革命的序章。深度学习图像分类一直是人工智能的经典任务,是智慧零售、安防、无人驾驶等机器视觉应用领域的核心技术之一,掌握图像分类技术是机器视觉学习的重中之重。针对现有线上学习的特点与实际需求,我们开发了人工智能案例实战系列课程。打造:以项目案例实践为驱动的课程学习方式,覆盖了智能零售,智慧交通等常见领域,通过基础学习、项目案例实践、社群答疑,三维立体的方式,打造最好的学习效果。

2021考研数学张宇基础30讲.pdf

张宇:博士,全国著名考研数学辅导专家,教育部“国家精品课程建设骨干教师”,全国畅销书《张宇高等数学18讲》《张宇线性代数9讲》《张宇概率论与数理统计9讲》《张宇考研数学题源探析经典1000题》《张宇考

专为程序员设计的数学课

<p> 限时福利限时福利,<span>15000+程序员的选择!</span> </p> <p> 购课后添加学习助手(微信号:csdn590),按提示消息领取编程大礼包!并获取讲师答疑服务! </p> <p> <br> </p> <p> 套餐中一共包含5门程序员必学的数学课程(共47讲) </p> <p> 课程1:《零基础入门微积分》 </p> <p> 课程2:《数理统计与概率论》 </p> <p> 课程3:《代码学习线性代数》 </p> <p> 课程4:《数据处理的最优化》 </p> <p> 课程5:《马尔可夫随机过程》 </p> <p> <br> </p> <p> 哪些人适合学习这门课程? </p> <p> 1)大学生,平时只学习了数学理论,并未接触如何应用数学解决编程问题; </p> <p> 2)对算法、数据结构掌握程度薄弱的人,数学可以让你更好的理解算法、数据结构原理及应用; </p> <p> 3)看不懂大牛代码设计思想的人,因为所有的程序设计底层逻辑都是数学; </p> <p> 4)想学习新技术,如:人工智能、机器学习、深度学习等,这门课程是你的必修课程; </p> <p> 5)想修炼更好的编程内功,在遇到问题时可以灵活的应用数学思维解决问题。 </p> <p> <br> </p> <p> 在这门「专为程序员设计的数学课」系列课中,我们保证你能收获到这些:<br> <br> <span> </span> </p> <p class="ql-long-24357476"> <span class="ql-author-24357476">①价值300元编程课程大礼包</span> </p> <p class="ql-long-24357476"> <span class="ql-author-24357476">②应用数学优化代码的实操方法</span> </p> <p class="ql-long-24357476"> <span class="ql-author-24357476">③数学理论在编程实战中的应用</span> </p> <p class="ql-long-24357476"> <span class="ql-author-24357476">④程序员必学的5大数学知识</span> </p> <p class="ql-long-24357476"> <span class="ql-author-24357476">⑤人工智能领域必修数学课</span> </p> <p> <br> 备注:此课程只讲程序员所需要的数学,即使你数学基础薄弱,也能听懂,只需要初中的数学知识就足矣。<br> <br> 如何听课? </p> <p> 1、登录CSDN学院 APP 在我的课程中进行学习; </p> <p> 2、登录CSDN学院官网。 </p> <p> <br> </p> <p> 购课后如何领取免费赠送的编程大礼包和加入答疑群? </p> <p> 购课后,添加助教微信:<span> csdn590</span>,按提示领取编程大礼包,或观看付费视频的第一节内容扫码进群答疑交流! </p> <p> <img src="https://img-bss.csdn.net/201912251155398753.jpg" alt=""> </p>

DDR5_Draft_Spec_Rev05c.pdf

DDR5 spec

Java面试史上最全的JAVA专业术语面试100问 (前1-50)

前言: 说在前面, 面试题是根据一些朋友去面试提供的,再就是从网上整理了一些。 先更新50道,下一波吧后面的也更出来。 求赞求关注!! 废话也不多说,现在就来看看有哪些面试题 1、面向对象的特点有哪些? 抽象、继承、封装、多态。 2、接口和抽象类有什么联系和区别? 3、重载和重写有什么区别? 4、java有哪些基本数据类型? 5、数组有没有length()方法?String有没有length()方法? 数组没有length()方法,它有length属性。 String有length()方法。 集合求长度用

网络工程师小白入门--【思科CCNA、华为HCNA等网络工程师认证】

本课程适合CCNA或HCNA网络小白同志,高手请绕道,可以直接学习进价课程。通过本预科课程的学习,为学习网络工程师、思科CCNA、华为HCNA这些认证打下坚实的基础! 重要!思科认证2020年2月24日起,已启用新版认证和考试,包括题库都会更新,由于疫情原因,请关注官网和本地考点信息。题库网络上很容易下载到。

C/C++跨平台研发从基础到高阶实战系列套餐

一 专题从基础的C语言核心到c++ 和stl完成基础强化; 二 再到数据结构,设计模式完成专业计算机技能强化; 三 通过跨平台网络编程,linux编程,qt界面编程,mfc编程,windows编程,c++与lua联合编程来完成应用强化 四 最后通过基于ffmpeg的音视频播放器,直播推流,屏幕录像,

Python界面版学生管理系统

前不久上传了一个控制台版本的学生管理系统,这个是Python界面版学生管理系统,这个是使用pycharm开发的一个有界面的学生管理系统,基本的增删改查,里面又演示视频和完整代码,有需要的伙伴可以自行下

2019数学建模A题高压油管的压力控制 省一论文即代码

2019数学建模A题高压油管的压力控制省一完整论文即详细C++和Matlab代码,希望对同学们有所帮助

4小时玩转微信小程序——基础入门与微信支付实战

这是一个门针对零基础学员学习微信小程序开发的视频教学课程。课程采用腾讯官方文档作为教程的唯一技术资料来源。杜绝网络上质量良莠不齐的资料给学员学习带来的障碍。 视频课程按照开发工具的下载、安装、使用、程序结构、视图层、逻辑层、微信小程序等几个部分组织课程,详细讲解整个小程序的开发过程

相关热词 c#中如何设置提交按钮 c#帮助怎么用 c# 读取合并单元格的值 c#带阻程序 c# 替换span内容 c# rpc c#控制台点阵字输出 c#do while循环 c#调用dll多线程 c#找出两个集合不同的
立即提问