dongxun7962 2015-03-28 18:16

How do I avoid the "safety" over-quota panic when accessing the datastore? (billing enabled)

I deployed my site to Google App Engine (using Golang and the datastore, with 1000 records). Billing is enabled and a daily budget is established. The Quota Details page indicates everything is under quota. I am doing a urlfetch to obtain a tsv file that I use to build data entities in the datastore.

Two problems:

  1. Only 778 entities are created - the log indicates it is a long-running process, but it appears to terminate prematurely without an error message. The docs say this is normal.
  2. The second step involves creating a JSON file from the entities in the datastore. This process causes a "Panic: overquota", because the process is taking too long, I suppose.

How do I proceed? Should I divide the tsv datafile into several smaller files? Can I request "more time" so I don't go over the safety quotas?

Important to note is that the datastore section of the Developers Console is showing some problems: although my application has access to 778 datastore entities, the console reports only 484 entities of that kind, with a total of only 704 entities of all kinds (there are actually 933).

I've been working at this for a while and am wondering whether something is going on with the system, or whether there are things I can do to get my data entities set up properly. I also wish I could find more to read about safety quotas... and get the remote API working! Thanks!


1 Answer

  • doucheng4094 2015-03-29 15:22

    It really depends on where, within the App Engine platform, you are doing the processing for each of these use cases.

    For example, if you are performing the urlfetch and processing the file within a frontend instance, then you have 60 seconds to do all of this processing: App Engine requires that frontend instances respond to each request within 60 seconds.

    I'm assuming that this is what you are doing, since your request is being terminated. To get around this time restriction, you should move this kind of batch data processing to the task queue, where each task is required to complete within 10 minutes.
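    Inside a task you will also want to batch the writes themselves: a single datastore.PutMulti call on App Engine is capped at 500 entities, so 1000 tsv rows need at least two calls. Here is a minimal, self-contained sketch of the chunking step only (the chunk helper and the use of []int to stand in for entities are illustrative; real code would pass the keys and entity structs to datastore.PutMulti):

```go
package main

import "fmt"

// chunk splits rows into batches no larger than size, so each batch
// can be written with one datastore.PutMulti call (500-entity cap).
func chunk(rows []int, size int) [][]int {
	var batches [][]int
	for len(rows) > 0 {
		n := size
		if len(rows) < n {
			n = len(rows)
		}
		batches = append(batches, rows[:n])
		rows = rows[n:]
	}
	return batches
}

func main() {
	rows := make([]int, 1000) // stand-in for 1000 parsed tsv rows
	batches := chunk(rows, 500)
	fmt.Println(len(batches)) // 2 batches of 500
}
```

    Batching this way also makes it easier to checkpoint progress between PutMulti calls, so a task that is cut off can resume instead of silently stopping at 778 entities.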

    The same holds true for your reads. Either you need to look at how you're reading data from the datastore, or you need to batch it up with either a deferred task or a pipeline.

    Do you have a snippet that you can share of how you are composing your JSON?

