Current File: /usr/lib/python3/dist-packages/s3transfer/__pycache__/processpool.cpython-312.pyc
Speeds up S3 throughput by using processes
Getting Started
===============
The :class:`ProcessPoolDownloader` can be used to download a single file by
calling :meth:`ProcessPoolDownloader.download_file`:
.. code:: python

    from s3transfer.processpool import ProcessPoolDownloader

    with ProcessPoolDownloader() as downloader:
        downloader.download_file('mybucket', 'mykey', 'myfile')
This snippet downloads the S3 object located in the bucket ``mybucket`` at the
key ``mykey`` to the local file ``myfile``. Any errors encountered during the
transfer are not propagated. To determine if a transfer succeeded or
failed, use the `Futures`_ interface.
The :class:`ProcessPoolDownloader` can be used to download multiple files as
well:
.. code:: python

    from s3transfer.processpool import ProcessPoolDownloader

    with ProcessPoolDownloader() as downloader:
        downloader.download_file('mybucket', 'mykey', 'myfile')
        downloader.download_file('mybucket', 'myotherkey', 'myotherfile')
When running this snippet, the downloads of ``mykey`` and ``myotherkey``
happen in parallel. The first ``download_file`` call does not block the
second ``download_file`` call. The snippet blocks on exiting the context
manager until both downloads are complete.
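The same pattern extends to any number of objects. As a sketch (the key names
here are made up), the futures returned by ``download_file`` can be collected
and drained in a loop:

.. code:: python

    from s3transfer.processpool import ProcessPoolDownloader

    # Hypothetical keys; each download_file call returns immediately,
    # so all transfers are queued before any result() call blocks.
    keys = ['data/part-0', 'data/part-1', 'data/part-2']

    with ProcessPoolDownloader() as downloader:
        futures = [
            downloader.download_file('mybucket', key, key.split('/')[-1])
            for key in keys
        ]
        # result() surfaces any per-transfer exception; exiting the
        # context manager would also wait for the downloads to finish.
        for future in futures:
            future.result()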
Alternatively, the ``ProcessPoolDownloader`` can be instantiated
and explicitly shut down using :meth:`ProcessPoolDownloader.shutdown`:
.. code:: python

    from s3transfer.processpool import ProcessPoolDownloader

    downloader = ProcessPoolDownloader()
    downloader.download_file('mybucket', 'mykey', 'myfile')
    downloader.download_file('mybucket', 'myotherkey', 'myotherfile')
    downloader.shutdown()
For this code snippet, the call to ``shutdown`` blocks until both
downloads are complete.
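When managing the lifetime explicitly, a ``try``/``finally`` block is one way
to guarantee the pool is shut down even if queuing a download raises. A
minimal sketch, not a required pattern:

.. code:: python

    from s3transfer.processpool import ProcessPoolDownloader

    downloader = ProcessPoolDownloader()
    try:
        downloader.download_file('mybucket', 'mykey', 'myfile')
    finally:
        # Waits for queued downloads to finish and ends the worker processes.
        downloader.shutdown()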
Additional Parameters
=====================
Additional parameters can be provided to the ``download_file`` method:
* ``extra_args``: A dictionary containing any additional client arguments
  to include in the
  `GetObject <https://botocore.amazonaws.com/v1/documentation/api/latest/reference/services/s3.html#S3.Client.get_object>`_
  API request (the full set of accepted argument names is shown in the
  sketch after this list). For example:
  .. code:: python

      from s3transfer.processpool import ProcessPoolDownloader

      with ProcessPoolDownloader() as downloader:
          downloader.download_file(
              'mybucket', 'mykey', 'myfile',
              extra_args={'VersionId': 'myversion'})
* ``expected_size``: By default, the downloader makes a ``HeadObject``
  call to determine the size of the object. To opt out of this additional
  API call, provide the size of the object in bytes (a fuller example
  follows this list):
  .. code:: python

      from s3transfer.processpool import ProcessPoolDownloader

      MB = 1024 * 1024

      with ProcessPoolDownloader() as downloader:
          downloader.download_file(
              'mybucket', 'mykey', 'myfile', expected_size=2 * MB)
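Note that ``extra_args`` only accepts a fixed set of ``GetObject`` argument
names. The accepted names can be inspected at runtime through the
``ALLOWED_DOWNLOAD_ARGS`` constant that ``s3transfer`` ships:

.. code:: python

    from s3transfer.constants import ALLOWED_DOWNLOAD_ARGS

    # Prints the GetObject arguments accepted in extra_args,
    # e.g. 'VersionId', 'SSECustomerKey', 'RequestPayer', ...
    print(ALLOWED_DOWNLOAD_ARGS)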
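One common case where the object size is already known is when keys come from
a ``ListObjectsV2`` listing. The sketch below (the bucket name and prefix are
placeholders) reuses the listed ``Size`` so that no per-object ``HeadObject``
call is needed:

.. code:: python

    import boto3

    from s3transfer.processpool import ProcessPoolDownloader

    s3 = boto3.client('s3')
    listing = s3.list_objects_v2(Bucket='mybucket', Prefix='data/')

    with ProcessPoolDownloader() as downloader:
        for obj in listing.get('Contents', []):
            downloader.download_file(
                'mybucket', obj['Key'], obj['Key'].rsplit('/', 1)[-1],
                expected_size=obj['Size'])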
Futures
=======
When ``download_file`` is called, it immediately returns a
:class:`ProcessPoolTransferFuture`. The future can be used to poll the state
of a particular transfer. To get the result of the download,
call :meth:`ProcessPoolTransferFuture.result`. The method blocks
until the transfer completes, whether it succeeds or fails. For example:
.. code:: python

    from s3transfer.processpool import ProcessPoolDownloader

    with ProcessPoolDownloader() as downloader:
        future = downloader.download_file('mybucket', 'mykey', 'myfile')
        print(future.result())
If the download succeeds, the call to ``result`` returns ``None``:
.. code:: python

    None
If the download fails, the exception that caused the failure is raised. For
example, if ``mykey`` did not exist, the following error would be raised:
.. code:: python

    botocore.exceptions.ClientError: An error occurred (404) when calling the HeadObject operation: Not Found
.. note::

    :meth:`ProcessPoolTransferFuture.result` can only be called while the
    ``ProcessPoolDownloader`` is running (e.g. before calling ``shutdown`` or
    inside the context manager).
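Because ``result`` re-raises the transfer's exception, failures can be handled
with an ordinary ``try``/``except`` inside the context manager. A sketch
catching ``botocore``'s ``ClientError``:

.. code:: python

    from botocore.exceptions import ClientError

    from s3transfer.processpool import ProcessPoolDownloader

    with ProcessPoolDownloader() as downloader:
        future = downloader.download_file('mybucket', 'mykey', 'myfile')
        try:
            future.result()
        except ClientError as e:
            # e.g. the 404 shown above if 'mykey' does not exist
            print(f'Download failed: {e}')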
Process Pool Configuration
==========================
By default, the downloader has the following configuration options:
* ``multipart_threshold``: The threshold size for performing ranged downloads
  in bytes. By default, ranged downloads happen for S3 objects that are
  greater than or equal to 8 MB in size.
* ``multipart_chunksize``: The size of each ranged download in bytes. By
  default, the size of each ranged download is 8 MB.
* ``max_request_processes``: The maximum number of processes used to download
  S3 objects. By default, the maximum is 10 processes.
To change the default configuration, use the :class:`ProcessTransferConfig`:
.. code:: python

    from s3transfer.processpool import ProcessPoolDownloader
    from s3transfer.processpool import ProcessTransferConfig

    config = ProcessTransferConfig(
        multipart_threshold=64 * 1024 * 1024,  # 64 MB
        max_request_processes=50
    )
    downloader = ProcessPoolDownloader(config=config)
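To see how the two multipart settings interact: an object at or above
``multipart_threshold`` is fetched in roughly ``ceil(size /
multipart_chunksize)`` ranged requests. A back-of-the-envelope check with the
defaults (the 100 MB object size is only an example):

.. code:: python

    import math

    MB = 1024 * 1024
    multipart_threshold = 8 * MB  # default
    multipart_chunksize = 8 * MB  # default

    size = 100 * MB
    if size >= multipart_threshold:
        num_requests = math.ceil(size / multipart_chunksize)
    else:
        num_requests = 1  # small objects are fetched in one GetObject call
    print(num_requests)  # 13 ranged requests for a 100 MB object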
Client Configuration
====================
The process pool downloader creates ``botocore`` clients on your behalf. In
order to affect how the client is created, pass the keyword arguments that
would have been used in the :meth:`botocore.session.Session.create_client`
call:
.. code:: python

    from s3transfer.processpool import ProcessPoolDownloader

    downloader = ProcessPoolDownloader(
        client_kwargs={'region_name': 'us-west-2'})
This snippet ensures that all clients created by the ``ProcessPoolDownloader``
are using ``us-west-2`` as their region.
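``client_kwargs`` and ``config`` are independent and can be combined; for
example (the particular values here are arbitrary):

.. code:: python

    from s3transfer.processpool import ProcessPoolDownloader
    from s3transfer.processpool import ProcessTransferConfig

    downloader = ProcessPoolDownloader(
        client_kwargs={'region_name': 'us-west-2'},
        config=ProcessTransferConfig(max_request_processes=4),
    )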
(The remainder of the module is implementation detail, omitted here: the
``DownloadFileRequest`` and ``GetObjectJob`` namedtuples that describe queued
work, and an ``ignore_ctrl_c`` context manager that temporarily ignores
``SIGINT`` in worker processes.)