Chapter 24 Quiz: Connecting Python to Cloud Services
Instructions: Answer each question to the best of your ability. The answer key is at the end with explanations.
Section A: Multiple Choice
1. Which of the following is the correct way to pass AWS credentials to a boto3 client in a production script?
A) Hardcode the keys directly in the boto3.client() call
B) Store them in a .env file that is committed to version control
C) Read them from environment variables that are set outside the code
D) Write them in a comment so they are easy to find
2. What does the load_dotenv() function do?
A) Downloads environment variables from AWS
B) Reads a .env file and injects its key-value pairs into the process environment
C) Creates a new .env file with default values
D) Encrypts your credentials before using them
3. You call os.environ.get("AWS_SECRET_ACCESS_KEY") and the variable is not set. What does it return?
A) An empty string ""
B) False
C) None
D) It raises a KeyError
4. Sandra Chen needs to access a report stored in a private S3 bucket. She does not have an AWS account. What is the correct solution?
A) Make the S3 bucket publicly readable
B) Give Sandra an AWS account with full S3 access
C) Generate a presigned URL with an appropriate expiration time and send her the link
D) Email Sandra the AWS access keys so she can use the S3 console
5. Which statement about S3 "folders" is correct?
A) S3 has a true hierarchical folder system like your laptop's file system
B) S3 objects have "keys" that can contain slashes, creating an appearance of folders
C) You must create folders before uploading files to them
D) Folder names in S3 must be globally unique
6. In the list_objects_v2 API call, what does the Prefix parameter do?
A) Sorts results by object key prefix
B) Filters results to only objects whose keys begin with that string
C) Limits results to objects created with that prefix in their metadata
D) Sets the folder where new objects will be stored
7. What is the maximum time a presigned S3 URL can remain valid?
A) 1 hour
B) 24 hours
C) 7 days
D) 30 days
8. The Google Sheets API has a rate limit. Which of these strategies best avoids hitting it?
A) Add time.sleep(1) between every API call
B) Write data cell by cell to spread the load
C) Use worksheet.update("A1", all_data) to write all rows in a single API call
D) Create a new spreadsheet for each data write
9. What is the primary difference between a local SQLite database and a cloud PostgreSQL database from a Python code perspective?
A) The SQL syntax is completely different
B) SQLAlchemy cannot be used with cloud databases
C) Only the connection string changes — the query code is the same
D) Cloud databases require a different version of Python
10. What does pool_pre_ping=True do in a SQLAlchemy create_engine() call?
A) Sends a test ping to the database before each query to check connectivity
B) Limits the connection pool to one connection at a time
C) Enables SSL encryption for the database connection
D) Caches query results for faster subsequent reads
Section B: True or False
11. A .env file should be committed to your git repository so your teammates can use your credentials.
True / False
12. The load_dotenv() function will override environment variables that are already set in the system environment.
True / False
13. AWS Lambda functions can run for a maximum of 15 minutes before they are automatically terminated.
True / False
14. When using the Google Sheets API with a service account, you must share the specific Google Sheet with the service account's email address for the script to access it.
True / False
15. Serverless functions are called "serverless" because they run without any server hardware at all — the computation is distributed across end-user devices.
True / False
16. If you use os.environ.get("MY_KEY") and the variable is not set, your script will crash immediately with an error.
True / False
17. A Google Sheets worksheet created via the API behaves identically to one created manually — other users can view, filter, and export it normally.
True / False
Section C: Short Answer
18. Priya discovers that the AWS access key she has been using was accidentally included in a git commit last week. The repository is private. Describe the two immediate steps she should take, and explain why each is necessary.
(Write 3–5 sentences.)
19. Explain the difference between a presigned URL and making an S3 bucket publicly readable. When would you use each approach, and what are the security trade-offs?
(Write 4–6 sentences.)
20. Marcus Webb asks Priya: "Why does your script need both s3:PutObject and s3:GetObject permissions? Can't you just use one?" Write the explanation Priya would give, covering what each permission allows and why both are needed for the report pipeline.
(Write 3–5 sentences.)
Answer Key
1. C — Environment variables set outside the code are the correct approach. Hardcoding credentials (A) is dangerous. A .env committed to version control (B) exposes credentials. Comments (D) are still in the source code.
2. B — load_dotenv() reads a .env file and injects its key-value pairs into the process environment via os.environ. It does not communicate with AWS (A), create files (C), or encrypt data (D).
3. C — os.environ.get() returns None by default when the variable is not present. This is why validation is essential — None will cause confusing errors later if not caught. To raise an error immediately, use os.environ["KEY"] (which raises KeyError) or validate explicitly.
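Both behaviors from this answer can be demonstrated with nothing but the standard library. The variable names and the require_env helper below are illustrative, not part of any library:

```python
import os

# .get() returns None for a missing variable -- no exception is raised.
missing = os.environ.get("EXAMPLE_MISSING_VAR")
print(missing)  # None (assuming the variable is not set)

def require_env(name):
    """Fail fast with a clear message instead of letting None leak
    into later code, where it would cause a confusing TypeError."""
    value = os.environ.get(name)
    if value is None:
        raise RuntimeError(f"Required environment variable {name} is not set")
    return value

# Bracket access is the built-in fail-fast alternative: it raises
# KeyError immediately for a missing variable.
try:
    os.environ["EXAMPLE_MISSING_VAR"]
except KeyError:
    print("KeyError raised")
```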
4. C — A presigned URL grants temporary, time-limited access to a private object without requiring AWS credentials. Making the bucket public (A) exposes all objects to anyone forever. Giving Sandra an AWS account (B) is administrative overhead for a simple sharing need. Sharing access keys (D) is a serious security violation.
5. B — S3 is a flat object store. Keys can contain slashes, and tools like the AWS Console display these as folders for convenience, but no actual folder structure exists. You do not need to create folders before uploading (C), and slashes in keys are not constrained to global uniqueness (D).
6. B — The Prefix parameter filters the listing to only return objects whose keys start with the given string. This simulates "listing a folder" by using the folder path as the prefix.
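The flat-keys-with-prefixes model behind answers 5 and 6 can be illustrated without AWS at all. This sketch mimics what list_objects_v2 does with Prefix, using a plain dictionary as a stand-in for a bucket (the key names are made up):

```python
# A flat mapping of object keys to contents -- no real folders anywhere.
objects = {
    "reports/2024/january.csv": b"...",
    "reports/2024/february.csv": b"...",
    "reports/archive/old.csv": b"...",
    "logo.png": b"...",
}

def list_with_prefix(store, prefix):
    """Mimic list_objects_v2(Prefix=...): return keys starting with prefix."""
    return sorted(key for key in store if key.startswith(prefix))

# "Listing the reports/2024/ folder" is just a string-prefix filter.
print(list_with_prefix(objects, "reports/2024/"))
# ['reports/2024/february.csv', 'reports/2024/january.csv']
```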
7. C — AWS allows presigned URLs to be valid for up to 7 days (604,800 seconds). For most business use cases, 24–48 hours is appropriate to balance convenience and security.
8. C — Writing all rows in a single update() call is one API request regardless of row count. Cell-by-cell writes (B) would consume hundreds of requests for a 50-row table. Sleeping (A) wastes time unnecessarily. Creating new spreadsheets (D) does not help.
9. C — SQLAlchemy abstracts the database driver. Change sqlite:///local.db to postgresql://user:pass@host/db in the connection string and the same query code runs against either database. The SQL syntax (A) may need minor adjustments for complex queries, but basic CRUD is identical.
10. A — pool_pre_ping=True runs a lightweight "SELECT 1" test query before giving a connection to your code. If the connection has been dropped (cloud databases often drop idle connections), SQLAlchemy reconnects automatically rather than returning a broken connection.
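The pre-ping pattern can be sketched with the standard-library sqlite3 module. This is not SQLAlchemy's implementation, just the idea it automates: probe the connection with a cheap SELECT 1 and reconnect if the probe fails. A closed sqlite3 connection stands in for a cloud database dropping an idle connection:

```python
import sqlite3

def get_connection(conn, db_path):
    """Return a working connection, reconnecting if the old one is dead.

    SQLAlchemy's pool_pre_ping=True performs the equivalent check
    before handing a pooled connection to your code.
    """
    try:
        conn.execute("SELECT 1")  # lightweight liveness probe
        return conn
    except sqlite3.ProgrammingError:
        # Connection was closed underneath us -- open a fresh one.
        return sqlite3.connect(db_path)

conn = sqlite3.connect(":memory:")
conn.close()  # simulate the server dropping an idle connection
conn = get_connection(conn, ":memory:")
print(conn.execute("SELECT 1").fetchone())  # (1,)
```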
11. False — A .env file contains real credentials and must never be committed to version control. Commit a .env.example file with empty placeholder values instead.
12. False — load_dotenv() does NOT override existing environment variables by default. This is intentional: production servers set real credentials in their environment, and load_dotenv() will not clobber them. You can force overriding with load_dotenv(override=True).
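The non-overriding default can be illustrated with a stdlib-only sketch of the semantics. This hand-parses KEY=value lines from a string; the real python-dotenv library additionally handles files, quoting, and interpolation:

```python
import os

def load_env_text(text, override=False):
    """Minimal sketch of load_dotenv() semantics for KEY=value lines."""
    for line in text.splitlines():
        line = line.strip()
        if not line or line.startswith("#") or "=" not in line:
            continue
        key, _, value = line.partition("=")
        key, value = key.strip(), value.strip()
        if override:
            os.environ[key] = value
        else:
            os.environ.setdefault(key, value)  # never clobber existing vars

os.environ["EXAMPLE_DB_HOST"] = "prod.example.com"  # set by the "server"
load_env_text("EXAMPLE_DB_HOST=localhost")
print(os.environ["EXAMPLE_DB_HOST"])  # prod.example.com -- not overridden

load_env_text("EXAMPLE_DB_HOST=localhost", override=True)
print(os.environ["EXAMPLE_DB_HOST"])  # localhost
```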
13. True — AWS Lambda functions have a maximum execution time of 15 minutes (900 seconds). If your task requires more time, you need a different solution such as an EC2 instance, ECS container, or Step Functions workflow.
14. True — Service accounts have their own Google identity (an email address). The sheet must be explicitly shared with that email address, just as you would share it with a human colleague. Without this sharing step, the script receives a SpreadsheetNotFound error.
15. False — The computation runs on actual server hardware in AWS or Google data centers. "Serverless" means you do not manage the servers — the cloud provider handles provisioning, scaling, and maintenance. The hardware still exists.
16. False — os.environ.get("MY_KEY") returns None silently, not an error. The error will typically occur later, when code tries to use None as a string or pass it to an API call. This is why explicit validation (checking for None and raising a helpful error) is a recommended pattern.
17. True — A worksheet created via the API is a standard Google Sheets worksheet. Other users with access to the spreadsheet can view, filter, sort, export, and work with it exactly as they would any other sheet. The API is just an alternative creation method.
18. Priya should immediately: (1) Rotate the exposed credentials — go to the AWS IAM console, deactivate the old access key, and create a new one. This ensures the old key cannot be used even if someone found it. (2) Check AWS CloudTrail logs for any suspicious activity during the period the key was exposed, particularly for unusual API calls, new IAM users being created, or large data transfers. Rotating alone is not enough — she needs to determine whether the key was actually used by an unauthorized party. Deleting the commit from git history is also good practice but does not make the key safe to keep using, because the key could have been scraped before she noticed.
19. A presigned URL is a time-limited, cryptographically signed URL that grants temporary access to one specific private object. It expires after a set duration, cannot be used to access other objects, and requires no AWS credentials from the recipient. Making a bucket publicly readable grants permanent, unrestricted access to all objects in that bucket to any person or automated bot on the internet. Use presigned URLs when sharing specific reports with known recipients on a temporary basis — this is appropriate for business reports, client deliverables, and similar use cases. Use public buckets only for truly public, non-sensitive content like images on a public website where permanent availability is the goal. For business data, presigned URLs are almost always the right choice.
20. s3:PutObject allows writing (uploading) files to the bucket; without it, the script cannot save reports to S3. s3:GetObject allows reading (downloading) files from the bucket; without it, the presigned URLs the script generates will not work, because a presigned URL succeeds only if the credentials that signed it have GetObject permission on the object. Both permissions are scoped to the specific bucket only, following the least-privilege principle. The IAM policy grants exactly what is needed for the pipeline to function and nothing more, which limits the damage if the credentials are ever compromised.
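The least-privilege policy Priya describes could look like the following sketch, shown as the Python dict you would serialize to JSON for IAM. The bucket name is hypothetical:

```python
import json

# Hypothetical bucket name; both actions are scoped to that bucket's
# objects only, per the least-privilege principle from answer 20.
policy = {
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Action": ["s3:PutObject", "s3:GetObject"],
            "Resource": "arn:aws:s3:::example-report-bucket/*",
        }
    ],
}

print(json.dumps(policy, indent=2))
```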
End of Chapter 24 Quiz