DevSetGo Library
Description | Common functions for Python applications. This is to increase reusability and limit rewriting the same functions in multiple applications. It also allows for defects to be addressed quickly and then be propagated across applications. |
Author(s) | Mike Ryan |
Repository | https://github.com/devsetgo/dsg_lib |
Copyright | Copyright © 2016 - 2024 Mike Ryan |
DevSetGo Common Library¶
devsetgo_lib is a versatile library designed to provide common functions for Python applications. Its main goal is to increase reusability and reduce the need to rewrite the same functions across multiple applications. This also allows for quick defect resolution and propagation of fixes across all dependent projects.
Read the Full Documentation here.
Key Features¶
Common Functions:¶
- File Operations:
  - CSV, JSON, and Text File Functions: Create, read, write, and manipulate various file types with ease.
  - Folder Functions: Create and remove directories, list directory contents, and manage file system operations efficiently.
- Logging: Comprehensive logging setup using the Loguru Library. Provides extensive customization options for log configuration, including log rotation, retention, and formatting. Includes improvements for multiprocessing environments to ensure log messages are handled correctly across multiple processes.
- Calendar Functions: Convert between month names and numbers seamlessly.
- Pattern Matching: Powerful tools for searching patterns in text using regular expressions.
FastAPI Endpoints:¶
- Pre-built endpoints for system health checks, status, and uptime monitoring.
- Functions to generate HTTP response codes easily.
Async Database:¶
- Configuration and management of asynchronous database sessions.
- CRUD operations with async support.
Installation¶
To install devsetgo_lib, use pip:
pip install devsetgo-lib
# For async database setup with SQLite or PostgreSQL
pip install devsetgo-lib[sqlite]
pip install devsetgo-lib[postgres]
# Experimental support for other databases
pip install devsetgo-lib[oracle]
pip install devsetgo-lib[mssql]
pip install devsetgo-lib[mysql]
# For adding FastAPI endpoints
pip install devsetgo-lib[fastapi]
# Install everything
pip install devsetgo-lib[all]
Usage¶
Here's a quick example to demonstrate how you can use some of the key features of devsetgo_lib:
import logging

from devsetgo_lib.common_functions import file_functions, logging_config, patterns, calendar_functions
# File Operations
file_functions.create_sample_files("example", 100)
content = file_functions.read_from_file("example.csv")
print(content)
# Logging
logging_config.config_log(logging_directory='logs', log_name='app.log', logging_level='DEBUG')
logger = logging.getLogger('app_logger')
logger.info("This is an info message")
# Pattern Matching
text = "Hello, my name is 'John Doe' and I live in 'New York'."
results = patterns.pattern_between_two_char(text, "'", "'")
print(results)
# Calendar Functions
print(calendar_functions.get_month(1)) # Output: 'January'
print(calendar_functions.get_month_number('January')) # Output: 1
For detailed documentation on each module and function, please refer to the official documentation.
Contributing¶
We welcome contributions! Please see our contributing guidelines for more details.
License¶
This project is licensed under the MIT License. See the LICENSE file for more details.
Contact¶
For any questions or issues, please open an issue on GitHub or contact us at devsetgo@example.com.
Quick Start¶
Install¶
pip install devsetgo-lib
# Async database setup
pip install devsetgo-lib[sqlite]
pip install devsetgo-lib[postgres]
# Consider these experimental and untested
pip install devsetgo-lib[oracle]
pip install devsetgo-lib[mssql]
pip install devsetgo-lib[mysql]
# For adding FastAPI endpoints
pip install devsetgo-lib[fastapi]
# Install everything
pip install devsetgo-lib[all]
See the documentation for more examples of library use.
Common Functions ↵
Reference¶
dsg_lib.common_functions.logging_config
¶
This module provides a comprehensive logging setup using the loguru library, facilitating easy logging management for Python applications.
The config_log function, central to this module, allows for extensive customization of logging behavior. It supports specifying the logging directory, log file name, logging level, and controls for log rotation, retention, and formatting, among other features. Additionally, it offers advanced options like backtrace and diagnose for in-depth debugging, and the ability to append the application name to the log file for clearer identification.
Usage example:
from dsg_lib.common_functions.logging_config import config_log
from loguru import logger
config_log(
logging_directory='logs', # Directory for storing logs
log_name='log', # Base name for log files
logging_level='DEBUG', # Minimum logging level
log_rotation='100 MB', # Size threshold for log rotation
log_retention='30 days', # Duration to retain old log files
enqueue=True, # Enqueue log messages
)
# Example log messages
logger.debug("This is a debug message")
logger.info("This is an info message")
logger.error("This is an error message")
logger.warning("This is a warning message")
logger.critical("This is a critical message")
Todo
- Add support for additional logging handlers.
- Implement asynchronous logging.
Date Created
2021/07/16
Date Updated
2024/07/27
License
MIT
SafeFileSink
¶
A class to handle safe file logging with rotation and retention policies.
This class provides mechanisms to manage log files by rotating them based on size and retaining them for a specified duration. It also supports optional compression of log files.
Attributes:
Name | Type | Description
---|---|---
path | str | The path to the log file.
rotation | int | The size threshold for log rotation in bytes.
retention | timedelta | The duration to retain old log files.
compression | str | The compression method to use for old log files.

Methods:
Name | Description
---|---
parse_size | Parses a size string (e.g., '100MB') and returns the size in bytes.
parse_duration | Parses a duration string (e.g., '7 days') and returns a timedelta object.
Example
safe_file_sink = SafeFileSink(
    path='logs/app.log',
    rotation='100 MB',
    retention='30 days',
    compression='zip'
)
# This will set up a log file at 'logs/app.log' with rotation at 100 MB,
# retention for 30 days, and compression using zip.
Source code in dsg_lib/common_functions/logging_config.py
__call__(message)
¶
Handles the logging of a message, including writing, rotating, and applying retention policies.
Parameters:
Name | Type | Description | Default
---|---|---|---
message | | The log message to be written. | required
This method ensures thread-safe logging by acquiring a lock before writing the message, rotating the logs if necessary, and applying the retention policy to remove old log files.
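The write-then-rotate-then-retain sequence under a single lock can be sketched independently of the library (a minimal illustration of the pattern, not the actual SafeFileSink code; the injected callables are hypothetical placeholders):

```python
import threading

class LockingSink:
    """Minimal sketch of a thread-safe sink: every message is written,
    then rotation and retention run, all while holding one lock."""

    def __init__(self, write, rotate, apply_retention):
        self._lock = threading.Lock()
        self._write = write
        self._rotate = rotate
        self._apply_retention = apply_retention

    def __call__(self, message):
        # The lock prevents concurrent writers from interleaving a write
        # with a rotation or retention pass.
        with self._lock:
            self._write(message)
            self._rotate()
            self._apply_retention()
```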
Source code in dsg_lib/common_functions/logging_config.py
apply_retention()
¶
Applies the retention policy to remove old log files.
This method iterates through the log files in the directory of the current log file. It checks the modification time of each log file and removes those that are older than the specified retention period.
Returns:
Type | Description
---|---
None |

Raises:
Type | Description
---|---
| If there is an error removing a log file.
Source code in dsg_lib/common_functions/logging_config.py
parse_duration(duration_str)
staticmethod
¶
Parses a duration string and returns a timedelta object.
Parameters:
Name | Type | Description | Default
---|---|---|---
duration_str | str | The duration string (e.g., '7 days'). | required

Returns:
Name | Type | Description
---|---|---
timedelta | timedelta | The duration as a timedelta object.
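A stand-alone sketch of duration parsing along these lines (assuming the 'NUMBER unit' format shown, e.g. '7 days'; not the library's actual code):

```python
from datetime import timedelta

def parse_duration(duration_str: str) -> timedelta:
    """Parse strings like '7 days' or '12 hours' into a timedelta."""
    value, unit = duration_str.strip().split()
    unit = unit.lower().rstrip('s') + 's'  # normalize 'day' -> 'days'
    # timedelta accepts keyword arguments such as days, hours, minutes
    return timedelta(**{unit: float(value)})
```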
Source code in dsg_lib/common_functions/logging_config.py
parse_size(size_str)
staticmethod
¶
Parses a size string and returns the size in bytes.
Parameters:
Name | Type | Description | Default
---|---|---|---
size_str | str | The size string (e.g., '100MB'). | required

Returns:
Name | Type | Description
---|---|---
int | int | The size in bytes.
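A stand-alone sketch of size parsing along these lines (assuming binary units and the formats shown in this document, e.g. '100MB' or '100 MB'; not the library's actual code):

```python
def parse_size(size_str: str) -> int:
    """Parse strings like '100MB' or '1 GB' into a byte count."""
    units = {'B': 1, 'KB': 1024, 'MB': 1024 ** 2, 'GB': 1024 ** 3}
    normalized = size_str.strip().upper().replace(' ', '')
    # Check longer suffixes first so 'KB' is not mistaken for 'B'.
    for suffix in ('GB', 'MB', 'KB', 'B'):
        if normalized.endswith(suffix):
            return int(float(normalized[: -len(suffix)]) * units[suffix])
    raise ValueError(f"Unrecognized size string: {size_str!r}")
```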
Source code in dsg_lib/common_functions/logging_config.py
rotate_logs()
¶
Rotates the log file if it exceeds the specified rotation size.
This method checks the size of the current log file. If the file size exceeds the specified rotation size, it renames the current log file by appending a timestamp to its name. Optionally, it compresses the rotated log file using the specified compression method and removes the original uncompressed file.
Returns:
Type | Description
---|---
None |

Raises:
Type | Description
---|---
| If there is an error renaming or compressing the log file.
Source code in dsg_lib/common_functions/logging_config.py
write_message(message)
¶
Writes a log message to the log file.
Parameters:
Name | Type | Description | Default
---|---|---|---
message | | The log message to be written. | required
This method opens the log file in append mode and writes the message to it.
Source code in dsg_lib/common_functions/logging_config.py
config_log(logging_directory='log', log_name='log', logging_level='INFO', log_rotation='100 MB', log_retention='30 days', log_backtrace=False, log_format=None, log_serializer=False, log_diagnose=False, app_name=None, append_app_name=False, enqueue=True, intercept_standard_logging=True, compression='zip')
¶
Configures the logging settings for the application.
This function sets up the logging configuration, including the log directory, log file name, logging level, log rotation, retention policies, and other optional settings.
Parameters:
Name | Type | Description | Default
---|---|---|---
logging_directory | str | The directory where log files will be stored. Defaults to "log". | 'log'
log_name | str | The base name of the log file. Defaults to "log". | 'log'
logging_level | str | The logging level (e.g., "INFO", "DEBUG"). Defaults to "INFO". | 'INFO'
log_rotation | str | The size threshold for log rotation (e.g., "100 MB"). Defaults to "100 MB". | '100 MB'
log_retention | str | The duration to retain old log files (e.g., "30 days"). Defaults to "30 days". | '30 days'
log_backtrace | bool | Whether to include backtrace information in logs. Defaults to False. | False
log_format | str | The format string for log messages. Defaults to a predefined format if not provided. | None
log_serializer | bool | Whether to serialize log messages. Defaults to False. | False
log_diagnose | bool | Whether to include diagnostic information in logs. Defaults to False. | False
app_name | str | The name of the application. Defaults to None. | None
append_app_name | bool | Whether to append the application name to the log file name. Defaults to False. | False
enqueue | bool | Whether to enqueue log messages for asynchronous processing. Defaults to True. | True
intercept_standard_logging | bool | Whether to intercept standard logging calls. Defaults to True. | True
compression | str | The compression method for rotated log files (e.g., "zip"). Defaults to 'zip'. | 'zip'

Returns:
Type | Description
---|---
None |
Example
config_log(
    logging_directory='logs',
    log_name='app_log',
    logging_level='DEBUG',
    log_rotation='50 MB',
    log_retention='7 days',
    log_backtrace=True,
    log_format='{time} - {level} - {message}',
    log_serializer=True,
    log_diagnose=True,
    app_name='MyApp',
    append_app_name=True,
    enqueue=False,
    intercept_standard_logging=False,
    compression='gz'
)
This will configure the logging settings with the specified parameters, setting up a log file at 'logs/app_log' with rotation at 50 MB, retention for 7 days, and other specified options.
Source code in dsg_lib/common_functions/logging_config.py
Reference¶
dsg_lib.common_functions.file_functions
¶
file_functions.py
This module provides a function to delete a file with a specified name from a specified directory.
Functions:
Name | Description
---|---
delete_file(file_name: str) -> str | Deletes a file with the specified file name from the specified directory. Args: file_name (str): The name of the file to be deleted. Returns: str: A string indicating that the file has been deleted. Raises: TypeError: If the file name is not a string. ValueError: If the file name contains a forward slash or backslash, or if the file type is not supported. FileNotFoundError: If the file does not exist.
Example:
from dsg_lib.common_functions import file_functions
file_functions.delete_file("test.csv")
# Outputs: 'complete'
Author: Mike Ryan Date: 2024/05/16 License: MIT
create_sample_files(file_name, sample_size)
¶
Create sample CSV and JSON files with random data.
Parameters:
Name | Type | Description | Default
---|---|---|---
file_name | str | The base name for the sample files (without extension). | required
sample_size | int | The number of rows to generate for the sample files. | required

Returns:
Type | Description
---|---
None | None

Raises:
Type | Description
---|---
| If an error occurs while creating the sample files.
Example:

from dsg_lib.common_functions import file_functions
file_functions.create_sample_files("test", 100)
# Creates 'test.csv' and 'test.json' each with 100 rows of random data
Source code in dsg_lib/common_functions/file_functions.py
delete_file(file_name)
¶
Deletes a file with the specified file name from the specified directory. The file type is determined by the file extension.
Parameters:
Name | Type | Description | Default
---|---|---|---
directory_to_files | str | The directory where the file is located. | required
file_name | str | The name of the file to be deleted. | required

Returns:
Name | Type | Description
---|---|---
str | str | A message indicating whether the file has been deleted successfully or an error occurred.

Raises:
Type | Description
---|---
TypeError | If the directory or file name is not a string.
ValueError | If the file type is not supported.
FileNotFoundError | If the file does not exist.
Example:
from dsg_lib.common_functions import file_functions
file_functions.delete_file("test.csv")
# Outputs: 'File deleted successfully'
Source code in dsg_lib/common_functions/file_functions.py
generate_random_date()
¶
Generate a random datetime string in the format yyyy-mm-dd hh:mm:ss.ffffff.
Returns:
Name | Type | Description
---|---|---
str | str | A randomly generated datetime string.
Example:
from dsg_lib.common_functions import file_functions
random_date = file_functions.generate_random_date()
# Returns: '1992-03-15 10:30:45.123456'
Source code in dsg_lib/common_functions/file_functions.py
open_csv(file_name, delimiter=',', quote_level='minimal', skip_initial_space=True)
¶
Opens a CSV file with the specified file name and returns its contents as a list of dictionaries.
Parameters:
Name | Type | Description | Default
---|---|---|---
file_name | str | The name of the file to open. Should include the '.csv' extension. | required
delimiter | str | The character used to separate fields in the CSV file. Defaults to ','. | ','
quote_level | str | The level of quoting to apply when reading the CSV file. Defaults to 'minimal'. | 'minimal'
skip_initial_space | bool | Whether to skip initial whitespace in the CSV file. Defaults to True. | True

Returns:
Name | Type | Description
---|---|---
list | list | The contents of the CSV file as a list of dictionaries. Each dictionary represents a row in the CSV file, where the keys are column names and the values are the data for those columns.

Raises:
Type | Description
---|---
ValueError | If quote_level is not a valid level.
FileNotFoundError | If the file does not exist.
Example:
from dsg_lib.common_functions import file_functions
data = file_functions.open_csv("test.csv", delimiter=";", quote_level="all", skip_initial_space=False)
# Returns: [{'column1': 'value1', 'column2': 'value2'}]
Source code in dsg_lib/common_functions/file_functions.py
open_json(file_name)
¶
Open a JSON file and load its contents into a dictionary.
Parameters:
Name | Type | Description | Default
---|---|---|---
file_name | str | The name of the JSON file to open. | required

Returns:
Name | Type | Description
---|---|---
dict | dict | The contents of the JSON file as a dictionary.

Raises:
Type | Description
---|---
TypeError | If the file name is not a string.
FileNotFoundError | If the file does not exist.
Source code in dsg_lib/common_functions/file_functions.py
open_text(file_name)
¶
Opens a text file with the specified file name and returns its contents as a string.
Parameters:
Name | Type | Description | Default
---|---|---|---
file_name | str | The name of the file to open. Should include the '.txt' extension. | required

Returns:
Name | Type | Description
---|---|---
str | str | The contents of the text file as a string.

Raises:
Type | Description
---|---
ValueError | If the file name contains a forward slash.
FileNotFoundError | If the file does not exist.
Example:
from dsg_lib.common_functions import file_functions
data = file_functions.open_text("test.txt")
# Returns: 'This is a test text file.'
Source code in dsg_lib/common_functions/file_functions.py
save_csv(file_name, data, root_folder=None, delimiter=',', quotechar='"')
¶
Saves a list of dictionaries as a CSV file with the specified file name in the specified directory. Each dictionary in the list should represent a row in the CSV file.
Parameters:
Name | Type | Description | Default
---|---|---|---
file_name | str | The name of the file to save the data in. Should include the '.csv' extension. | required
data | list | The data to be saved. Each dictionary in the list represents a row in the CSV file. | required
root_folder | str, optional | The root directory where the file will be saved. | None
delimiter | str | The character used to separate fields in the CSV file. Defaults to ','. | ','
quotechar | str | The character used to quote fields in the CSV file. Defaults to '"'. | '"'

Returns:
Name | Type | Description
---|---|---
str | str | A message indicating whether the file has been saved successfully or an error occurred.

Raises:
Type | Description
---|---
TypeError | If the data is not a list, or the file name, delimiter, or quotechar is not a string.
ValueError | If the file name does not end with '.csv'.
Example:

from dsg_lib.common_functions import file_functions
data = [{"column1": "value1", "column2": "value2"}]
file_functions.save_csv("test.csv", data, "/path/to/directory", delimiter=";", quotechar="'")
# Saves data to '/path/to/directory/test.csv'
Source code in dsg_lib/common_functions/file_functions.py
save_json(file_name, data, root_folder=None)
¶
Saves a dictionary or a list as a JSON file with the specified file name in the specified directory.
Parameters:
Name | Type | Description | Default
---|---|---|---
file_name | str | The name of the file to save the data in. Should include the '.json' extension. | required
data | list or dict | The data to be saved. | required
root_folder | str | The root directory where the file will be saved. | None

Returns:
Name | Type | Description
---|---|---
str | str | A message indicating whether the file has been saved successfully or an error occurred.

Raises:
Type | Description
---|---
TypeError | If the data is not a list or a dictionary, or the file name or directory is not a string.
ValueError | If the file name contains a forward slash or backslash.
Example:
from dsg_lib.common_functions import file_functions
data = {"key": "value"}
file_functions.save_json("test.json", data, "/path/to/directory")
# Saves data to '/path/to/directory/test.json'
Source code in dsg_lib/common_functions/file_functions.py
save_text(file_name, data, root_folder=None)
¶
Saves a string of text to a file with the specified file name in the specified directory.
Parameters:
Name | Type | Description | Default
---|---|---|---
file_name | str | The name of the file to save the data in. Should not include the '.txt' extension. | required
data | str | The text data to be saved. | required
root_folder | str | The root directory where the file will be saved. | None

Returns:
Name | Type | Description
---|---|---
str | str | A message indicating whether the file has been saved successfully or an error occurred.

Raises:
Type | Description
---|---
ValueError | If the file name contains a forward slash or backslash.
FileNotFoundError | If the directory does not exist.
Example:
from dsg_lib.common_functions import file_functions
file_functions.save_text("test", "This is a test text file.", "/path/to/directory")
# Saves data to '/path/to/directory/test.txt'
Source code in dsg_lib/common_functions/file_functions.py
Reference¶
dsg_lib.common_functions.folder_functions
¶
This module contains functions for working with directories and files.
Functions:
Name | Description
---|---
last_data_files_changed | Get the last modified file in a directory.
get_directory_list | Get a list of directories in the specified directory.
make_folder | Make a folder in a specific directory.
remove_folder | Remove a folder from the specified directory.
Example:
from dsg_lib.common_functions import folder_functions

# Get the last modified file in a directory
time_stamp, file_path = folder_functions.last_data_files_changed("/path/to/directory")
# Returns: (datetime.datetime(2022, 1, 1, 12, 0, 0), '/path/to/directory/test.txt')

# Get a list of directories in the specified directory
directories = folder_functions.get_directory_list("/path/to/directory")
# Returns: ['/path/to/directory/dir1', '/path/to/directory/dir2']

# Make a folder in a specific directory
folder_functions.make_folder("/path/to/directory/new_folder")
# Creates a new folder at '/path/to/directory/new_folder'

# Remove a folder from the specified directory
folder_functions.remove_folder("/path/to/directory/old_folder")
# Removes the folder at '/path/to/directory/old_folder'
Author: Mike Ryan Date: 2024/05/16 License: MIT
get_directory_list(file_directory)
¶
Get a list of directories in the specified directory.
Parameters:
Name | Type | Description | Default
---|---|---|---
file_directory | str | The path of the directory to check. | required

Returns:
Type | Description
---|---
List[str] | A list of directories in the specified directory.

Raises:
Type | Description
---|---
FileNotFoundError | If the directory does not exist.
Example:

from dsg_lib.common_functions import folder_functions
directories = folder_functions.get_directory_list("/path/to/directory")
# Returns: ['/path/to/directory/dir1', '/path/to/directory/dir2']
Source code in dsg_lib/common_functions/folder_functions.py
last_data_files_changed(directory_path)
¶
Get the last modified file in a directory and return its modification time and path.
Parameters:
Name | Type | Description | Default
---|---|---|---
directory_path | str | The path of the directory to check. | required

Returns:
Type | Description
---|---
Tuple[datetime, str] | A tuple containing the modification time and path of the last modified file.

Raises:
Type | Description
---|---
FileNotFoundError | If the directory does not exist.
Example:
from dsg_lib.common_functions import folder_functions
time_stamp, file_path = folder_functions.last_data_files_changed("/path/to/directory")
# Returns: (datetime.datetime(2022, 1, 1, 12, 0, 0), '/path/to/directory/test.txt')
Source code in dsg_lib/common_functions/folder_functions.py
remove_folder(file_directory)
¶
Remove a folder from the specified directory.
Parameters:
Name | Type | Description | Default
---|---|---|---
file_directory | str | The directory containing the folder to be removed. | required

Returns:
Type | Description
---|---
None | None.

Raises:
Type | Description
---|---
FileNotFoundError | If the specified directory does not exist.
OSError | If an error occurs while removing the folder.
Example:
from dsg_lib.common_functions import folder_functions
folder_functions.remove_folder("/path/to/directory/old_folder")
# Removes the folder at '/path/to/directory/old_folder'
Source code in dsg_lib/common_functions/folder_functions.py
Reference¶
dsg_lib.common_functions.patterns
¶
This module contains functions for pattern searching in text using regular expressions.
The main function in this module is pattern_between_two_char, which searches for all patterns between two characters in a given string. The function uses Python's built-in re module for regex searching and the loguru module for logging.
Functions:
Name | Description
---|---
pattern_between_two_char(text_string: str, left_characters: str, right_characters: str) -> dict | Searches for all patterns between two characters (left and right) in a given string using regular expressions.
Example
from dsg_lib.common_functions import patterns

text = "Hello, my name is 'John Doe' and I live in 'New York'."
left_char = "'"
right_char = "'"

results = patterns.pattern_between_two_char(text, left_char, right_char)
print(results)

This will output:

{
    'found': ['John Doe', 'New York'],
    'matched_found': 2,
    'pattern_parameters': {
        'left_character': "'",
        'right_character': "'",
        'regex_pattern': "'(.+?)'",
        'text_string': "Hello, my name is 'John Doe' and I live in 'New York'."
    }
}
Author: Mike Ryan Date: 2024/05/16 License: MIT
pattern_between_two_char(text_string, left_characters, right_characters)
¶
Searches for all patterns between two characters (left and right) in a given string using regular expressions.
This function takes a string and two characters as input, and returns a dictionary containing all patterns found between the two characters in the string. The dictionary also includes the number of matches found and the regex pattern used for searching.
The function uses Python's built-in re module for regex searching and the loguru module for logging.
Parameters:
Name | Type | Description | Default
---|---|---|---
text_string | str | The string in which to search for patterns. | required
left_characters | str | The character(s) that appear(s) immediately to the left of the desired pattern. | required
right_characters | str | The character(s) that appear(s) immediately to the right of the desired pattern. | required
Returns:
Name | Type | Description
---|---|---
dict | dict | A dictionary with the following keys: "found": a list of strings containing all patterns found; "matched_found": the number of patterns found; "pattern_parameters": a dictionary with the keys "left_character" (the escaped left character string used to build the regex pattern), "right_character" (the escaped right character string used to build the regex pattern), "regex_pattern" (the final regex pattern used for searching), and "text_string" (the escaped input string used for searching).
Example
from dsg_lib.common_functions import patterns

text = "Hello, my name is 'John Doe' and I live in 'New York'."
left_char = "'"
right_char = "'"

results = patterns.pattern_between_two_char(text, left_char, right_char)
print(results)

This will output:

{
    'found': ['John Doe', 'New York'],
    'matched_found': 2,
    'pattern_parameters': {
        'left_character': "'",
        'right_character': "'",
        'regex_pattern': "'(.+?)'",
        'text_string': "Hello, my name is 'John Doe' and I live in 'New York'."
    }
}
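Based on the documented inputs and output, an equivalent function can be sketched like this (an illustration, not the library's source; it omits the loguru logging the real function performs and does not escape the returned text string):

```python
import re

def pattern_between_two_char(text_string: str, left_characters: str, right_characters: str) -> dict:
    """Find all non-greedy matches between the left and right characters."""
    # Escape the delimiters so characters like '(' are treated literally.
    pattern = f"{re.escape(left_characters)}(.+?){re.escape(right_characters)}"
    found = re.findall(pattern, text_string)
    return {
        "found": found,
        "matched_found": len(found),
        "pattern_parameters": {
            "left_character": re.escape(left_characters),
            "right_character": re.escape(right_characters),
            "regex_pattern": pattern,
            "text_string": text_string,
        },
    }
```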
Source code in dsg_lib/common_functions/patterns.py
Reference¶
dsg_lib.common_functions.calendar_functions
¶
This module provides two main functions to convert between month numbers and their corresponding names.
Functions:
Name | Description
---|---
get_month(month: int) -> str | Converts an integer month number to its corresponding month name. Args: month (int): An integer between 1 and 12 representing the month number. Returns: str: The full name of the month corresponding to the input month number. If the input is not within the range of 1-12, returns "Invalid month number". If the input is not an integer, returns "Invalid input, integer is required".
get_month_number(month_name: str) -> int | Converts a month name to its corresponding month number. Args: month_name (str): A string containing the full name of a month. Returns: int: The month number corresponding to the input month name. If the input is not a valid month name, returns -1. If the input is not a string, returns "Invalid input, string is required".
Example:

```python
from dsg_lib.common_functions.calendar_functions import get_month, get_month_number

print(get_month(1))                 # Outputs: 'January'
print(get_month_number('January'))  # Outputs: 1
```
This module is part of the dsg_lib package and is used for handling and converting between month numbers and names.
Author: Mike Ryan Date: 2024/05/16 License: MIT
get_month(month)¶
Converts an integer month number to its corresponding month name.
Parameters:

Name | Type | Description | Default |
---|---|---|---|
`month` | `int` | An integer or integer-like float between 1 and 12 representing the month number. | *required* |
Returns:

Name | Type | Description |
---|---|---|
`str` | | The full name of the month corresponding to the input month number. If the input is not within the range of 1-12, returns "Invalid month number". If the input is not an integer or integer-like float, returns "Invalid input, integer is required". |
Source code in dsg_lib/common_functions/calendar_functions.py, lines 58-106.
get_month_number(month_name)¶
Converts a month name to its corresponding month number.
Parameters:

Name | Type | Description | Default |
---|---|---|---|
`month_name` | `str` | A string containing the full name of a month. | *required* |
Returns:

Name | Type | Description |
---|---|---|
`int` | | The month number corresponding to the input month name. If the input is not a valid month name or not a string, returns -1. |
Source code in dsg_lib/common_functions/calendar_functions.py, lines 109-150.
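The conversions above can be approximated with the standard library's `calendar` module. The sketch below mirrors the documented return values; the float handling and the exact error strings are taken from the descriptions above, and the function names are hypothetical stand-ins, not dsg_lib's implementation:

```python
import calendar

def month_name_from_number(month):
    # Hypothetical re-implementation of get_month's documented behavior.
    if isinstance(month, float) and month.is_integer():
        month = int(month)  # accept integer-like floats such as 3.0
    if isinstance(month, bool) or not isinstance(month, int):
        return "Invalid input, integer is required"
    if not 1 <= month <= 12:
        return "Invalid month number"
    return calendar.month_name[month]

def month_number_from_name(month_name):
    # Hypothetical re-implementation of get_month_number's documented behavior.
    if not isinstance(month_name, str):
        return -1
    names = {name.lower(): i for i, name in enumerate(calendar.month_name) if name}
    return names.get(month_name.strip().lower(), -1)

print(month_name_from_number(1))        # January
print(month_number_from_name("March"))  # 3
```

`calendar.month_name` is a locale-aware sequence whose index 0 is an empty string, which is why the lookup dictionary skips falsy names.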
Ended: Common Functions
FastAPI Functions ↵
Reference¶
dsg_lib.fastapi_functions.http_codes¶
http_codes.py
This module provides a dictionary of HTTP status codes and their descriptions.
The dictionary `ALL_HTTP_CODES` contains the HTTP status codes as keys. Each key maps to another dictionary that contains a description of the status code, an extended description, and a link to its documentation on the Mozilla Developer Network (MDN).
Example:

```python
from dsg_lib.fastapi_functions import http_codes

# Get the description, extended description, and link for HTTP status code 200
status_200 = http_codes.ALL_HTTP_CODES[200]
print(status_200)
# {'description': 'OK', 'extended_description': 'The request has succeeded', 'link': 'https://developer.mozilla.org/en-US/docs/Web/HTTP/Status/200'}
```
Attributes:

Name | Type | Description |
---|---|---|
`ALL_HTTP_CODES` | `dict` | A dictionary of HTTP status codes. Each key is an HTTP status code, and each value holds the code's description, extended description, and documentation link. |
Author: Mike Ryan Date: 2024/05/16 License: MIT
DELETE_CODES = generate_code_dict(common_codes + [202, 204, 205, 409])
module-attribute¶

DELETE_CODES is a dictionary of HTTP status codes for DELETE requests. It includes all the common codes, plus some additional codes that are specific to DELETE requests.

Example:

```python
from dsg_lib.fastapi_functions import http_codes

# Print the dictionary of HTTP status codes for DELETE requests
print(http_codes.DELETE_CODES)
```
GET_CODES = generate_code_dict(common_codes + [206, 304, 307, 410, 502])
module-attribute¶

GET_CODES is a dictionary of HTTP status codes for GET requests. It includes all the common codes, plus some additional codes that are specific to GET requests.

Example:

```python
from dsg_lib.fastapi_functions import http_codes

# Print the dictionary of HTTP status codes for GET requests
print(http_codes.GET_CODES)
```
PATCH_CODES = generate_code_dict(common_codes + [202, 204, 206, 409, 412, 413])
module-attribute¶

PATCH_CODES is a dictionary of HTTP status codes for PATCH requests. It includes all the common codes, plus some additional codes that are specific to PATCH requests.

Example:

```python
from dsg_lib.fastapi_functions import http_codes

# Print the dictionary of HTTP status codes for PATCH requests
print(http_codes.PATCH_CODES)
```
POST_CODES = generate_code_dict(common_codes + [201, 202, 205, 307, 409, 413, 415])
module-attribute¶

POST_CODES is a dictionary of HTTP status codes for POST requests. It includes all the common codes, plus some additional codes that are specific to POST requests.

Example:

```python
from dsg_lib.fastapi_functions import http_codes

# Print the dictionary of HTTP status codes for POST requests
print(http_codes.POST_CODES)
```
PUT_CODES = generate_code_dict(common_codes + [202, 204, 206, 409, 412, 413])
module-attribute¶

PUT_CODES is a dictionary of HTTP status codes for PUT requests. It includes all the common codes, plus some additional codes that are specific to PUT requests.

Example:

```python
from dsg_lib.fastapi_functions import http_codes

# Print the dictionary of HTTP status codes for PUT requests
print(http_codes.PUT_CODES)
```
generate_code_dict(codes, description_only=False)¶
Generate a dictionary of specific HTTP error codes from the http_codes dictionary.
This function takes a list of HTTP status codes and an optional boolean flag. Each key in the returned dictionary is an HTTP status code from the input list. If `description_only` is True, each value is the corresponding description string from the `ALL_HTTP_CODES` dictionary; if False, each value is the corresponding full entry from `ALL_HTTP_CODES`.
Parameters:

Name | Type | Description | Default |
---|---|---|---|
`codes` | `list` | A list of HTTP status codes. | *required* |
`description_only` | `bool` | If True, only the description of the codes will be returned. | `False` |
Returns:

Name | Type | Description |
---|---|---|
`dict` | | A dictionary where each key is an HTTP error code from the input list and each value depends on the `description_only` parameter. If `description_only` is True, the value is the description string. If `description_only` is False, the value is a dictionary with keys 'description', 'extended_description', and 'link'. |
Example:

```python
from dsg_lib.fastapi_functions import http_codes

# Generate a dictionary for HTTP status codes 200 and 404
status_dict = http_codes.generate_code_dict([200, 404])
print(status_dict)
# {200: {'description': 'OK', 'extended_description': 'The request has succeeded', 'link': 'https://developer.mozilla.org/en-US/docs/Web/HTTP/Status/200'},
#  404: {'description': 'Not Found', 'extended_description': 'The requested resource could not be found', 'link': 'https://developer.mozilla.org/en-US/docs/Web/HTTP/Status/404'}}

# Generate a dictionary for HTTP status codes 200 and 404 with only descriptions
status_dict = http_codes.generate_code_dict([200, 404], description_only=True)
print(status_dict)  # {200: 'OK', 404: 'Not Found'}
```
Source code in dsg_lib/fastapi_functions/http_codes.py, lines 49-112.
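Conceptually, the function is a filtered lookup over `ALL_HTTP_CODES`. A minimal sketch of that idea with a hypothetical two-entry stand-in dictionary (the real `ALL_HTTP_CODES` covers far more codes, and `code_dict` is not dsg_lib's API):

```python
# Stand-in subset of ALL_HTTP_CODES, for illustration only.
HTTP_CODES = {
    200: {"description": "OK",
          "extended_description": "The request has succeeded",
          "link": "https://developer.mozilla.org/en-US/docs/Web/HTTP/Status/200"},
    404: {"description": "Not Found",
          "extended_description": "The requested resource could not be found",
          "link": "https://developer.mozilla.org/en-US/docs/Web/HTTP/Status/404"},
}

def code_dict(codes, description_only=False):
    # Sketch of the documented behavior: select the requested codes and,
    # optionally, reduce each entry to just its description string.
    if description_only:
        return {code: HTTP_CODES[code]["description"] for code in codes}
    return {code: HTTP_CODES[code] for code in codes}

print(code_dict([200, 404], description_only=True))  # {200: 'OK', 404: 'Not Found'}
```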
dsg_lib.fastapi_functions._all_codes¶
This module contains a dictionary mapping HTTP status codes to their descriptions, extended descriptions, and links to their documentation.
Each key in this dictionary is an HTTP status code, and each value is another dictionary with keys 'description', 'extended_description', and 'link'.
The 'description' key maps to a brief string that describes the HTTP status code. The 'extended_description' key maps to a more detailed explanation of the status code. The 'link' key maps to a string that is a link to the documentation for the HTTP status code.
Example

```python
from dsg_lib.fastapi_functions.http_codes import ALL_HTTP_CODES

# Get the dictionary for HTTP status code 200
status_200 = ALL_HTTP_CODES[200]
print(status_200)
# Output: {'description': 'OK', 'extended_description': 'The request has succeeded.', 'link': 'https://developer.mozilla.org/en-US/docs/Web/HTTP/Status/200'}

# Get the description for HTTP status code 404
description_404 = ALL_HTTP_CODES[404]['description']
print(description_404)  # Output: 'Not Found'

# Get the extended description for HTTP status code 200
extended_description_200 = ALL_HTTP_CODES[200]['extended_description']
print(extended_description_200)  # Output: 'The request has succeeded.'

# Get the link to the documentation for HTTP status code 500
link_500 = ALL_HTTP_CODES[500]['link']
print(link_500)  # Output: 'https://developer.mozilla.org/en-US/docs/Web/HTTP/Status/500'
```
Author: Mike Ryan Date: 2024/05/16 License: MIT
Reference¶
dsg_lib.fastapi_functions.system_health_endpoints¶
This module provides a configurable health endpoint for a FastAPI application. It includes the following routes:
- `/api/health/status`: Returns the status of the application. If the application is running, it will return `{"status": "UP"}`. This endpoint can be enabled or disabled using the configuration.
- `/api/health/uptime`: Returns the uptime of the application in a dictionary with the keys "Days", "Hours", "Minutes", and "Seconds". The uptime is calculated from the time the application was started. This endpoint can be enabled or disabled using the configuration.
- `/api/health/heapdump`: Returns a heap dump of the application. The heap dump is a list of dictionaries, each representing a line of code. Each dictionary includes the filename, line number, size of memory consumed, and the number of times the line is referenced. This endpoint can be enabled or disabled using the configuration.
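The uptime payload described above can be derived from a stored start time with simple `divmod` arithmetic. A sketch of the idea, where the helper name and the two-decimal rounding are assumptions rather than dsg_lib's actual implementation:

```python
import time

def uptime_payload(start_time, now=None):
    # Break elapsed seconds into days / hours / minutes / seconds,
    # matching the documented /api/health/uptime response shape.
    elapsed = (time.time() if now is None else now) - start_time
    days, rem = divmod(elapsed, 86400)
    hours, rem = divmod(rem, 3600)
    minutes, seconds = divmod(rem, 60)
    return {"uptime": {"Days": int(days), "Hours": int(hours),
                       "Minutes": int(minutes), "Seconds": round(seconds, 2)}}

print(uptime_payload(start_time=0.0, now=102.17))
# {'uptime': {'Days': 0, 'Hours': 0, 'Minutes': 1, 'Seconds': 42.17}}
```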
The module uses the `FastAPI`, `time`, `tracemalloc`, `loguru`, `packaging`, and `dsg_lib.fastapi.http_codes` modules.
Functions:

Name | Description |
---|---|
`create_health_router(config: dict) -> FastAPI.APIRouter` | Creates a FastAPI router with health endpoints based on the provided configuration. |
Example

```python
from fastapi import FastAPI
from fastapi.testclient import TestClient

from dsg_lib.fastapi_functions import system_health_endpoints

app = FastAPI()

# User configuration
config = {
    "enable_status_endpoint": True,
    "enable_uptime_endpoint": True,
    "enable_heapdump_endpoint": True,
}

# Health router
health_router = system_health_endpoints.create_health_router(config)
app.include_router(health_router, prefix="/api/health", tags=["system-health"])

client = TestClient(app)  # test client so the requests below are runnable

# Get the status of the application
response = client.get("/api/health/status")
print(response.json())  # {"status": "UP"}

# Get the uptime of the application
response = client.get("/api/health/uptime")
print(response.json())
# {"uptime": {"Days": 0, "Hours": 0, "Minutes": 1, "Seconds": 42.17}}

# Get the heap dump of the application
response = client.get("/api/health/heapdump")
print(response.json())
# {"memory_use": {"current": "123456", "peak": "789012"}, "heap_dump": [{"filename": "main.py", "lineno": 10, "size": 1234, "count": 1}, ...]}
```
Author: Mike Ryan Date: 2024/05/16 License: MIT
create_health_router(config)¶
Create a health router with the following endpoints:

- `/status`: Returns the status of the application. This endpoint can be enabled or disabled using the `enable_status_endpoint` key in the configuration.
- `/uptime`: Returns the uptime of the application. This endpoint can be enabled or disabled using the `enable_uptime_endpoint` key in the configuration.
- `/heapdump`: Returns a heap dump of the application. This endpoint can be enabled or disabled using the `enable_heapdump_endpoint` key in the configuration.
Parameters:

Name | Type | Description | Default |
---|---|---|---|
`config` | `dict` | A dictionary with the configuration for the endpoints. | *required* |
Returns:

Name | Type | Description |
---|---|---|
`APIRouter` | | A FastAPI router with the configured endpoints. |
Example

```python
from fastapi import FastAPI
from fastapi.testclient import TestClient

from dsg_lib.fastapi_functions import system_health_endpoints

app = FastAPI()

# User configuration
config = {
    "enable_status_endpoint": True,
    "enable_uptime_endpoint": True,
    "enable_heapdump_endpoint": True,
}

# Health router
health_router = system_health_endpoints.create_health_router(config)
app.include_router(health_router, prefix="/api/health", tags=["system-health"])

client = TestClient(app)  # test client so the requests below are runnable

# Get the status of the application
response = client.get("/api/health/status")
print(response.json())  # {"status": "UP"}

# Get the uptime of the application
response = client.get("/api/health/uptime")
print(response.json())
# {"uptime": {"Days": 0, "Hours": 0, "Minutes": 1, "Seconds": 42.17}}

# Get the heap dump of the application
response = client.get("/api/health/heapdump")
print(response.json())
# {"memory_use": {"current": "123456", "peak": "789012"}, "heap_dump": [{"filename": "main.py", "lineno": 10, "size": 1234, "count": 1}, ...]}
```
Source code in dsg_lib/fastapi_functions/system_health_endpoints.py, lines 82-399.
Ended: FastAPI Functions
Database Functions ↵
Reference¶
dsg_lib.async_database_functions.base_schema¶
This module defines the base schema for database models in the application.
The module uses SQLAlchemy as the ORM and provides a `SchemaBase` class that all other models should inherit from. The `SchemaBase` class includes common columns that are needed for most models, like `pkid`, `date_created`, and `date_updated`.
- `pkid`: A unique identifier for each record. It's a string representation of a UUID.
- `date_created`: The date and time when a particular row was inserted into the table. It defaults to the current UTC time when the instance is created.
- `date_updated`: The date and time when a particular row was last updated. It defaults to the current UTC time whenever the instance is updated.

To create a new database model, import this module and extend the `SchemaBase` class.
Example:

```python
from dsg_lib.async_database_functions import base_schema

class MyModel(base_schema.SchemaBaseSQLite):
    # Define your model-specific columns here
    my_column = base_schema.Column(base_schema.String(50))
```
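The semantics of the three common columns can be illustrated without SQLAlchemy. The plain dataclass below mimics the documented defaults; it is an analogy only (the class name `RecordBase` and the `touch` helper are hypothetical, and the real library defines these as SQLAlchemy columns):

```python
import uuid
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class RecordBase:
    # pkid: string representation of a UUID, generated per record.
    pkid: str = field(default_factory=lambda: str(uuid.uuid4()))
    # date_created / date_updated: default to the current UTC time on creation.
    date_created: datetime = field(default_factory=lambda: datetime.now(timezone.utc))
    date_updated: datetime = field(default_factory=lambda: datetime.now(timezone.utc))

    def touch(self):
        # In SQLAlchemy this would be an onupdate= default; here it is manual.
        self.date_updated = datetime.now(timezone.utc)

record = RecordBase()
print(len(record.pkid))  # 36, the length of a canonical UUID string
```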
Author: Mike Ryan Date: 2024/05/16 License: MIT
SchemaBaseCockroachDB¶

This class provides a base schema that includes common columns for most models when using a CockroachDB database. CockroachDB uses the same syntax as PostgreSQL. All other models should inherit from this class.

Attributes:

Name | Type | Description |
---|---|---|
`pkid` | `str` | A unique identifier for each record. It's a string representation of a UUID. |
`date_created` | `datetime` | The date and time when a particular row was inserted into the table; defaults to the current UTC time on creation. |
`date_updated` | `datetime` | The date and time when a particular row was last updated; defaults to the current UTC time on update. |

Example:

```python
from dsg_lib.async_database_functions import base_schema
from sqlalchemy.orm import declarative_base

BASE = declarative_base()

class MyModel(base_schema.SchemaBaseCockroachDB, BASE):
    # Define your model-specific columns here
    my_column = base_schema.Column(base_schema.String(50))
```

Source code in dsg_lib/async_database_functions/base_schema.py, lines 486-542.
SchemaBaseFirebird¶

This class provides a base schema that includes common columns for most models when using a Firebird database. All other models should inherit from this class.

Attributes:

Name | Type | Description |
---|---|---|
`pkid` | `str` | A unique identifier for each record. It's a string representation of a UUID. |
`date_created` | `datetime` | The date and time when a particular row was inserted into the table; defaults to the current UTC time on creation. |
`date_updated` | `datetime` | The date and time when a particular row was last updated; defaults to the current UTC time on update. |

Example:

```python
from dsg_lib.async_database_functions import base_schema
from sqlalchemy.orm import declarative_base

BASE = declarative_base()

class MyModel(base_schema.SchemaBaseFirebird, BASE):
    # Define your model-specific columns here
    my_column = base_schema.Column(base_schema.String(50))
```

Source code in dsg_lib/async_database_functions/base_schema.py, lines 368-424.
SchemaBaseMSSQL¶

This class provides a base schema that includes common columns for most models when using a Microsoft SQL Server database. All other models should inherit from this class.

Attributes:

Name | Type | Description |
---|---|---|
`pkid` | `str` | A unique identifier for each record. It's a string representation of a UUID. |
`date_created` | `datetime` | The date and time when a particular row was inserted into the table; defaults to the current UTC time on creation. |
`date_updated` | `datetime` | The date and time when a particular row was last updated; defaults to the current UTC time on update. |

Example:

```python
from dsg_lib.async_database_functions import base_schema
from sqlalchemy.orm import declarative_base

BASE = declarative_base()

class MyModel(base_schema.SchemaBaseMSSQL, BASE):
    # Define your model-specific columns here
    my_column = base_schema.Column(base_schema.String(50))
```

Source code in dsg_lib/async_database_functions/base_schema.py, lines 309-365.
SchemaBaseMySQL¶

This class provides a base schema that includes common columns for most models when using a MySQL database. All other models should inherit from this class.

Attributes:

Name | Type | Description |
---|---|---|
`pkid` | `str` | A unique identifier for each record. It's a string representation of a UUID. |
`date_created` | `datetime` | The date and time when a particular row was inserted into the table; defaults to the current UTC time on creation. |
`date_updated` | `datetime` | The date and time when a particular row was last updated; defaults to the current UTC time on update. |

Example:

```python
from dsg_lib.async_database_functions import base_schema
from sqlalchemy.orm import declarative_base

BASE = declarative_base()

class MyModel(base_schema.SchemaBaseMySQL, BASE):
    # Define your model-specific columns here
    my_column = base_schema.Column(base_schema.String(50))
```

Source code in dsg_lib/async_database_functions/base_schema.py, lines 191-247.
SchemaBaseOracle¶

This class provides a base schema that includes common columns for most models when using an Oracle database. All other models should inherit from this class.

Attributes:

Name | Type | Description |
---|---|---|
`pkid` | `str` | A unique identifier for each record. It's a string representation of a UUID. |
`date_created` | `datetime` | The date and time when a particular row was inserted into the table; defaults to the current UTC time on creation. |
`date_updated` | `datetime` | The date and time when a particular row was last updated; defaults to the current UTC time on update. |

Example:

```python
from dsg_lib.async_database_functions import base_schema
from sqlalchemy.orm import declarative_base

BASE = declarative_base()

class MyModel(base_schema.SchemaBaseOracle, BASE):
    # Define your model-specific columns here
    my_column = base_schema.Column(base_schema.String(50))
```

Source code in dsg_lib/async_database_functions/base_schema.py, lines 250-306.
SchemaBasePostgres¶

This class provides a base schema that includes common columns for most models when using a PostgreSQL database. All other models should inherit from this class.

Attributes:

Name | Type | Description |
---|---|---|
`pkid` | `str` | A unique identifier for each record. It's a string representation of a UUID. |
`date_created` | `datetime` | The date and time when a particular row was inserted into the table; defaults to the current UTC time on creation. |
`date_updated` | `datetime` | The date and time when a particular row was last updated; defaults to the current UTC time on update. |

Example:

```python
from dsg_lib.async_database_functions import base_schema
from sqlalchemy.orm import declarative_base

BASE = declarative_base()

class MyModel(base_schema.SchemaBasePostgres, BASE):
    # Define your model-specific columns here
    my_column = base_schema.Column(base_schema.String(50))
```

Source code in dsg_lib/async_database_functions/base_schema.py, lines 132-188.
SchemaBaseSQLite¶

This class provides a base schema that includes common columns for most models when using a SQLite database. All other models should inherit from this class.

Attributes:

Name | Type | Description |
---|---|---|
`pkid` | `str` | A unique identifier for each record. It's a string representation of a UUID. |
`date_created` | `datetime` | The date and time when a particular row was inserted into the table; defaults to the current UTC time on creation. |
`date_updated` | `datetime` | The date and time when a particular row was last updated; defaults to the current UTC time on update. |

Example:

```python
from dsg_lib.async_database_functions import base_schema
from sqlalchemy.orm import declarative_base

BASE = declarative_base()

class MyModel(base_schema.SchemaBaseSQLite, BASE):
    # Define your model-specific columns here
    my_column = base_schema.Column(base_schema.String(50))
```

Source code in dsg_lib/async_database_functions/base_schema.py, lines 74-129.
SchemaBaseSybase¶

This class provides a base schema that includes common columns for most models when using a Sybase database. All other models should inherit from this class.

Attributes:

Name | Type | Description |
---|---|---|
`pkid` | `str` | A unique identifier for each record. It's a string representation of a UUID. |
`date_created` | `datetime` | The date and time when a particular row was inserted into the table; defaults to the current UTC time on creation. |
`date_updated` | `datetime` | The date and time when a particular row was last updated; defaults to the current UTC time on update. |

Example:

```python
from dsg_lib.async_database_functions import base_schema
from sqlalchemy.orm import declarative_base

BASE = declarative_base()

class MyModel(base_schema.SchemaBaseSybase, BASE):
    # Define your model-specific columns here
    my_column = base_schema.Column(base_schema.String(50))
```

Source code in dsg_lib/async_database_functions/base_schema.py, lines 427-483.
Reference¶
dsg_lib.async_database_functions.database_config¶
This module provides classes and functions for managing asynchronous database operations using SQLAlchemy and asyncio.
The main classes are DBConfig, which manages the database configuration and creates a SQLAlchemy engine and a MetaData instance, and AsyncDatabase, which uses an instance of DBConfig to perform asynchronous database operations.
The module also provides a function, import_sqlalchemy, which tries to import SQLAlchemy and its components, and raises an ImportError if SQLAlchemy is not installed or if the installed version is not compatible.
The module uses the logger from `dsg_lib` for logging, and the `time` module for working with times. It also uses the `contextlib` module for creating context managers, and the `typing` module for type hinting.

The `BASE` variable is a base class for declarative database models. It is created using the `declarative_base` function from `sqlalchemy.orm`.

This module is part of the `dsg_lib` package, which provides utilities for working with databases in Python.
Example:

```python
from dsg_lib.async_database_functions import database_config

# Define your database configuration
config = {
    "database_uri": "postgresql+asyncpg://user:password@localhost/dbname",
    "echo": True,
    "future": True,
    "pool_pre_ping": True,
    "pool_size": 5,
    "max_overflow": 10,
    "pool_recycle": 3600,
    "pool_timeout": 30,
}

# Create a DBConfig instance
db_config = database_config.DBConfig(config)

# Use the DBConfig instance to get a database session
async with db_config.get_db_session() as session:
    # Perform your database operations here
    pass
```
Author: Mike Ryan Date: 2024/05/16 License: MIT
DBConfig¶
A class used to manage the database configuration and create a SQLAlchemy engine.
Attributes:

Name | Type | Description |
---|---|---|
`config` | `dict` | A dictionary containing the database configuration. |
`engine` | `Engine` | The SQLAlchemy engine created with the database URI from the config. |
`metadata` | `MetaData` | The SQLAlchemy MetaData instance. |

Create Engine support options by database type. Confirmed by testing: SQLite, PostgreSQL. To be tested: MySQL, Oracle, MSSQL (these should be considered experimental).

Option | SQLite | PostgreSQL | MySQL | Oracle | MSSQL |
---|---|---|---|---|---|
echo | Yes | Yes | Yes | Yes | Yes |
future | Yes | Yes | Yes | Yes | Yes |
pool_pre_ping | Yes | Yes | Yes | Yes | Yes |
pool_size | No | Yes | Yes | Yes | Yes |
max_overflow | No | Yes | Yes | Yes | Yes |
pool_recycle | Yes | Yes | Yes | Yes | Yes |
pool_timeout | No | Yes | Yes | Yes | Yes |
Example:

```python
from dsg_lib.async_database_functions import database_config

# Define your database configuration
config = {
    "database_uri": "postgresql+asyncpg://user:password@localhost/dbname",
    "echo": True,
    "future": True,
    "pool_pre_ping": True,
    "pool_size": 5,
    "max_overflow": 10,
    "pool_recycle": 3600,
    "pool_timeout": 30,
}

# Create a DBConfig instance
db_config = database_config.DBConfig(config)

# Use the DBConfig instance to get a database session
async with db_config.get_db_session() as session:
    # Perform your database operations here
    pass
```
Source code in dsg_lib/async_database_functions/database_config.py, lines 80-255.
__init__(config)¶
Initializes the DBConfig instance with the given database configuration.
The configuration should be a dictionary with the following keys:

- "database_uri": The URI for the database.
- "echo": If True, the engine will log all statements, as well as a `repr()` of their parameter lists, to the engine's logger, which defaults to sys.stdout.
- "future": If True, use the future version of SQLAlchemy, which supports asyncio.
- "pool_pre_ping": If True, the pool will test the connection for liveness upon each checkout.
- "pool_size": The size of the connection pool to be maintained.
- "max_overflow": The number of connections that can be opened above the `pool_size` setting when all other connections are in use.
- "pool_recycle": The number of seconds after which a connection is automatically recycled. This is required for MySQL, which removes connections after 8 hours of idle time by default.
- "pool_timeout": The number of seconds to wait before giving up on getting a connection from the pool.
Parameters:

Name | Type | Description | Default |
---|---|---|---|
`config` | `dict` | A dictionary containing the database configuration. | *required* |
Raises:

Type | Description |
---|---|
| If there are unsupported parameters for the database type. |
Example:

```python
from dsg_lib.async_database_functions import database_config

# Define your database configuration
config = {
    "database_uri": "postgresql+asyncpg://user:password@localhost/dbname",
    "echo": True,
    "future": True,
    "pool_pre_ping": True,
    "pool_size": 5,
    "max_overflow": 10,
    "pool_recycle": 3600,
    "pool_timeout": 30,
}

# Create a DBConfig instance
db_config = database_config.DBConfig(config)
```
Source code in dsg_lib/async_database_functions/database_config.py, lines 137-201.
get_db_session() async¶
This method returns a context manager that provides a new database session.
The session is created using the SQLAlchemy engine from the DBConfig instance, and it does not expire on commit. The session is of type AsyncSession.
This method should be used with the `async with` statement.
Yields:

Name | Type | Description |
---|---|---|
`AsyncSession` | | A new SQLAlchemy asynchronous session. |
Raises:

Type | Description |
---|---|
| If a database error occurs. |
Example:

```python
from dsg_lib.async_database_functions import database_config

# Define your database configuration
config = {
    "database_uri": "postgresql+asyncpg://user:password@localhost/dbname",
    "echo": True,
    "future": True,
    "pool_pre_ping": True,
    "pool_size": 5,
    "max_overflow": 10,
    "pool_recycle": 3600,
    "pool_timeout": 30,
}

# Create a DBConfig instance
db_config = database_config.DBConfig(config)

# Use the DBConfig instance to get a database session
async with db_config.get_db_session() as session:
    # Perform your database operations here
    pass
```
Source code in dsg_lib/async_database_functions/database_config.py, lines 203-255.
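The session lifecycle can be pictured as an `asynccontextmanager` that always closes the session, even when the body raises. The simplified stand-in below is not dsg_lib's code: `FakeSession` replaces SQLAlchemy's `AsyncSession`, and the function name is hypothetical.

```python
import asyncio
from contextlib import asynccontextmanager

class FakeSession:
    # Stand-in for SQLAlchemy's AsyncSession, tracking only its closed state.
    def __init__(self):
        self.closed = False
    async def close(self):
        self.closed = True

@asynccontextmanager
async def get_session():
    session = FakeSession()
    try:
        yield session          # hand the session to the caller's async-with block
    finally:
        await session.close()  # always release it, even if the block raised

async def main():
    async with get_session() as session:
        assert not session.closed  # usable inside the block
    return session

print(asyncio.run(main()).closed)  # True
```

The `try`/`finally` around the `yield` is the essential part: it is what guarantees the close runs on both the success and error paths.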
Reference¶
dsg_lib.async_database_functions.async_database¶
async_database.py.
This module provides classes for managing asynchronous database operations using SQLAlchemy and asyncio.
Classes:

Name | Description |
---|---|
`DBConfig` | Manages the database configuration. |
`AsyncDatabase` | Manages the asynchronous database operations. |
The DBConfig class initializes the database configuration and creates a SQLAlchemy engine and a MetaData instance.
The AsyncDatabase class uses an instance of DBConfig to perform asynchronous database operations. It provides methods to get a database session and to create tables in the database.
This module uses the logger from the dsg_lib.common_functions for logging.
Example:

```python
from dsg_lib.async_database_functions import (
    async_database,
    base_schema,
    database_config,
    database_operations,
)

# Define the database configuration
config = {
    "database_uri": "sqlite+aiosqlite:///:memory:?cache=shared",
    "echo": False,
    "future": True,
    "pool_recycle": 3600,
}

# Create a DBConfig instance
db_config = database_config.DBConfig(config)

# Create an AsyncDatabase instance
async_db = async_database.AsyncDatabase(db_config)

# Create a DatabaseOperations instance
db_ops = database_operations.DatabaseOperations(async_db)
```
Author: Mike Ryan Date: 2024/05/16 License: MIT
AsyncDatabase¶
A class used to manage the asynchronous database operations.
Attributes¶

- `db_config` (DBConfig): An instance of the DBConfig class containing the database configuration.
- `Base`: The declarative base model for SQLAlchemy.

Methods¶

- `get_db_session()`: Returns a context manager that provides a new database session.
- `create_tables()`: Asynchronously creates all tables in the database.
Source code in dsg_lib/async_database_functions/async_database.py, lines 59-139.
__init__(db_config)¶
Initialize the AsyncDatabase class with an instance of DBConfig.
Parameters: db_config (DBConfig): An instance of DBConfig class containing the database configuration.
Returns: None
Source code in dsg_lib/async_database_functions/async_database.py, lines 78-89.
create_tables() async¶
This method asynchronously creates all tables in the database.
Parameters: None
Returns: None
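A minimal startup sketch (the config values and in-memory SQLite URI are illustrative, and the import is kept inside the function so the snippet stays self-contained):

```python
import asyncio

async def init_db():
    # Build the configuration, create all tables registered on the
    # declarative Base, and return the database handle for later use.
    from dsg_lib.async_database_functions import async_database, database_config

    config = {"database_uri": "sqlite+aiosqlite:///:memory:?cache=shared"}
    async_db = async_database.AsyncDatabase(database_config.DBConfig(config))
    await async_db.create_tables()
    return async_db

# Typically run once at application startup:
# asyncio.run(init_db())
```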
Source code in dsg_lib/async_database_functions/async_database.py
disconnect() async¶
This method asynchronously disconnects the database engine.
Parameters: None
Returns: None
Source code in dsg_lib/async_database_functions/async_database.py
get_db_session()¶
This method returns a context manager that provides a new database session.
Parameters: None
Returns: contextlib._GeneratorContextManager: A context manager that provides a new database session.
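A sketch of typical session usage. This assumes the returned manager is entered with `async with`, as is usual for async sessions; `User` is a hypothetical model, and the helper is illustrative rather than part of the library:

```python
import asyncio

async def list_users(async_db):
    # Hypothetical read helper: open a session, run a select, return rows.
    from sqlalchemy import select

    async with async_db.get_db_session() as session:
        result = await session.execute(select(User))  # User: hypothetical model
        return result.scalars().all()
```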
Source code in dsg_lib/async_database_functions/async_database.py
Reference¶
Configuration Matrix¶
Create-engine options supported by database type. Confirmed by testing: SQLite, PostgreSQL. To be tested: MySQL, Oracle, MSSQL (support for these should be considered experimental).
Option | SQLite | PostgreSQL | MySQL | Oracle | MSSQL |
---|---|---|---|---|---|
echo | Yes | Yes | Yes | Yes | Yes |
future | Yes | Yes | Yes | Yes | Yes |
pool_pre_ping | Yes | Yes | Yes | Yes | Yes |
pool_size | No | Yes | Yes | Yes | Yes |
max_overflow | No | Yes | Yes | Yes | Yes |
pool_recycle | Yes | Yes | Yes | Yes | Yes |
pool_timeout | No | Yes | Yes | Yes | Yes |
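As a concrete reading of the matrix, a PostgreSQL configuration can enable the pooling options that SQLite ignores (the URI and values here are illustrative):

```python
# Illustrative PostgreSQL config using the pooling options from the matrix above.
# SQLite configs should omit pool_size/max_overflow/pool_timeout (unsupported).
pg_config = {
    "database_uri": "postgresql+asyncpg://user:password@localhost/mydb",
    "echo": False,          # log SQL statements
    "future": True,         # SQLAlchemy 2.0-style behavior
    "pool_pre_ping": True,  # test connections before use
    "pool_size": 10,        # persistent connections kept open
    "max_overflow": 10,     # extra connections allowed under load
    "pool_recycle": 3600,   # recycle connections after an hour
    "pool_timeout": 30,     # seconds to wait for a free connection
}
```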
dsg_lib.async_database_functions.database_operations¶
This module provides the DatabaseOperations
class for performing CRUD operations on a database using SQLAlchemy's asynchronous session.
The DatabaseOperations
class includes the following methods:
- `execute_one`: Executes a single non-read SQL query asynchronously.
- `execute_many`: Executes multiple non-read SQL queries asynchronously within a single transaction.
- `read_one_record`: Retrieves a single record from the database based on the provided query.
- `read_query`: Executes a fetch query on the database and returns a list of records that match the query.
- `read_multi_query`: Executes multiple fetch queries on the database and returns a dictionary of results for each query.
- `count_query`: Counts the number of records that match a given query.
- `get_columns_details`: Gets the details of the columns in a table.
- `get_primary_keys`: Gets the primary keys of a table.
- `get_table_names`: Gets the names of all tables in the database.
Deprecated Methods:
- `create_one`: [Deprecated] Use `execute_one` with an INSERT query instead.
- `create_many`: [Deprecated] Use `execute_many` with INSERT queries instead.
- `update_one`: [Deprecated] Use `execute_one` with an UPDATE query instead.
- `update_many`: [Deprecated] Use `execute_many` with UPDATE queries instead.
- `delete_one`: [Deprecated] Use `execute_one` with a DELETE query instead.
- `delete_many`: [Deprecated] Use `execute_many` with DELETE queries instead.
Each method is designed to handle errors correctly and provide a simple interface for performing database operations.
This module also imports the necessary SQLAlchemy and loguru modules, and the AsyncDatabase
class from the local async_database
module.
Author: Mike Ryan Date: 2024/11/29 License: MIT
DatabaseOperations¶
This class provides methods for performing CRUD operations on a database using SQLAlchemy's asynchronous session.
The methods include:
- `execute_one`: Executes a single non-read SQL query asynchronously.
- `execute_many`: Executes multiple non-read SQL queries asynchronously within a single transaction.
- `read_one_record`: Retrieves a single record from the database based on the provided query.
- `read_query`: Executes a fetch query on the database and returns a list of records that match the query.
- `read_multi_query`: Executes multiple fetch queries on the database and returns a dictionary of results for each query.
- `count_query`: Counts the number of records that match a given query.
- `get_columns_details`: Gets the details of the columns in a table.
- `get_primary_keys`: Gets the primary keys of a table.
- `get_table_names`: Gets the names of all tables in the database.
Deprecated Methods:
- `create_one`: [Deprecated] Use `execute_one` with an INSERT query instead.
- `create_many`: [Deprecated] Use `execute_many` with INSERT queries instead.
- `update_one`: [Deprecated] Use `execute_one` with an UPDATE query instead.
- `delete_one`: [Deprecated] Use `execute_one` with a DELETE query instead.
- `delete_many`: [Deprecated] Use `execute_many` with DELETE queries instead.
Examples:
from sqlalchemy import insert, select
from dsg_lib.async_database_functions import (
async_database,
base_schema,
database_config,
database_operations,
)
# Create a DBConfig instance
config = {
"database_uri": "sqlite+aiosqlite:///:memory:?cache=shared",
"echo": False,
"future": True,
"pool_recycle": 3600,
}
# create database configuration
db_config = database_config.DBConfig(config)
# Create an AsyncDatabase instance
async_db = async_database.AsyncDatabase(db_config)
# Create a DatabaseOperations instance
db_ops = database_operations.DatabaseOperations(async_db)
# create one record
query = insert(User).values(name='John Doe')
result = await db_ops.execute_one(query)
# read matching records (read_query returns a list)
query = select(User).where(User.name == 'John Doe')
records = await db_ops.read_query(query)
Source code in dsg_lib/async_database_functions/database_operations.py
__init__(async_db)¶
Initializes a new instance of the DatabaseOperations class.
Parameters: async_db (AsyncDatabase): An instance of the AsyncDatabase class.
Example:
from dsg_lib.async_database_functions import (
async_database,
base_schema,
database_config,
database_operations,
)
config = {
# "database_uri": "postgresql+asyncpg://postgres:postgres@postgresdb/postgres",
"database_uri": "sqlite+aiosqlite:///:memory:?cache=shared",
"echo": False,
"future": True,
# "pool_pre_ping": True,
# "pool_size": 10,
# "max_overflow": 10,
"pool_recycle": 3600,
# "pool_timeout": 30,
}
db_config = database_config.DBConfig(config)
async_db = async_database.AsyncDatabase(db_config)
db_ops = database_operations.DatabaseOperations(async_db)
Source code in dsg_lib/async_database_functions/database_operations.py
count_query(query) async¶
Executes a count query on the database and returns the number of records that match the query.
This asynchronous method accepts a SQLAlchemy Select
query object and
returns the count of records that match the query. This is particularly
useful for getting the total number of records that satisfy certain
conditions without actually fetching the records themselves.
Parameters: query (Select): A SQLAlchemy `Select` query object representing the count query.

Returns: int: The number of records that match the query.

Raises: An exception if any error occurs during the execution of the query.
Example
from sqlalchemy import select
from dsg_lib.async_database_functions import (
async_database,
base_schema,
database_config,
database_operations,
)
# Create a DBConfig instance
config = {
# "database_uri": "postgresql+asyncpg://postgres:postgres@postgresdb/postgres",
"database_uri": "sqlite+aiosqlite:///:memory:?cache=shared",
"echo": False,
"future": True,
# "pool_pre_ping": True,
# "pool_size": 10,
# "max_overflow": 10,
"pool_recycle": 3600,
# "pool_timeout": 30,
}
# create database configuration
db_config = database_config.DBConfig(config)
# Create an AsyncDatabase instance
async_db = async_database.AsyncDatabase(db_config)
# Create a DatabaseOperations instance
db_ops = database_operations.DatabaseOperations(async_db)
# count query
count = await db_ops.count_query(select(User).where(User.age > 30))
Source code in dsg_lib/async_database_functions/database_operations.py
create_many(records) async¶
This method is deprecated. Use execute_many
with INSERT queries instead.
Adds multiple records to the database.
This asynchronous method accepts a list of record objects and adds them to the database. If the operation is successful, it returns the added records. This method is useful for bulk inserting multiple rows into a database table efficiently.
Parameters: records (list[Base]): A list of instances of the SQLAlchemy declarative base class to add to the database.

Returns: list[Base]: A list of instances of the records that were added to the database.

Raises: An exception if any error occurs during the database operation.
Example
from dsg_lib.async_database_functions import (
async_database,
base_schema,
database_config,
database_operations,
)
# Create a DBConfig instance
config = {
# "database_uri": "postgresql+asyncpg://postgres:postgres@postgresdb/postgres",
"database_uri": "sqlite+aiosqlite:///:memory:?cache=shared",
"echo": False,
"future": True,
# "pool_pre_ping": True,
# "pool_size": 10,
# "max_overflow": 10,
"pool_recycle": 3600,
# "pool_timeout": 30,
}
# create database configuration
db_config = database_config.DBConfig(config)
# Create an AsyncDatabase instance
async_db = async_database.AsyncDatabase(db_config)
# Create a DatabaseOperations instance
db_ops = database_operations.DatabaseOperations(async_db)
# create many records
records = await db_ops.create_many([User(name='John Doe'), User(name='Jane Doe')])
Source code in dsg_lib/async_database_functions/database_operations.py
create_one(record) async¶
This method is deprecated. Use execute_one
with an INSERT query instead.
Adds a single record to the database.
This asynchronous method accepts a record object and adds it to the database. If the operation is successful, it returns the added record. The method is useful for inserting a new row into a database table.
Parameters: record (Base): An instance of the SQLAlchemy declarative base class to add to the database.

Returns: Base: The instance of the record that was added to the database.

Raises: An exception if any error occurs during the database operation.
Example
from dsg_lib.async_database_functions import (
async_database,
base_schema,
database_config,
database_operations,
)
# Create a DBConfig instance
config = {
# "database_uri": "postgresql+asyncpg://postgres:postgres@postgresdb/postgres",
"database_uri": "sqlite+aiosqlite:///:memory:?cache=shared",
"echo": False,
"future": True,
# "pool_pre_ping": True,
# "pool_size": 10,
# "max_overflow": 10,
"pool_recycle": 3600,
# "pool_timeout": 30,
}
# create database configuration
db_config = database_config.DBConfig(config)
# Create an AsyncDatabase instance
async_db = async_database.AsyncDatabase(db_config)
# Create a DatabaseOperations instance
db_ops = database_operations.DatabaseOperations(async_db)
# create one record
record = await db_ops.create_one(User(name='John Doe'))
Source code in dsg_lib/async_database_functions/database_operations.py
delete_many(table, id_column_name='pkid', id_values=None) async¶
This method is deprecated. Use execute_many
with a DELETE query instead.
Deletes multiple records from the specified table in the database.
This method takes a table, an optional id column name, and a list of id values. It deletes the records in the table where the id column matches any of the id values in the list.
Parameters:
- table (Table): The table from which to delete records.
- id_column_name (str): The name of the id column in the table. Defaults to 'pkid'.
- id_values (list): A list of id values for the records to delete. Defaults to None.

Returns: int: The number of records deleted from the table.
Example:
from dsg_lib.async_database_functions import (
async_database,
base_schema,
database_config,
database_operations,
)
# Create a DBConfig instance
config = {
"database_uri": "sqlite+aiosqlite:///:memory:?cache=shared",
"echo": False,
"future": True,
"pool_recycle": 3600,
}
# create database configuration
db_config = database_config.DBConfig(config)
# Create an AsyncDatabase instance
async_db = async_database.AsyncDatabase(db_config)
# Create a DatabaseOperations instance
db_ops = database_operations.DatabaseOperations(async_db)
# Delete multiple records
deleted_count = await db_ops.delete_many(User, 'id', [1, 2, 3])
print(f"Deleted {deleted_count} records.")
Source code in dsg_lib/async_database_functions/database_operations.py
delete_one(table, record_id) async¶
This method is deprecated. Use execute_one
with a DELETE query instead.
Deletes a single record from the database based on the provided table and record ID.
This asynchronous method accepts a SQLAlchemy Table
object and a
record ID. It attempts to delete the record with the given ID from the
specified table. If the record is successfully deleted, it returns a
success message. If no record with the given ID is found, it returns an
error message.
Parameters:
- table (Table): An instance of the SQLAlchemy `Table` class from which the record will be deleted.
- record_id (str): The ID of the record to be deleted.

Returns: dict: A dictionary containing a success message if the record was deleted successfully, or an error message if the record was not found or an exception occurred.

Raises: An exception if any error occurs during the delete operation.
Example
from dsg_lib.async_database_functions import (
async_database,
base_schema,
database_config,
database_operations,
)
# Create a DBConfig instance
config = {
# "database_uri": "postgresql+asyncpg://postgres:postgres@postgresdb/postgres",
"database_uri": "sqlite+aiosqlite:///:memory:?cache=shared",
"echo": False,
"future": True,
# "pool_pre_ping": True,
# "pool_size": 10,
# "max_overflow": 10,
"pool_recycle": 3600,
# "pool_timeout": 30,
}
# create database configuration
db_config = database_config.DBConfig(config)
# Create an AsyncDatabase instance
async_db = async_database.AsyncDatabase(db_config)
# Create a DatabaseOperations instance
db_ops = database_operations.DatabaseOperations(async_db)
# delete one record
result = await db_ops.delete_one(User, 1)
Source code in dsg_lib/async_database_functions/database_operations.py
execute_many(queries) async¶
Executes multiple non-read SQL queries asynchronously within a single transaction.
This method executes a list of SQL statements that modify the database, such as multiple INSERTs, UPDATEs, or DELETEs. All queries are executed within the same transaction, which is committed if all succeed, or rolled back if any fail.
Parameters: queries (list[tuple]): A list of tuples, each containing a query and an optional dictionary of parameter values. Each tuple should be of the form `(query, values)`, where `values` may be None.

Returns: Union[str, Dict[str, str]]: "complete" if all queries executed and committed successfully, or an error dictionary if an exception occurred.
Example
from sqlalchemy import insert
queries = [
(insert(User), {'name': 'User1'}),
(insert(User), {'name': 'User2'}),
(insert(User), {'name': 'User3'}),
]
result = await db_ops.execute_many(queries)
Source code in dsg_lib/async_database_functions/database_operations.py
execute_one(query, values=None) async¶
Executes a single non-read SQL query asynchronously.
This method executes a single SQL statement that modifies the database, such as INSERT, UPDATE, or DELETE. It handles the execution within an asynchronous session and commits the transaction upon success.
Parameters:
- query: An SQLAlchemy query object representing the SQL statement to execute.
- values (dict): A dictionary of parameter values to bind to the query. Defaults to None.

Returns: Union[str, Dict[str, str]]: "complete" if the query executed and committed successfully, or an error dictionary if an exception occurred.
Example
from sqlalchemy import insert
query = insert(User).values(name='John Doe')
result = await db_ops.execute_one(query)
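The optional `values` dictionary can bind parameter values separately from the statement. A hedged sketch using SQLAlchemy's `bindparam` (`User` is a hypothetical model, and the helper is illustrative, not part of the library):

```python
import asyncio

async def rename_user(db_ops, old_name: str, new_name: str):
    # Build an UPDATE with named bind parameters, then supply their
    # values at execution time through the `values` argument.
    from sqlalchemy import update, bindparam

    query = (
        update(User)  # User: hypothetical model
        .where(User.name == bindparam("old"))
        .values(name=bindparam("new"))
    )
    result = await db_ops.execute_one(query, values={"old": old_name, "new": new_name})
    return result  # "complete" on success, or an error dictionary
```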
Source code in dsg_lib/async_database_functions/database_operations.py
get_columns_details(table) async¶
Retrieves the details of the columns of a given table.
This asynchronous method accepts a table object and returns a dictionary. Each key in the dictionary is a column name from the table, and the corresponding value is another dictionary containing details about that column, such as type, if it's nullable, if it's a primary key, if it's unique, its autoincrement status, and its default value.
Parameters: table (Table): An instance of the SQLAlchemy `Table` class whose column details are retrieved.

Returns: dict: A dictionary where each key is a column name, and each value is a dictionary with the column's details.

Raises: An exception if any error occurs during the database operation.
Example:
from sqlalchemy import Table, MetaData, Column, Integer, String

metadata = MetaData()
my_table = Table(
    'my_table',
    metadata,
    Column('id', Integer, primary_key=True),
    Column('name', String),
)
from dsg_lib.async_database_functions import (
async_database,
base_schema,
database_config,
database_operations,
)
# Create a DBConfig instance
config = {
# "database_uri": "postgresql+asyncpg://postgres:postgres@postgresdb/postgres",
"database_uri": "sqlite+aiosqlite:///:memory:?cache=shared",
"echo": False,
"future": True,
# "pool_pre_ping": True,
# "pool_size": 10,
# "max_overflow": 10,
"pool_recycle": 3600,
# "pool_timeout": 30,
}
# create database configuration
db_config = database_config.DBConfig(config)
# Create an AsyncDatabase instance
async_db = async_database.AsyncDatabase(db_config)
# Create a DatabaseOperations instance
db_ops = database_operations.DatabaseOperations(async_db)
# get columns details
columns = await db_ops.get_columns_details(my_table)
Source code in dsg_lib/async_database_functions/database_operations.py
get_primary_keys(table) async¶
Retrieves the primary keys of a given table.
This asynchronous method accepts a table object and returns a list containing the names of its primary keys. It is useful for understanding the structure of the table and for operations that require knowledge of the primary keys.
Parameters: table (Table): An instance of the SQLAlchemy `Table` class whose primary keys are retrieved.

Returns: list: A list containing the names of the primary keys of the table.

Raises: An exception if any error occurs during the database operation.
Example
from sqlalchemy import Table, MetaData, Column, Integer, String

metadata = MetaData()
my_table = Table(
    'my_table',
    metadata,
    Column('id', Integer, primary_key=True),
    Column('name', String, primary_key=True),
)
from dsg_lib.async_database_functions import (
async_database,
base_schema,
database_config,
database_operations,
)
# Create a DBConfig instance
config = {
# "database_uri": "postgresql+asyncpg://postgres:postgres@postgresdb/postgres",
"database_uri": "sqlite+aiosqlite:///:memory:?cache=shared",
"echo": False,
"future": True,
# "pool_pre_ping": True,
# "pool_size": 10,
# "max_overflow": 10,
"pool_recycle": 3600,
# "pool_timeout": 30,
}
# create database configuration
db_config = database_config.DBConfig(config)
# Create an AsyncDatabase instance
async_db = async_database.AsyncDatabase(db_config)
# Create a DatabaseOperations instance
db_ops = database_operations.DatabaseOperations(async_db)
# get primary keys
primary_keys = await db_ops.get_primary_keys(my_table)
Source code in dsg_lib/async_database_functions/database_operations.py
get_table_names() async¶
Retrieves the names of all tables in the database.
This asynchronous method returns a list containing the names of all tables in the database. It is useful for database introspection, allowing the user to know which tables are available in the current database context.
Returns: list: A list containing the names of all tables in the database.

Raises: An exception if any error occurs during the database operation.
Example
from dsg_lib.async_database_functions import (
async_database,
base_schema,
database_config,
database_operations,
)
# Create a DBConfig instance
config = {
# "database_uri": "postgresql+asyncpg://postgres:postgres@postgresdb/postgres",
"database_uri": "sqlite+aiosqlite:///:memory:?cache=shared",
"echo": False,
"future": True,
# "pool_pre_ping": True,
# "pool_size": 10,
# "max_overflow": 10,
"pool_recycle": 3600,
# "pool_timeout": 30,
}
# create database configuration
db_config = database_config.DBConfig(config)
# Create an AsyncDatabase instance
async_db = async_database.AsyncDatabase(db_config)
# Create a DatabaseOperations instance
db_ops = database_operations.DatabaseOperations(async_db)
# get table names
table_names = await db_ops.get_table_names()
Source code in dsg_lib/async_database_functions/database_operations.py
read_multi_query(queries) async¶
Executes multiple fetch queries on the database and returns a dictionary of results for each query.
This asynchronous method takes a dictionary where each key is a query
name and each value is a SQLAlchemy Select
query object. The method executes each
query and returns a dictionary where each key is the query name, and the
corresponding value is a list of records that match that query.
Parameters: queries (dict): A dictionary of SQLAlchemy `Select` query objects, keyed by query name.

Returns: dict: A dictionary where each key is a query name and each value is a list of records that match the query.

Raises: An exception if any error occurs during the execution of the queries.
Example
from sqlalchemy import select
from dsg_lib.async_database_functions import (
async_database,
base_schema,
database_config,
database_operations,
)
# Create a DBConfig instance
config = {
"database_uri": "sqlite+aiosqlite:///:memory:?cache=shared",
"echo": False,
"future": True,
"pool_recycle": 3600,
}
# create database configuration
db_config = database_config.DBConfig(config)
# Create an AsyncDatabase instance
async_db = async_database.AsyncDatabase(db_config)
# Create a DatabaseOperations instance
db_ops = database_operations.DatabaseOperations(async_db)
# read multi query
queries = {
"query1": select(User).where(User.age > 30),
"query2": select(User).where(User.age < 20),
}
results = await db_ops.read_multi_query(queries)
Source code in dsg_lib/async_database_functions/database_operations.py
read_one_record(query) async¶
Retrieves a single record from the database based on the provided query.
This asynchronous method accepts a SQL query object and returns the first record that matches the query. If no record matches the query, it returns None. This method is useful for fetching specific data when the expected result is a single record.
Parameters:
Name | Type | Description | Default |
---|---|---|---|
query | Select | An instance of the SQLAlchemy `Select` class, representing the query to execute. | required |
Returns:
Name | Type | Description |
---|---|---|
Result | Result | The first record that matches the query, or None if no record matches. |
Raises:
Type | Description |
---|---|
Exception | If any error occurs during the database operation. |
Example
from dsg_lib.async_database_functions import (
async_database,
base_schema,
database_config,
database_operations,
)
# Create a DBConfig instance
config = {
# "database_uri": "postgresql+asyncpg://postgres:postgres@postgresdb/postgres",
"database_uri": "sqlite+aiosqlite:///:memory:?cache=shared",
"echo": False,
"future": True,
# "pool_pre_ping": True,
# "pool_size": 10,
# "max_overflow": 10,
"pool_recycle": 3600,
# "pool_timeout": 30,
}
# create database configuration
db_config = database_config.DBConfig(config)
# Create an AsyncDatabase instance
async_db = async_database.AsyncDatabase(db_config)
# Create a DatabaseOperations instance
db_ops = database_operations.DatabaseOperations(async_db)
# read one record
record = await db_ops.read_one_record(select(User).where(User.name == 'John Doe'))
Source code in dsg_lib/async_database_functions/database_operations.py
read_query(query)
async
¶
Executes a fetch query on the database and returns a list of records that match the query.
This asynchronous method accepts a SQLAlchemy `Select` query object and returns a list of records that match the query.
Parameters:
Name | Type | Description | Default |
---|---|---|---|
query | Select | A SQLAlchemy `Select` query object representing the query to execute. | required |
Returns:
Name | Type | Description |
---|---|---|
list | list | A list of records that match the query. |
Raises:
Type | Description |
---|---|
Exception | If any error occurs during the execution of the query. |
Example
from dsg_lib.async_database_functions import (
async_database,
base_schema,
database_config,
database_operations,
)
# Create a DBConfig instance
config = {
"database_uri": "sqlite+aiosqlite:///:memory:?cache=shared",
"echo": False,
"future": True,
"pool_recycle": 3600,
}
# create database configuration
db_config = database_config.DBConfig(config)
# Create an AsyncDatabase instance
async_db = async_database.AsyncDatabase(db_config)
# Create a DatabaseOperations instance
db_ops = database_operations.DatabaseOperations(async_db)
# read query
records = await db_ops.read_query(select(User).where(User.age > 30))
Source code in dsg_lib/async_database_functions/database_operations.py
update_one(table, record_id, new_values)
async
¶
This method is deprecated. Use execute_one
with an UPDATE query instead.
Updates a single record in the database identified by its ID.
This asynchronous method takes a SQLAlchemy `Table` object, a record ID, and a dictionary of new values to update the record. It updates the specified record in the given table with the new values. The method does not allow updating certain fields, such as 'id' or 'date_created'.
Parameters:
Name | Type | Description | Default |
---|---|---|---|
table | Table | The SQLAlchemy `Table` object representing the table in the database. | required |
record_id | str | The ID of the record to be updated. | required |
new_values | dict | A dictionary containing the fields to update and their new values. | required |
Returns:
Name | Type | Description |
---|---|---|
Base | Base | The updated record if successful; otherwise, an error dictionary. |
Raises:
Type | Description |
---|---|
Exception | If any error occurs during the update operation. |
Example
from dsg_lib.async_database_functions import (
async_database,
base_schema,
database_config,
database_operations,
)
# Create a DBConfig instance
config = {
# "database_uri": "postgresql+asyncpg://postgres:postgres@postgresdb/postgres",
"database_uri": "sqlite+aiosqlite:///:memory:?cache=shared",
"echo": False,
"future": True,
# "pool_pre_ping": True,
# "pool_size": 10,
# "max_overflow": 10,
"pool_recycle": 3600,
# "pool_timeout": 30,
}
# create database configuration
db_config = database_config.DBConfig(config)
# Create an AsyncDatabase instance
async_db = async_database.AsyncDatabase(db_config)
# Create a DatabaseOperations instance
db_ops = database_operations.DatabaseOperations(async_db)
# update one record
record = await db_ops.update_one(User, 1, {'name': 'John Smith'})
Source code in dsg_lib/async_database_functions/database_operations.py
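Since `update_one` is deprecated in favor of `execute_one` with an UPDATE query, the replacement call can be sketched as below. The `users` table here is a stand-in for the ORM model, and the single-statement `execute_one(stmt)` call is an assumption based on the deprecation note above, not a verified signature:

```python
from sqlalchemy import Column, MetaData, String, Table, update

# Stand-in for the User model from the examples above
metadata = MetaData()
users = Table(
    "users",
    metadata,
    Column("pkid", String(36), primary_key=True),
    Column("name", String(50)),
)

# Build the UPDATE statement that replaces update_one(User, record_id, {...})
stmt = (
    update(users)
    .where(users.c.pkid == "6087cce8-0bdd-48c2-ba96-7d557dae843e")
    .values(name="John Smith")
)
print(stmt)

# With the db_ops instance from the example above, the call would then be:
#   record = await db_ops.execute_one(stmt)
```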
handle_exceptions(ex)
¶
Handles exceptions for database operations.
This function checks the type of the exception, logs an appropriate error message, and returns a dictionary containing the error details.
Parameters:
Name | Type | Description | Default |
---|---|---|---|
ex | Exception | The exception to handle. | required |
Returns:
Name | Type | Description |
---|---|---|
dict | dict | A dictionary containing the error details. The dictionary has two keys: 'error' and 'details'. |
Example:
from dsg_lib.async_database_functions import database_operations
try:
    # Some database operation that might raise an exception
    pass
except Exception as ex:
    error_details = database_operations.handle_exceptions(ex)
    print(error_details)
Source code in dsg_lib/async_database_functions/database_operations.py
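Because `handle_exceptions` converts failures into a dictionary with 'error' and 'details' keys, callers can branch on the shape of the result instead of wrapping every call in try/except. A minimal, library-free sketch of that pattern:

```python
def describe_result(result):
    """Return a short status string for a database operation result.

    Failed operations return a dict with 'error' and 'details' keys
    (per the handle_exceptions docstring); anything else is treated
    as a successful record.
    """
    if isinstance(result, dict) and "error" in result:
        return f"failed: {result['error']} ({result['details']})"
    return "ok"

# A failure shaped like handle_exceptions output
print(describe_result({"error": "IntegrityError", "details": "duplicate email"}))
# → failed: IntegrityError (duplicate email)

# A successful record (any non-error object)
print(describe_result(object()))
# → ok
```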
Recipes
Full Example of FastAPI with Async Database and Endpoints¶
You can find this in the examples folder of the repository.
Install dependencies¶
pip install devsetgo-lib[all] tqdm
Make App¶
Copy the FastAPI code below into a file after installing (assumed to be main.py).
# -*- coding: utf-8 -*-
"""
Author: Mike Ryan
Date: 2024/05/16
License: MIT
"""
import datetime
import secrets
import time
from contextlib import asynccontextmanager
from fastapi import Body, FastAPI, Query
from fastapi.responses import RedirectResponse
from loguru import logger
# import logging as logger  # alternative: standard logging
from pydantic import BaseModel, EmailStr
from sqlalchemy import Column, ForeignKey, Select, String
from sqlalchemy.orm import relationship
from tqdm import tqdm
from dsg_lib.async_database_functions import (
async_database,
base_schema,
database_config,
database_operations,
)
from dsg_lib.common_functions import logging_config
from dsg_lib.fastapi_functions import system_health_endpoints # , system_tools_endpoints
logging_config.config_log(
logging_level="INFO", log_serializer=False, log_name="log.log"
)
# Create a DBConfig instance
config = {
# "database_uri": "postgresql+asyncpg://postgres:postgres@postgresdb/postgres",
"database_uri": "sqlite+aiosqlite:///:memory:?cache=shared",
"echo": False,
"future": True,
# "pool_pre_ping": True,
# "pool_size": 10,
# "max_overflow": 10,
"pool_recycle": 3600,
# "pool_timeout": 30,
}
# create database configuration
db_config = database_config.DBConfig(config)
# Create an AsyncDatabase instance
async_db = async_database.AsyncDatabase(db_config)
# Create a DatabaseOperations instance
db_ops = database_operations.DatabaseOperations(async_db)
class User(base_schema.SchemaBaseSQLite, async_db.Base):
"""
User table storing user details like first name, last name, and email
"""
__tablename__ = "users"
__table_args__ = {
"comment": "User table storing user details like first name, last name, and email"
}
first_name = Column(String(50), unique=False, index=True) # First name of the user
last_name = Column(String(50), unique=False, index=True) # Last name of the user
email = Column(
String(200), unique=True, index=True, nullable=True
) # Email of the user, must be unique
addresses = relationship(
"Address", order_by="Address.pkid", back_populates="user"
) # Relationship to the Address class
class Address(base_schema.SchemaBaseSQLite, async_db.Base):
"""
Address table storing address details like street, city, and zip code
"""
__tablename__ = "addresses"
__table_args__ = {
"comment": "Address table storing address details like street, city, and zip code"
}
street = Column(String(200), unique=False, index=True) # Street of the address
city = Column(String(200), unique=False, index=True) # City of the address
zip = Column(String(50), unique=False, index=True) # Zip code of the address
user_id = Column(
String(36), ForeignKey("users.pkid")
) # Foreign key to the User table
user = relationship(
"User", back_populates="addresses"
) # Relationship to the User class
@asynccontextmanager
async def lifespan(app: FastAPI):
logger.info("starting up")
# Create the tables in the database
await async_db.create_tables()
create_users = True
if create_users:
await create_a_bunch_of_users(single_entry=2000, many_entries=20000)
yield
logger.info("shutting down")
await async_db.disconnect()
logger.info("database disconnected")
print("That's all folks!")
# Create an instance of the FastAPI class
app = FastAPI(
title="FastAPI Example", # The title of the API
description="This is an example of a FastAPI application using the DevSetGo Toolkit.", # A brief description of the API
version="0.1.0", # The version of the API
docs_url="/docs", # The URL where the API documentation will be served
redoc_url="/redoc", # The URL where the ReDoc documentation will be served
openapi_url="/openapi.json", # The URL where the OpenAPI schema will be served
debug=True, # Enable debug mode
middleware=[], # A list of middleware to include in the application
routes=[], # A list of routes to include in the application
lifespan=lifespan, # this is the replacement for the startup and shutdown events
)
@app.get("/")
async def root():
"""
Root endpoint of API
Returns:
Redirects to the OpenAPI document
"""
# redirect to openapi docs
logger.info("Redirecting to OpenAPI docs")
response = RedirectResponse(url="/docs")
return response
config_health = {
"enable_status_endpoint": True,
"enable_uptime_endpoint": True,
"enable_heapdump_endpoint": True,
}
app.include_router(
system_health_endpoints.create_health_router(config=config_health),
prefix="/api/health",
tags=["system-health"],
)
async def create_a_bunch_of_users(single_entry=0, many_entries=0):
logger.info(f"single_entry: {single_entry}")
await async_db.create_tables()
# Create a list to hold the user data
# Create a loop to generate user data
for _ in tqdm(range(single_entry), desc="executing one"):
value = secrets.token_hex(16)
user = User(
first_name=f"First{value}",
last_name=f"Last{value}",
email=f"user{value}@example.com",
)
logger.info(f"created_users: {user}")
await db_ops.create_one(user)
users = []
# Create a loop to generate user data
for i in tqdm(range(many_entries), desc="executing many"):
value_one = secrets.token_hex(4)
value_two = secrets.token_hex(8)
user = User(
first_name=f"First{value_one}{i}{value_two}",
last_name=f"Last{value_one}{i}{value_two}",
email=f"user{value_one}{i}{value_two}@example.com",
)
logger.info(f"created_users: {user.first_name}")
users.append(user)
# Use db_ops to add the users to the database
await db_ops.create_many(users)
@app.get("/database/get-primary-key", tags=["Database Examples"])
async def table_primary_key():
logger.info("Getting primary key of User table")
pk = await db_ops.get_primary_keys(User)
logger.info(f"Primary key of User table: {pk}")
return {"pk": pk}
@app.get("/database/get-column-details", tags=["Database Examples"])
async def table_column_details():
logger.info("Getting column details of User table")
columns = await db_ops.get_columns_details(User)
logger.info(f"Column details of User table: {columns}")
return {"columns": columns}
@app.get("/database/get-tables", tags=["Database Examples"])
async def table_table_details():
logger.info("Getting table names")
tables = await db_ops.get_table_names()
logger.info(f"Table names: {tables}")
return {"table_names": tables}
@app.get("/database/get-count", tags=["Database Examples"])
async def get_count():
logger.info("Getting count of users")
count = await db_ops.count_query(Select(User))
logger.info(f"Count of users: {count}")
return {"count": count}
@app.get("/database/get-all", tags=["Database Examples"])
async def get_all(offset: int = 0, limit: int = Query(100, le=100000, ge=1)):
logger.info(f"Getting all users with offset {offset} and limit {limit}")
records = await db_ops.read_query(Select(User).offset(offset).limit(limit))
logger.info(f"Retrieved {len(records)} users")
return {"records": records}
@app.get("/database/get-one-record", tags=["Database Examples"])
async def read_one_record(record_id: str):
logger.info(f"Reading one record with id {record_id}")
record = await db_ops.read_one_record(Select(User).where(User.pkid == record_id))
logger.info(f"Record with id {record_id}: {record}")
return record
class UserBase(BaseModel):
first_name: str
last_name: str
email: EmailStr
class UserCreate(UserBase):
pass
@app.post("/database/create-one-record", status_code=201, tags=["Database Examples"])
async def create_one_record(new_user: UserCreate):
logger.info(f"Creating one record: {new_user}")
user = User(**new_user.dict())
record = await db_ops.create_one(user)
logger.info(f"Created record: {record}")
return record
@app.post("/database/create-many-records", status_code=201, tags=["Database Examples"])
async def create_many_records(number_of_users: int = Query(100, le=1000, ge=1)):
logger.info(f"Creating {number_of_users} records")
t0 = time.time()
users = []
# Create a loop to generate user data
for i in tqdm(range(number_of_users), desc="executing many"):
value_one = secrets.token_hex(4)
value_two = secrets.token_hex(8)
user = User(
first_name=f"First{value_one}{i}{value_two}",
last_name=f"Last{value_one}{i}{value_two}",
email=f"user{value_one}{i}{value_two}@example.com",
)
logger.info(f"Created user: {user.first_name}")
users.append(user)
# Use db_ops to add the users to the database
await db_ops.create_many(users)
t1 = time.time()
process_time = format(t1 - t0, ".4f")
logger.info(f"Created {number_of_users} records in {process_time} seconds")
return {"number_of_users": number_of_users, "process_time": process_time}
@app.put("/database/update-one-record", status_code=200, tags=["Database Examples"])
async def update_one_record(
id: str = Body(
...,
description="UUID to update",
examples=["6087cce8-0bdd-48c2-ba96-7d557dae843e"],
),
first_name: str = Body(..., examples=["Agent"]),
last_name: str = Body(..., examples=["Smith"]),
email: str = Body(..., examples=["jim@something.com"]),
):
logger.info(f"Updating one record with id {id}")
# add date_updated to new_values since SQLite does not set it automatically,
# and other databases may not either.
new_values = {
"first_name": first_name,
"last_name": last_name,
"email": email,
"date_updated": datetime.datetime.now(datetime.timezone.utc),
}
record = await db_ops.update_one(table=User, record_id=id, new_values=new_values)
logger.info(f"Updated record with id {id}")
return record
@app.delete("/database/delete-one-record", status_code=200, tags=["Database Examples"])
async def delete_one_record(record_id: str = Body(...)):
logger.info(f"Deleting one record with id {record_id}")
record = await db_ops.delete_one(table=User, record_id=record_id)
logger.info(f"Deleted record with id {record_id}")
return record
@app.delete(
"/database/delete-many-records-aka-this-is-a-bad-idea",
status_code=201,
tags=["Database Examples"],
)
async def delete_many_records(
id_values: list = Body(...), id_column_name: str = "pkid"
):
logger.info(f"Deleting many records with ids {id_values}")
record = await db_ops.delete_many(
table=User, id_column_name=id_column_name, id_values=id_values
)
logger.info(f"Deleted records with ids {id_values}")
return record
@app.get(
"/database/get-list-of-records-to-paste-into-delete-many-records",
tags=["Database Examples"],
)
async def read_list_of_records(
offset: int = Query(0, le=1000, ge=0), limit: int = Query(100, le=10000, ge=1)
):
logger.info(f"Reading list of records with offset {offset} and limit {limit}")
records = await db_ops.read_query(Select(User).offset(offset).limit(limit))
records_list = []
for record in records:
records_list.append(record.pkid)
logger.info(f"Read list of records: {records_list}")
return records_list
if __name__ == "__main__":
import uvicorn
uvicorn.run(app, host="127.0.0.1", port=5000)
Run Code¶
In the console (Linux), run the command below, then open a browser to http://127.0.0.1:5000 to see the app.
python3 main.py
Examples¶
Here are a few examples of how to use the database functions.
Asyncio Script Example¶
Example of how to use in a script
import asyncio
from sqlalchemy import Column, String, select
from dsg_lib.async_database_functions import database_config, async_database, database_operations
# Configuration
config = {
"database_uri": "sqlite+aiosqlite:///:memory:?cache=shared",
"echo": False,
"future": True,
"pool_recycle": 3600,
}
# Create a DBConfig instance
db_config = database_config.DBConfig(config)
# Create an AsyncDatabase instance
async_db = async_database.AsyncDatabase(db_config)
# Create a DatabaseOperations instance
db_ops = database_operations.DatabaseOperations(async_db)
# User class
class User(async_db.Base):
__tablename__ = "users"
pkid = Column(String, primary_key=True)  # SQLAlchemy models require a primary key
first_name = Column(String, unique=False, index=True)
last_name = Column(String, unique=False, index=True)
email = Column(String, unique=True, index=True, nullable=True)
# Async function to get all users
async def get_all_users():
# Create a select query
query = select(User)
# Execute the query and fetch all results
users = await db_ops.read_query(query)
# Print the users
for user in users:
print(f"User: {user.first_name} {user.last_name}, Email: {user.email}")
# Run the async function
asyncio.run(get_all_users())
FastAPI Example¶
# -*- coding: utf-8 -*-
from contextlib import asynccontextmanager
from fastapi import FastAPI
from fastapi.responses import RedirectResponse
from loguru import logger
from tqdm import tqdm
from dsg_lib.common_functions import logging_config
logging_config.config_log(
logging_level="DEBUG", log_serializer=False, log_name="log.log"
)
@asynccontextmanager
async def lifespan(app: FastAPI):
logger.info("starting up")
# Create the tables in the database
await async_db.create_tables()
create_users = True
if create_users:
await create_a_bunch_of_users(single_entry=23, many_entries=100)
yield
logger.info("shutting down")
# Create an instance of the FastAPI class
app = FastAPI(
title="FastAPI Example", # The title of the API
description="This is an example of a FastAPI application using the DevSetGo Toolkit.", # A brief description of the API
version="0.1.0", # The version of the API
docs_url="/docs", # The URL where the API documentation will be served
redoc_url="/redoc", # The URL where the ReDoc documentation will be served
openapi_url="/openapi.json", # The URL where the OpenAPI schema will be served
debug=True, # Enable debug mode
middleware=[], # A list of middleware to include in the application
routes=[], # A list of routes to include in the application
lifespan=lifespan,
)
@app.get("/")
async def root():
"""
Root endpoint of API
Returns:
Redirects to the OpenAPI document
"""
# redirect to openapi docs
logger.info("Redirecting to OpenAPI docs")
response = RedirectResponse(url="/docs")
return response
import secrets
from sqlalchemy import Column, Delete, Select, String, Update
from dsg_lib.async_database_functions import (
async_database,
base_schema,
database_config,
database_operations,
)
# Create a DBConfig instance
config = {
# "database_uri": "postgresql+asyncpg://postgres:postgres@postgresdb/postgres",
"database_uri": "sqlite+aiosqlite:///:memory:?cache=shared",
"echo": False,
"future": True,
# "pool_pre_ping": True,
# "pool_size": 10,
# "max_overflow": 10,
"pool_recycle": 3600,
# "pool_timeout": 30,
}
db_config = database_config.DBConfig(config)
# Create an AsyncDatabase instance
async_db = async_database.AsyncDatabase(db_config)
# Create a DatabaseOperations instance
db_ops = database_operations.DatabaseOperations(async_db)
# User class inherits from SchemaBase and async_db.Base
# This class represents the User table in the database
class User(base_schema.SchemaBase, async_db.Base):
__tablename__ = "users" # Name of the table in the database
# Define the columns of the table
first_name = Column(String, unique=False, index=True) # First name of the user
last_name = Column(String, unique=False, index=True) # Last name of the user
email = Column(
String, unique=True, index=True, nullable=True
) # Email of the user, must be unique
async def create_a_bunch_of_users(single_entry=0, many_entries=0):
logger.info(f"single_entry: {single_entry}")
await async_db.create_tables()
# Create a list to hold the user data
# Create a loop to generate user data
for i in tqdm(range(single_entry), desc="executing one"):
value = secrets.token_hex(16)
user = User(
first_name=f"First{value}",
last_name=f"Last{value}",
email=f"user{value}@example.com",
)
logger.info(f"created_users: {user}")
await db_ops.create_one(user)
users = []
# Create a loop to generate user data
for i in tqdm(range(many_entries), desc="executing many"):
value_one = secrets.token_hex(4)
value_two = secrets.token_hex(8)
user = User(
first_name=f"First{value_one}{i}{value_two}",
last_name=f"Last{value_one}{i}{value_two}",
email=f"user{value_one}{i}{value_two}@example.com",
)
logger.info(f"created_users: {user.first_name}")
users.append(user)
# Use db_ops to add the users to the database
await db_ops.create_many(users)
@app.get("/database/get-count")
async def get_count():
count = await db_ops.count_query(Select(User))
return {"count": count}
# endpoint to get list of user
@app.get("/database/get-all")
async def get_all(offset: int = 0, limit: int = 100):
records = await db_ops.read_query(Select(User).offset(offset).limit(limit))
return {"records": records}
@app.get("/database/get-primary-key")
async def table_primary_key():
pk = await db_ops.get_primary_keys(User)
return {"pk": pk}
@app.get("/database/get-column-details")
async def table_column_details():
columns = await db_ops.get_columns_details(User)
return {"columns": columns}
@app.get("/database/get-tables")
async def table_table_details():
tables = await db_ops.get_table_names()
return {"table_names": tables}
@app.get("/database/get-one-record")
async def get_one_record(record_id: str):
record = await db_ops.read_one_record(Select(User).where(User.pkid == record_id))
return {"record": record}
if __name__ == "__main__":
import uvicorn
uvicorn.run(app, host="127.0.0.1", port=5000)
Configuration Examples¶
# SQLite in memory database
config = {
"database_uri": "sqlite+aiosqlite:///:memory:?cache=shared",
"echo": False,
"future": True,
"pool_recycle": 3600,
}
# PostgreSQL database
config = {
"database_uri": "postgresql+asyncpg://postgres:postgres@postgresdb/postgres",
"echo": False,
"future": True,
"pool_recycle": 3600,
}
# MySQL database
config = {
"database_uri": "mysql+aiomysql://root:root@localhost/test",
"echo": False,
"future": True,
"pool_recycle": 3600,
}
# SQL Server database
config = {
"database_uri": "mssql+aiomssql://sa:yourStrong(!)Password@localhost:1433/master",
"echo": False,
"future": True,
"pool_recycle": 3600,
}
# Oracle database
config = {
"database_uri": "oracle+oracledb_async://scott:tiger@localhost/?service_name=XEPDB1",
"echo": False,
"future": True,
"pool_recycle": 3600,
}
Logging Example¶
# -*- coding: utf-8 -*-
"""
Author: Mike Ryan
Date: 2024/05/16
License: MIT
"""
import logging
import multiprocessing
import secrets
import threading
from loguru import logger
# import logging as logger  # alternative: standard logging
from tqdm import tqdm
from dsg_lib.common_functions import logging_config
# Configure logging as before
logging_config.config_log(
logging_directory='log',
log_name='log',
logging_level='DEBUG',
log_rotation='100 MB',
log_retention='10 days',
log_backtrace=True,
log_serializer=True,
log_diagnose=True,
# app_name='my_app',
# append_app_name=True,
intercept_standard_logging=True,
enqueue=True,
)
# @logger.catch
def div_zero(x, y):
try:
return x / y
except ZeroDivisionError as e:
logger.error(f'{e}')
logging.error(f'{e}')
# @logger.catch
def div_zero_two(x, y):
try:
return x / y
except ZeroDivisionError as e:
logger.error(f'{e}')
logging.error(f'{e}')
def log_big_string(lqty=100, size=256):
big_string = secrets.token_urlsafe(size)
for _ in range(lqty):
logging.debug(f"Let's make this a big message {big_string}")
div_zero(x=1, y=0)
div_zero_two(x=1, y=0)
# after configuring logging
# use loguru to log messages
logger.debug('This is a loguru debug message')
logger.info('This is a loguru info message')
logger.error('This is a loguru error message')
logger.warning('This is a loguru warning message')
logger.critical('This is a loguru critical message')
# will intercept all standard logging messages also
logging.debug('This is a standard logging debug message')
logging.info('This is a standard logging info message')
logging.error('This is a standard logging error message')
logging.warning('This is a standard logging warning message')
logging.critical('This is a standard logging critical message')
def worker(wqty=1000, lqty=100, size=256):
for _ in tqdm(range(wqty), ascii=True, leave=True): # Adjusted for demonstration
log_big_string(lqty=lqty, size=size)
def main(wqty: int = 100, lqty: int = 10, size: int = 256, workers: int = 16, thread_test: bool = False, process_test: bool = False):
if process_test:
processes = []
# Create worker processes
for _ in tqdm(range(workers), desc="Multi-Processing Start", leave=True):
p = multiprocessing.Process(
target=worker, args=(wqty, lqty, size,))
processes.append(p)
p.start()
for p in tqdm(processes, desc="Multi-Processing Gather", leave=False):
p.join(timeout=60) # Timeout after 60 seconds
if p.is_alive():
logger.error(f"Process {p.name} is hanging. Terminating.")
p.terminate()
p.join()
if thread_test:
threads = []
for _ in tqdm(range(workers), desc="Threading Start", leave=True): # Create worker threads
t = threading.Thread(target=worker, args=(wqty, lqty, size,))
threads.append(t)
t.start()
for t in tqdm(threads, desc="Threading Gather", leave=False):
t.join()
if __name__ == "__main__":
from time import time
start = time()
main(wqty=5, lqty=50, size=64, workers=8, thread_test=False, process_test=True)
print(f"Execution time: {time()-start:.2f} seconds")
Patterns¶
# -*- coding: utf-8 -*-
"""
Author: Mike Ryan
Date: 2024/05/16
License: MIT
"""
import pprint
from random import randint
from dsg_lib.common_functions.patterns import pattern_between_two_char
ASCII_LIST = [
" ",
"!",
'""',
"#",
"$",
"%",
"&",
"'",
"(",
")",
"*",
"+",
",",
"-",
".",
"/",
"0",
"1",
"2",
"3",
"4",
"5",
"6",
"7",
"8",
"9",
":",
";",
"<",
"=",
">",
"?",
"@",
"A",
"B",
"C",
"D",
"E",
"F",
"G",
"H",
"I",
"J",
"K",
"L",
"M",
"N",
"O",
"P",
"Q",
"R",
"S",
"T",
"U",
"V",
"W",
"X",
"Y",
"Z",
"[",
"\\",
"]",
"^",
"_",
"`",
"a",
"b",
"c",
"d",
"e",
"f",
"g",
"h",
"i",
"j",
"k",
"l",
"m",
"n",
"o",
"p",
"q",
"r",
"s",
"t",
"u",
"v",
"w",
"x",
"y",
"z",
"{",
"|",
"}",
"~",
"€",
"‚",
"ƒ",
"„",
"…",
"†",
"‡",
"ˆ",
"‰",
"Š",
"‹",
"Œ",
"Ž",
"‘",
"’",
"“",
"”",
"•",
"–",
"—",
"˜",
"™",
"š",
"›",
"œ",
"ž",
"Ÿ",
"¡",
"¢",
"£",
"¤",
"¥",
"¦",
"§",
"¨",
"©",
"ª",
"«",
"¬",
"®",
"¯",
"°",
"±",
"²",
"³",
"´",
"µ",
"¶",
"·",
"¸",
"¹",
"º",
"»",
"¼",
"½",
"¾",
"¿",
"À",
"Á",
"Â",
"Ã",
"Ä",
"Å",
"Æ",
"Ç",
"È",
"É",
"Ê",
"Ë",
"Ì",
"Í",
"Î",
"Ï",
"Ð",
"Ñ",
"Ò",
"Ó",
"Ô",
"Õ",
"Ö",
"×",
"Ø",
"Ù",
"Ú",
"Û",
"Ü",
"Ý",
"Þ",
"ß",
"à",
"á",
"â",
"ã",
"ä",
"å",
"æ",
"ç",
"è",
"é",
"ê",
"ë",
"ì",
"í",
"î",
"ï",
"ð",
"ñ",
"ò",
"ó",
"ô",
"õ",
"ö",
"÷",
"ø",
"ù",
"ú",
"û",
"ü",
"ý",
"þ",
"ÿ",
]
pp = pprint.PrettyPrinter(indent=4)
def pattern_find(left_char: str, right_char: str, text_block: str):
data = pattern_between_two_char(text_block, left_char, right_char)
pp.pprint(data)
def run_examples():
text_block = "Lfound oneR Lfound twoR"
left_char = "L"
right_char = "R"
pattern_find(left_char=left_char, right_char=right_char, text_block=text_block)
for _ in range(100):
long_input = "xyz" * randint(100, 100000)
long_text = f"{long_input}abc<one>123<two>456<three>{long_input}"
result = pattern_between_two_char(
text_string=long_text, left_characters="<", right_characters=">"
)
print(result["found"])
if __name__ == "__main__":
run_examples()
Validating Email Addresses¶
Example of how to use in a script
# -*- coding: utf-8 -*-
"""
This module is used to validate a list of email addresses using various configurations.
The module imports the `validate_email_address` function from the `dsg_lib.common_functions.email_validation`
module and uses it to validate a list of email addresses. The email addresses and configurations are hard-coded
into the module.
The module measures the time taken to validate all the email addresses with all the configurations and prints
the results in a sorted order.
The module can be run as a standalone script.
Example:
$ python validate_emails.py
Attributes:
email_addresses (list of str): A list of email addresses to validate.
configurations (list of dict): A list of configurations to use for validation. Each configuration is a
dictionary with the following keys:
- check_deliverability (bool): Whether to check if the email address is deliverable.
- test_environment (bool): Whether the function is being run in a test environment.
- allow_smtputf8 (bool): Whether to allow non-ASCII characters in the email address.
- allow_empty_local (bool): Whether to allow email addresses with an empty local part.
- allow_quoted_local (bool): Whether to allow email addresses with a quoted local part.
- allow_display_name (bool): Whether to allow email addresses with a display name.
- allow_domain_literal (bool): Whether to allow email addresses with a domain literal.
- globally_deliverable (bool): Whether the email address should be globally deliverable.
- timeout (int): The timeout for the validation in seconds.
- dns_type (str): The type of DNS to use for the validation. Can be 'dns' or 'timeout'.
Functions:
validate_email_address(email: str, **kwargs: dict) -> dict: Validates an email address using the provided
configuration and returns a dictionary with the results.
Author: Mike Ryan
Date: 2024/05/16
License: MIT
"""
```python
import pprint
import time

from dsg_lib.common_functions.email_validation import validate_email_address

if __name__ == "__main__":
    # create a list of email addresses to check if valid
    email_addresses = [
        "bob@devsetgo.com",
        "bob@devset.go",
        "foo@yahoo.com",
        "bob@gmail.com",
        "very fake@devsetgo.com",
        "jane.doe@example.com",
        "john_doe@example.co.uk",
        "user.name+tag+sorting@example.com",
        "x@example.com",  # shortest possible email address
        "example-indeed@strange-example.com",
        "admin@mailserver1",  # local domain name with no TLD
        "example@s.example",  # see the list of Internet top-level domains
        '" "@example.org',  # space between the quotes
        '"john..doe"@example.org',  # quoted double dot
        "mailhost!username@example.org",  # bangified host route used for uucp mailers
        "user%example.com@example.org",  # percent sign in local part
        "user-@example.org",  # valid due to the last character being an allowed character
        # Invalid email addresses
        "Abc.example.com",  # no @ character
        "A@b@c@example.com",  # only one @ is allowed outside quotation marks
        'a"b(c)d,e:f;g<h>i[j\\k]l@example.com',  # none of the special characters in this local part are allowed outside quotation marks
        'just"not"right@example.com',  # quoted strings must be dot separated or the only element making up the local-part
        'this is"not\\allowed@example.com',  # spaces, quotes, and backslashes may only exist when within quoted strings and preceded by a backslash
        'this\\ still\\"not\\\\allowed@example.com',  # even if escaped (preceded by a backslash), spaces, quotes, and backslashes must still be contained by quotes
        "1234567890123456789012345678901234567890123456789012345678901234+x@example.com",  # local part is longer than 64 characters
        # Emails with empty local part
        "@example.com",  # only valid if allow_empty_local is True
        # Emails with non-ASCII characters
        "üñîçøðé@example.com",  # only valid if allow_smtputf8 is True
        "user@üñîçøðé.com",  # only valid if allow_smtputf8 is True
        # Emails with quoted local part
        '"john.doe"@example.com',  # only valid if allow_quoted_local is True
        '"john..doe"@example.com',  # only valid if allow_quoted_local is True
        # Emails with display name
        "John Doe <john@example.com>",  # only valid if allow_display_name is True
        # Emails with domain literal
        "user@[192.0.2.1]",  # only valid if allow_domain_literal is True
        # Emails with long local part
        "a" * 65 + "@example.com",  # local part is longer than 64 characters
        # Emails with invalid characters
        "john doe@example.com",  # space is not allowed
        "john@doe@example.com",  # only one @ is allowed
        "john.doe@.com",  # domain can't start with a dot
        "john.doe@example..com",  # domain can't have two consecutive dots
        "test@google.com",
    ]

    # create a list of configurations
    configurations = [
        {
            "check_deliverability": True,
            "test_environment": False,
            "allow_smtputf8": False,
            "allow_empty_local": False,
            "allow_quoted_local": False,
            "allow_display_name": False,
            "allow_domain_literal": False,
            "globally_deliverable": None,
            "timeout": 10,
            "dns_type": "timeout",
        },
        {
            "check_deliverability": False,
            "test_environment": True,
            "allow_smtputf8": True,
            "allow_empty_local": True,
            "allow_quoted_local": True,
            "allow_display_name": True,
            "allow_domain_literal": True,
            "globally_deliverable": None,
            "timeout": 5,
            "dns_type": "dns",
        },
        {"check_deliverability": True},
        {
            "check_deliverability": False,
            "test_environment": False,
            "allow_smtputf8": True,
            "allow_empty_local": False,
            "allow_quoted_local": True,
            "allow_display_name": False,
            "allow_domain_literal": True,
            "globally_deliverable": None,
            "timeout": 15,
            "dns_type": "timeout",
        },
        {
            "check_deliverability": True,
            "test_environment": True,
            "allow_smtputf8": False,
            "allow_empty_local": True,
            "allow_quoted_local": False,
            "allow_display_name": True,
            "allow_domain_literal": False,
            "globally_deliverable": None,
            "timeout": 20,
            "dns_type": "dns",
        },
        {
            "check_deliverability": False,
            "test_environment": False,
            "allow_smtputf8": True,
            "allow_empty_local": True,
            "allow_quoted_local": True,
            "allow_display_name": True,
            "allow_domain_literal": True,
            "globally_deliverable": None,
            "timeout": 25,
            "dns_type": "timeout",
        },
        {
            "check_deliverability": True,
            "test_environment": True,
            "allow_smtputf8": False,
            "allow_empty_local": False,
            "allow_quoted_local": False,
            "allow_display_name": False,
            "allow_domain_literal": False,
            "globally_deliverable": None,
            "timeout": 30,
            "dns_type": "dns",
        },
        {
            "check_deliverability": False,
            "test_environment": True,
            "allow_smtputf8": True,
            "allow_empty_local": False,
            "allow_quoted_local": True,
            "allow_display_name": True,
            "allow_domain_literal": False,
            "globally_deliverable": None,
            "timeout": 35,
            "dns_type": "timeout",
        },
        {
            "check_deliverability": True,
            "test_environment": False,
            "allow_smtputf8": False,
            "allow_empty_local": True,
            "allow_quoted_local": True,
            "allow_display_name": False,
            "allow_domain_literal": True,
            "globally_deliverable": None,
            "timeout": 40,
            "dns_type": "dns",
        },
        {
            "check_deliverability": False,
            "test_environment": True,
            "allow_smtputf8": True,
            "allow_empty_local": False,
            "allow_quoted_local": False,
            "allow_display_name": True,
            "allow_domain_literal": True,
            "globally_deliverable": None,
            "timeout": 45,
            "dns_type": "timeout",
        },
    ]

    t0 = time.time()
    validity = []

    # validate every address against every configuration
    for email in email_addresses:
        for config in configurations:
            res = validate_email_address(email, **config)
            validity.append(res)
    t1 = time.time()

    validity = sorted(validity, key=lambda x: x["email"])
    for v in validity:
        pprint.pprint(v, indent=4)

    print(f"Time taken: {t1 - t0:.2f}")
```
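For readers who want a feel for why so many of the addresses above are tricky, here is a deliberately rough, stdlib-only syntax pre-check. The helper `looks_like_email` is hypothetical and not part of dsg_lib; it is nowhere near a substitute for `validate_email_address`, which performs full syntax and (optionally) deliverability checks, but it illustrates the easy rejections.

```python
import re

# Hypothetical helper, NOT part of dsg_lib: a rough syntax pre-check only.
# It rejects obvious problems (missing "@", whitespace, a domain that starts
# or ends with a dot) but still accepts many addresses a real validator
# would refuse, e.g. consecutive dots inside the domain.
EMAIL_RE = re.compile(r"^[^@\s]+@[^@\s.][^@\s]*\.[^@\s.]+$")


def looks_like_email(address: str) -> bool:
    return EMAIL_RE.fullmatch(address) is not None


print(looks_like_email("bob@devsetgo.com"))      # True  (plausible address)
print(looks_like_email("Abc.example.com"))       # False (no @ character)
print(looks_like_email("john doe@example.com"))  # False (space not allowed)
print(looks_like_email("john.doe@.com"))         # False (domain starts with a dot)
```

A pre-check like this can cheaply filter obviously malformed input before the (comparatively expensive) DNS-backed deliverability check runs.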
About¶
DevSetGo Library is a collection of tools and utilities to help developers with their projects. It was developed primarily to support the DevSetGo.com website, along with other projects in my personal and professional life. It is a work in progress and will be updated as needed.
The driving force behind the library is to limit the amount of boilerplate code that I (Mike Ryan) have to write for each project. Copying code from one project to another means an issue may get fixed in one project but not in the others. This library is an attempt to solve that problem and make it easier to maintain code across multiple projects.
The library is written in Python and is available on PyPI. It is open source and available on GitHub. Feel free to use it in your projects and to contribute to the library.
About Me¶
I am a software engineering manager with an eclectic background in various industries (finance, manufacturing, and metrology). I am passionate about software development and love to learn new things.
Contributing¶
Please feel free to contribute to this project. Adding common functions is the intent, and contributions that add a new function or improve an existing one are greatly appreciated.
Ways to Contribute!¶
- Add or improve a function
- Add or improve documentation
- Add or improve tests
- Report or fix a bug
Changelog¶
All notable changes to this project will be documented in this file.
The format is based on Keep a Changelog
Latest Changes¶
Adding new db functions (v2024.11.28.1)¶
What's Changed¶
- Adding new general execute queries and adding deprecation (#459) @devsetgo
- pip(deps): bump tox from 4.23.0 to 4.23.2 (#455) @dependabot
- pip(deps): bump fastapi[all] from 0.115.2 to 0.115.4 (#454) @dependabot
- pip(deps): bump tqdm from 4.66.5 to 4.66.6 (#456) @dependabot
- pip(deps): bump pymdown-extensions from 10.11.2 to 10.12 (#457) @dependabot
- pip(deps): bump ruff from 0.7.0 to 0.7.1 (#458) @dependabot
Published Date: 2024 November 28, 22:01
Moving to Calendar Versioning (2024.10.20.1)¶
What's Changed¶
- moving to calendar versioning (#453) @devsetgo
- pip(deps): bump tox from 4.21.0 to 4.23.0 (#452) @dependabot
- pip(deps): bump fastapi[all] from 0.114.2 to 0.115.0 (#451) @dependabot
- pip(deps): bump tox from 4.18.1 to 4.21.0 (#450) @dependabot
- pip(deps): bump watchdog from 5.0.2 to 5.0.3 (#449) @dependabot
- pip(deps): bump pylint from 3.2.7 to 3.3.1 (#448) @dependabot
- pip(deps): bump ruff from 0.6.5 to 0.6.8 (#447) @dependabot
Published Date: 2024 October 20, 16:30
Complete Replacement of CX-Oracle for OracleDB (v0.14.4)¶
What's Changed¶
- Remove CX-Oracle for OracleDB cleanup (#446) @devsetgo
- pip(deps): bump pylint from 3.2.6 to 3.2.7 (#442) @dependabot
- pip(deps): bump mkdocs-material from 9.5.33 to 9.5.34 (#443) @dependabot
- github actionts(deps): bump actions/checkout from 2 to 4 (#444) @dependabot
- github actionts(deps): bump actions/setup-python from 2 to 5 (#445) @dependabot
Published Date: 2024 September 15, 15:28
Standard Logging Suppression by Default (v0.14.3)¶
What's Changed¶
- Limit Standard Logging being Displayed (#441) @devsetgo
Published Date: 2024 August 31, 17:33
Improvements and fixes (v0.14.2)¶
What's Changed¶
- Improvements and fixes (#440) @devsetgo
Breaking changes¶
- save_text function no longer adds .txt by default.
- Change from cx-oracle to oracledb
- Improvements to documentation
Published Date: 2024 August 31, 00:02
Adding DB Disconnect (v0.14.1)¶
What's Changed¶
- Adding Database Disconnect (#439) @devsetgo
- pip(deps): bump pre-commit from 3.7.1 to 3.8.0 (#434) @dependabot
- updates to deal with stashing pages (#437) @devsetgo
- working on issue for deployment (#436) @devsetgo
- Adding MKDocs Workflow (#435) @devsetgo
- Version 0.14.0 (#433) @devsetgo
Published Date: 2024 August 25, 18:47
Fix of version for Pypi (v0.14.0-a)¶
What's Changed¶
- Version 0.14.0 (#433) @devsetgo
Published Date: 2024 July 27, 22:40
High Speed Multi-Processing Improvements (v0.14.0)¶
What's Changed¶
- High Speed Logging for Loguru Multi-Processing (#432) @devsetgo
- Resilient Sink Fixes (#431) @devsetgo
- Fix of bug in resilient sink (#430) @devsetgo
- Adding Resiliency to Logging Config (#429) @devsetgo
- pip(deps): bump mkdocs-print-site-plugin from 2.4.1 to 2.5.0 (#422) @dependabot
- pip(deps): bump ruff from 0.4.5 to 0.4.7 (#420) @dependabot
- pip(deps): bump autopep8 from 2.1.1 to 2.2.0 (#421) @dependabot
- pip(deps): bump mkdocs-material from 9.5.24 to 9.5.25 (#423) @dependabot
Published Date: 2024 July 27, 22:28
(v0.13.0-republish)¶
What's Changed¶
Republishing v0.13.0 for PyPI.
Published Date: 2024 May 26, 17:13
(v0.13.0)¶
What's Changed¶
- Breaking Change: Removing Limit and Offset from read queries (#419) @devsetgo
Published Date: 2024 May 26, 15:44
Adding missing requirement (v0.12.4)¶
What's Changed¶
- adding missing requirement (#417) @devsetgo
Published Date: 2024 May 16, 14:40
Adding Email Validation (v0.12.3)¶
What's Changed¶
- bump to 0.12.3 (#416) @devsetgo
- Add email validation capabilities (#415) @devsetgo
- pip(deps): bump mkdocs-material from 9.5.20 to 9.5.21 (#414) @dependabot
- pip(deps): bump ruff from 0.4.2 to 0.4.4 (#413) @dependabot
- pip(deps): bump coverage-badge from 1.1.0 to 1.1.1 (#409) @dependabot
- pip(deps): bump mkdocs-material from 9.5.18 to 9.5.20 (#408) @dependabot
- pip(deps): bump mkdocstrings[python,shell] from 0.24.3 to 0.25.0 (#407) @dependabot
- pip(deps): bump ruff from 0.4.1 to 0.4.2 (#410) @dependabot
- pip(deps): bump tqdm from 4.66.2 to 4.66.3 (#412) @dependabot
Published Date: 2024 May 16, 14:19
logging changes (v0.12.2)¶
What's Changed¶
- Logging Changes, Documentation Updates, Using Ruff (#406) @devsetgo
Published Date: 2024 April 22, 16:17
Updates for MetaData and All HTTP Codes (v0.12.1)¶
What's Changed¶
- Enhancements and fixes (#405) @devsetgo
Published Date: 2024 April 19, 18:52
Breaking Change: Base Schema per Database Type (v0.12.0)¶
What's Changed¶
- Adding new base schema for database types (#402) @devsetgo
- creating main release for 0.11.2 (#390) @devsetgo
- Working on bug in read_query (#389) @devsetgo
- Reorganizing Library Structure (#388) @devsetgo
- Python Build and Publish fix (#382) @devsetgo
Breaking Changes¶
SchemaBase is now SchemaBaseSQLite
Published Date: 2024 April 13, 22:55
Reorganizing Library Structure (v0.11.2-main)¶
What's Changed¶
- creating main release for 0.11.2 (#390) @devsetgo
- Working on bug in read_query (#389) @devsetgo
- Reorganizing Library Structure (#388) @devsetgo
- Python Build and Publish fix (#382) @devsetgo
This release includes breaking changes to import paths. See the documentation for more information.
Published Date: 2024 February 17, 19:09
Read Query Fix Beta Testing (v0.11.2-beta1)¶
What's Changed¶
- Working on bug in read_query (#389) @devsetgo
- Reorganizing Library Structure (#388) @devsetgo
- Python Build and Publish fix (#382) @devsetgo
Published Date: 2024 February 16, 22:01
Pre-Release to test new structure and publishing (v0.11.2-beta)¶
What's Changed¶
- Reorganizing Library Structure (#388) @devsetgo
- Python Build and Publish fix (#382) @devsetgo
Published Date: 2024 February 10, 21:16
(v0.11.2-fix2)¶
What's Changed¶
- Python Build and Publish fix (#382) @devsetgo
Published Date: 2024 January 21, 15:01
Adding Delete Many and minor fixes (v0.11.2)¶
What's Changed¶
- Adding Delete Many and Other Updates (#381) @devsetgo
- pip(deps): bump mkdocs-material from 9.5.2 to 9.5.3 (#377) @dependabot
- pip(deps): bump fastapi[all] from 0.105.0 to 0.108.0 (#375) @dependabot
- pip(deps): bump sqlalchemy from 2.0.23 to 2.0.24 (#374) @dependabot
- pip(deps): bump pytest from 7.4.3 to 7.4.4 (#373) @dependabot
- pip(deps): bump black from 23.12.0 to 23.12.1 (#376) @dependabot
- github actionts(deps): bump actions/setup-python from 4 to 5 (#378) @dependabot
Published Date: 2024 January 20, 00:07
Breaking Change (v0.11.1)¶
What's Changed¶
- Bump of Version to 0.11.1 (#371) @devsetgo
- Query Improvement (#370) @devsetgo
- 368 get one record should return an empty value when called (#369) @devsetgo
- updating docs from v0.11.0 release (#367) @devsetgo
Published Date: 2023 December 23, 10:49
Full Release of New Features (v0.11.0)¶
What's Changed¶
- Prep for Release (#366) @devsetgo
- Fixing sonar settings (#365) @devsetgo
- Fixes and improvements (#364) @devsetgo
- Dev (#362) @devsetgo
- Fix of issues from Beta release (#361) @devsetgo
- 359 tables are created before create tables is called (#360) @devsetgo
- Change Log (#358) @devsetgo
- fixing latest-changes (#357) @devsetgo
- removing jinja template from Latest Changes Action (#356) @devsetgo
- Action fixing adding main (#355) @devsetgo
- Fixing actions (#354) @devsetgo
- Fixing Beta Publishing issues and Documentation Improvements (#353) @devsetgo
- Update setup.py for sub packages (#352) @devsetgo
- Import Bug Fix (#351) @devsetgo
- Latest Changes Action Fix (#350) @devsetgo
- Next Release (#349) @devsetgo
- Dev (#348) @devsetgo
- pip(deps): bump autopep8 from 2.0.2 to 2.0.4 (#343) @dependabot
- pip(deps): bump wheel from 0.41.2 to 0.42.0 (#345) @dependabot
- pip(deps): bump mkdocstrings[python] from 0.21.2 to 0.24.0 (#346) @dependabot
- pip(deps): bump mkdocs from 1.4.3 to 1.5.3 (#347) @dependabot
- pip(deps): bump flake8 from 6.0.0 to 6.1.0 (#332) @dependabot
- pip(deps): bump click from 8.1.3 to 8.1.7 (#337) @dependabot
- pip(deps): bump wheel from 0.40.0 to 0.41.2 (#339) @dependabot
- github actionts(deps): bump actions/checkout from 2 to 4 (#340) @dependabot
- pip(deps): bump mkdocs-material from 9.1.17 to 9.4.2 (#341) @dependabot
- pip(deps): bump black from 23.3.0 to 23.9.1 (#342) @dependabot
- pip(deps): bump mkdocs-material from 9.1.15 to 9.1.17 (#326) @dependabot
- pip(deps): bump pytest from 7.3.1 to 7.4.0 (#327) @dependabot
- pip(deps): bump mkdocs from 1.4.2 to 1.4.3 (#328) @dependabot
- pip(deps): bump autoflake from 2.1.1 to 2.2.0 (#329) @dependabot
- pip(deps): bump pre-commit from 3.2.2 to 3.3.3 (#330) @dependabot
- pip(deps): bump mkdocs-material from 9.1.9 to 9.1.15 (#325) @dependabot
- pip(deps): bump autoflake from 2.0.2 to 2.1.1 (#324) @dependabot
- pip(deps): bump pytest-xdist from 3.2.1 to 3.3.1 (#323) @dependabot
- pip(deps): bump tox from 4.4.11 to 4.5.2 (#322) @dependabot
- pip(deps): bump pytest-cov from 4.0.0 to 4.1.0 (#321) @dependabot
- pip(deps): bump loguru from 0.6.0 to 0.7.0 (#317) @dependabot
- pip(deps): bump mkdocs-gen-files from 0.4.0 to 0.5.0 (#314) @dependabot
- pip(deps): bump pylint from 2.17.2 to 2.17.4 (#319) @dependabot
- pip(deps): bump mkdocs-material from 9.1.6 to 9.1.9 (#320) @dependabot
- pip(deps): bump pytest from 7.3.0 to 7.3.1 (#318) @dependabot
Published Date: 2023 December 17, 22:00
Beta Release with fixes for multiple issues (v0.11.0-beta3-fix1)¶
What's Changed¶
- Dev (#362) @devsetgo
- Fix of issues from Beta release (#361) @devsetgo
- 359 tables are created before create tables is called (#360) @devsetgo
- Change Log (#358) @devsetgo
- fixing latest-changes (#357) @devsetgo
- removing jinja template from Latest Changes Action (#356) @devsetgo
- Action fixing adding main (#355) @devsetgo
- Fixing actions (#354) @devsetgo
- Fixing Beta Publishing issues and Documentation Improvements (#353) @devsetgo
- Update setup.py for sub packages (#352) @devsetgo
- Import Bug Fix (#351) @devsetgo
- Latest Changes Action Fix (#350) @devsetgo
- Next Release (#349) @devsetgo
- Dev (#348) @devsetgo
- pip(deps): bump autopep8 from 2.0.2 to 2.0.4 (#343) @dependabot
- pip(deps): bump wheel from 0.41.2 to 0.42.0 (#345) @dependabot
- pip(deps): bump mkdocstrings[python] from 0.21.2 to 0.24.0 (#346) @dependabot
- pip(deps): bump mkdocs from 1.4.3 to 1.5.3 (#347) @dependabot
- pip(deps): bump flake8 from 6.0.0 to 6.1.0 (#332) @dependabot
- pip(deps): bump click from 8.1.3 to 8.1.7 (#337) @dependabot
- pip(deps): bump wheel from 0.40.0 to 0.41.2 (#339) @dependabot
- github actionts(deps): bump actions/checkout from 2 to 4 (#340) @dependabot
- pip(deps): bump mkdocs-material from 9.1.17 to 9.4.2 (#341) @dependabot
- pip(deps): bump black from 23.3.0 to 23.9.1 (#342) @dependabot
- pip(deps): bump mkdocs-material from 9.1.15 to 9.1.17 (#326) @dependabot
- pip(deps): bump pytest from 7.3.1 to 7.4.0 (#327) @dependabot
- pip(deps): bump mkdocs from 1.4.2 to 1.4.3 (#328) @dependabot
- pip(deps): bump autoflake from 2.1.1 to 2.2.0 (#329) @dependabot
- pip(deps): bump pre-commit from 3.2.2 to 3.3.3 (#330) @dependabot
- pip(deps): bump mkdocs-material from 9.1.9 to 9.1.15 (#325) @dependabot
- pip(deps): bump autoflake from 2.0.2 to 2.1.1 (#324) @dependabot
- pip(deps): bump pytest-xdist from 3.2.1 to 3.3.1 (#323) @dependabot
- pip(deps): bump tox from 4.4.11 to 4.5.2 (#322) @dependabot
- pip(deps): bump pytest-cov from 4.0.0 to 4.1.0 (#321) @dependabot
- pip(deps): bump loguru from 0.6.0 to 0.7.0 (#317) @dependabot
- pip(deps): bump mkdocs-gen-files from 0.4.0 to 0.5.0 (#314) @dependabot
- pip(deps): bump pylint from 2.17.2 to 2.17.4 (#319) @dependabot
- pip(deps): bump mkdocs-material from 9.1.6 to 9.1.9 (#320) @dependabot
- pip(deps): bump pytest from 7.3.0 to 7.3.1 (#318) @dependabot
Published Date: 2023 December 17, 16:23
Fixing AsyncDatabase create tables (v0.11.0-beta3)¶
What's Changed¶
- Fix of issues from Beta release (#361) @devsetgo
- 359 tables are created before create tables is called (#360) @devsetgo
- Change Log (#358) @devsetgo
- fixing latest-changes (#357) @devsetgo
- removing jinja template from Latest Changes Action (#356) @devsetgo
- Action fixing adding main (#355) @devsetgo
- Fixing actions (#354) @devsetgo
- Fixing Beta Publishing issues and Documentation Improvements (#353) @devsetgo
- Update setup.py for sub packages (#352) @devsetgo
- Import Bug Fix (#351) @devsetgo
- Latest Changes Action Fix (#350) @devsetgo
- Next Release (#349) @devsetgo
- Dev (#348) @devsetgo
- pip(deps): bump autopep8 from 2.0.2 to 2.0.4 (#343) @dependabot
- pip(deps): bump wheel from 0.41.2 to 0.42.0 (#345) @dependabot
- pip(deps): bump mkdocstrings[python] from 0.21.2 to 0.24.0 (#346) @dependabot
- pip(deps): bump mkdocs from 1.4.3 to 1.5.3 (#347) @dependabot
- pip(deps): bump flake8 from 6.0.0 to 6.1.0 (#332) @dependabot
- pip(deps): bump click from 8.1.3 to 8.1.7 (#337) @dependabot
- pip(deps): bump wheel from 0.40.0 to 0.41.2 (#339) @dependabot
- github actionts(deps): bump actions/checkout from 2 to 4 (#340) @dependabot
- pip(deps): bump mkdocs-material from 9.1.17 to 9.4.2 (#341) @dependabot
- pip(deps): bump black from 23.3.0 to 23.9.1 (#342) @dependabot
- pip(deps): bump mkdocs-material from 9.1.15 to 9.1.17 (#326) @dependabot
- pip(deps): bump pytest from 7.3.1 to 7.4.0 (#327) @dependabot
- pip(deps): bump mkdocs from 1.4.2 to 1.4.3 (#328) @dependabot
- pip(deps): bump autoflake from 2.1.1 to 2.2.0 (#329) @dependabot
- pip(deps): bump pre-commit from 3.2.2 to 3.3.3 (#330) @dependabot
- pip(deps): bump mkdocs-material from 9.1.9 to 9.1.15 (#325) @dependabot
- pip(deps): bump autoflake from 2.0.2 to 2.1.1 (#324) @dependabot
- pip(deps): bump pytest-xdist from 3.2.1 to 3.3.1 (#323) @dependabot
- pip(deps): bump tox from 4.4.11 to 4.5.2 (#322) @dependabot
- pip(deps): bump pytest-cov from 4.0.0 to 4.1.0 (#321) @dependabot
- pip(deps): bump loguru from 0.6.0 to 0.7.0 (#317) @dependabot
- pip(deps): bump mkdocs-gen-files from 0.4.0 to 0.5.0 (#314) @dependabot
- pip(deps): bump pylint from 2.17.2 to 2.17.4 (#319) @dependabot
- pip(deps): bump mkdocs-material from 9.1.6 to 9.1.9 (#320) @dependabot
- pip(deps): bump pytest from 7.3.0 to 7.3.1 (#318) @dependabot
Published Date: 2023 December 17, 16:18
Build Updates (v0.11.0-beta2)¶
What's Changed¶
- Change Log (#358) @devsetgo
- fixing latest-changes (#357) @devsetgo
- removing jinja template from Latest Changes Action (#356) @devsetgo
- Action fixing adding main (#355) @devsetgo
- Fixing actions (#354) @devsetgo
- Fixing Beta Publishing issues and Documentation Improvements (#353) @devsetgo
- Update setup.py for sub packages (#352) @devsetgo
- Import Bug Fix (#351) @devsetgo
- Latest Changes Action Fix (#350) @devsetgo
- Next Release (#349) @devsetgo
- Dev (#348) @devsetgo
- pip(deps): bump autopep8 from 2.0.2 to 2.0.4 (#343) @dependabot
- pip(deps): bump wheel from 0.41.2 to 0.42.0 (#345) @dependabot
- pip(deps): bump mkdocstrings[python] from 0.21.2 to 0.24.0 (#346) @dependabot
- pip(deps): bump mkdocs from 1.4.3 to 1.5.3 (#347) @dependabot
- pip(deps): bump flake8 from 6.0.0 to 6.1.0 (#332) @dependabot
- pip(deps): bump click from 8.1.3 to 8.1.7 (#337) @dependabot
- pip(deps): bump wheel from 0.40.0 to 0.41.2 (#339) @dependabot
- github actionts(deps): bump actions/checkout from 2 to 4 (#340) @dependabot
- pip(deps): bump mkdocs-material from 9.1.17 to 9.4.2 (#341) @dependabot
- pip(deps): bump black from 23.3.0 to 23.9.1 (#342) @dependabot
- pip(deps): bump mkdocs-material from 9.1.15 to 9.1.17 (#326) @dependabot
- pip(deps): bump pytest from 7.3.1 to 7.4.0 (#327) @dependabot
- pip(deps): bump mkdocs from 1.4.2 to 1.4.3 (#328) @dependabot
- pip(deps): bump autoflake from 2.1.1 to 2.2.0 (#329) @dependabot
- pip(deps): bump pre-commit from 3.2.2 to 3.3.3 (#330) @dependabot
- pip(deps): bump mkdocs-material from 9.1.9 to 9.1.15 (#325) @dependabot
- pip(deps): bump autoflake from 2.0.2 to 2.1.1 (#324) @dependabot
- pip(deps): bump pytest-xdist from 3.2.1 to 3.3.1 (#323) @dependabot
- pip(deps): bump tox from 4.4.11 to 4.5.2 (#322) @dependabot
- pip(deps): bump pytest-cov from 4.0.0 to 4.1.0 (#321) @dependabot
- pip(deps): bump loguru from 0.6.0 to 0.7.0 (#317) @dependabot
- pip(deps): bump mkdocs-gen-files from 0.4.0 to 0.5.0 (#314) @dependabot
- pip(deps): bump pylint from 2.17.2 to 2.17.4 (#319) @dependabot
- pip(deps): bump mkdocs-material from 9.1.6 to 9.1.9 (#320) @dependabot
- pip(deps): bump pytest from 7.3.0 to 7.3.1 (#318) @dependabot
Published Date: 2023 December 16, 20:34
Beta Release with fixes for multiple issues (v0.11.0-beta1-fix5)¶
What's Changed¶
- fixing latest-changes (#357) @devsetgo
- removing jinja template from Latest Changes Action (#356) @devsetgo
- Action fixing adding main (#355) @devsetgo
- Fixing actions (#354) @devsetgo
- Fixing Beta Publishing issues and Documentation Improvements (#353) @devsetgo
- Update setup.py for sub packages (#352) @devsetgo
- Import Bug Fix (#351) @devsetgo
- Latest Changes Action Fix (#350) @devsetgo
- Next Release (#349) @devsetgo
- Dev (#348) @devsetgo
- pip(deps): bump autopep8 from 2.0.2 to 2.0.4 (#343) @dependabot
- pip(deps): bump wheel from 0.41.2 to 0.42.0 (#345) @dependabot
- pip(deps): bump mkdocstrings[python] from 0.21.2 to 0.24.0 (#346) @dependabot
- pip(deps): bump mkdocs from 1.4.3 to 1.5.3 (#347) @dependabot
- pip(deps): bump flake8 from 6.0.0 to 6.1.0 (#332) @dependabot
- pip(deps): bump click from 8.1.3 to 8.1.7 (#337) @dependabot
- pip(deps): bump wheel from 0.40.0 to 0.41.2 (#339) @dependabot
- github actionts(deps): bump actions/checkout from 2 to 4 (#340) @dependabot
- pip(deps): bump mkdocs-material from 9.1.17 to 9.4.2 (#341) @dependabot
- pip(deps): bump black from 23.3.0 to 23.9.1 (#342) @dependabot
- pip(deps): bump mkdocs-material from 9.1.15 to 9.1.17 (#326) @dependabot
- pip(deps): bump pytest from 7.3.1 to 7.4.0 (#327) @dependabot
- pip(deps): bump mkdocs from 1.4.2 to 1.4.3 (#328) @dependabot
- pip(deps): bump autoflake from 2.1.1 to 2.2.0 (#329) @dependabot
- pip(deps): bump pre-commit from 3.2.2 to 3.3.3 (#330) @dependabot
- pip(deps): bump mkdocs-material from 9.1.9 to 9.1.15 (#325) @dependabot
- pip(deps): bump autoflake from 2.0.2 to 2.1.1 (#324) @dependabot
- pip(deps): bump pytest-xdist from 3.2.1 to 3.3.1 (#323) @dependabot
- pip(deps): bump tox from 4.4.11 to 4.5.2 (#322) @dependabot
- pip(deps): bump pytest-cov from 4.0.0 to 4.1.0 (#321) @dependabot
- pip(deps): bump loguru from 0.6.0 to 0.7.0 (#317) @dependabot
- pip(deps): bump mkdocs-gen-files from 0.4.0 to 0.5.0 (#314) @dependabot
- pip(deps): bump pylint from 2.17.2 to 2.17.4 (#319) @dependabot
- pip(deps): bump mkdocs-material from 9.1.6 to 9.1.9 (#320) @dependabot
- pip(deps): bump pytest from 7.3.0 to 7.3.1 (#318) @dependabot
Published Date: 2023 December 16, 16:33
Build Fixes (v0.11.0-beta1-fix4)¶
What's Changed¶
- Update setup.py for sub packages (#352) @devsetgo
- Import Bug Fix (#351) @devsetgo
- Latest Changes Action Fix (#350) @devsetgo
- Next Release (#349) @devsetgo
- Dev (#348) @devsetgo
- pip(deps): bump autopep8 from 2.0.2 to 2.0.4 (#343) @dependabot
- pip(deps): bump wheel from 0.41.2 to 0.42.0 (#345) @dependabot
- pip(deps): bump mkdocstrings[python] from 0.21.2 to 0.24.0 (#346) @dependabot
- pip(deps): bump mkdocs from 1.4.3 to 1.5.3 (#347) @dependabot
- pip(deps): bump flake8 from 6.0.0 to 6.1.0 (#332) @dependabot
- pip(deps): bump click from 8.1.3 to 8.1.7 (#337) @dependabot
- pip(deps): bump wheel from 0.40.0 to 0.41.2 (#339) @dependabot
- github actionts(deps): bump actions/checkout from 2 to 4 (#340) @dependabot
- pip(deps): bump mkdocs-material from 9.1.17 to 9.4.2 (#341) @dependabot
- pip(deps): bump black from 23.3.0 to 23.9.1 (#342) @dependabot
- pip(deps): bump mkdocs-material from 9.1.15 to 9.1.17 (#326) @dependabot
- pip(deps): bump pytest from 7.3.1 to 7.4.0 (#327) @dependabot
- pip(deps): bump mkdocs from 1.4.2 to 1.4.3 (#328) @dependabot
- pip(deps): bump autoflake from 2.1.1 to 2.2.0 (#329) @dependabot
- pip(deps): bump pre-commit from 3.2.2 to 3.3.3 (#330) @dependabot
- pip(deps): bump mkdocs-material from 9.1.9 to 9.1.15 (#325) @dependabot
- pip(deps): bump autoflake from 2.0.2 to 2.1.1 (#324) @dependabot
- pip(deps): bump pytest-xdist from 3.2.1 to 3.3.1 (#323) @dependabot
- pip(deps): bump tox from 4.4.11 to 4.5.2 (#322) @dependabot
- pip(deps): bump pytest-cov from 4.0.0 to 4.1.0 (#321) @dependabot
- pip(deps): bump loguru from 0.6.0 to 0.7.0 (#317) @dependabot
- pip(deps): bump mkdocs-gen-files from 0.4.0 to 0.5.0 (#314) @dependabot
- pip(deps): bump pylint from 2.17.2 to 2.17.4 (#319) @dependabot
- pip(deps): bump mkdocs-material from 9.1.6 to 9.1.9 (#320) @dependabot
- pip(deps): bump pytest from 7.3.0 to 7.3.1 (#318) @dependabot
Published Date: 2023 December 12, 11:45
Async Database and FastAPI functions (v0.11.0-beta0)¶
What's Changed¶
- Dev (#348) @devsetgo - New functionality and documentation for FastAPI Endpoints and Async Database Functionality
- pip(deps): bump autopep8 from 2.0.2 to 2.0.4 (#343) @dependabot
- pip(deps): bump wheel from 0.41.2 to 0.42.0 (#345) @dependabot
- pip(deps): bump mkdocstrings[python] from 0.21.2 to 0.24.0 (#346) @dependabot
- pip(deps): bump mkdocs from 1.4.3 to 1.5.3 (#347) @dependabot
- pip(deps): bump flake8 from 6.0.0 to 6.1.0 (#332) @dependabot
- pip(deps): bump click from 8.1.3 to 8.1.7 (#337) @dependabot
- pip(deps): bump wheel from 0.40.0 to 0.41.2 (#339) @dependabot
- github actionts(deps): bump actions/checkout from 2 to 4 (#340) @dependabot
- pip(deps): bump mkdocs-material from 9.1.17 to 9.4.2 (#341) @dependabot
- pip(deps): bump black from 23.3.0 to 23.9.1 (#342) @dependabot
- pip(deps): bump mkdocs-material from 9.1.15 to 9.1.17 (#326) @dependabot
- pip(deps): bump pytest from 7.3.1 to 7.4.0 (#327) @dependabot
- pip(deps): bump mkdocs from 1.4.2 to 1.4.3 (#328) @dependabot
- pip(deps): bump autoflake from 2.1.1 to 2.2.0 (#329) @dependabot
- pip(deps): bump pre-commit from 3.2.2 to 3.3.3 (#330) @dependabot
- pip(deps): bump mkdocs-material from 9.1.9 to 9.1.15 (#325) @dependabot
- pip(deps): bump autoflake from 2.0.2 to 2.1.1 (#324) @dependabot
- pip(deps): bump pytest-xdist from 3.2.1 to 3.3.1 (#323) @dependabot
- pip(deps): bump tox from 4.4.11 to 4.5.2 (#322) @dependabot
- pip(deps): bump pytest-cov from 4.0.0 to 4.1.0 (#321) @dependabot
- pip(deps): bump loguru from 0.6.0 to 0.7.0 (#317) @dependabot
- pip(deps): bump mkdocs-gen-files from 0.4.0 to 0.5.0 (#314) @dependabot
- pip(deps): bump pylint from 2.17.2 to 2.17.4 (#319) @dependabot
- pip(deps): bump mkdocs-material from 9.1.6 to 9.1.9 (#320) @dependabot
- pip(deps): bump pytest from 7.3.0 to 7.3.1 (#318) @dependabot
Published Date: 2023 December 10, 20:17
Pattern Analysis Update and Bug Fix (v0.10.1)¶
What's Changed¶
- Improvement to the patterns analysis (#313) @devsetgo
- pip(deps): bump mkdocs-material from 9.1.3 to 9.1.5 (#308) @dependabot
- pip(deps): bump pre-commit from 3.2.0 to 3.2.1 (#310) @dependabot
- pip(deps): bump watchdog from 2.3.1 to 3.0.0 (#309) @dependabot
- pip(deps): bump pylint from 2.17.0 to 2.17.1 (#311) @dependabot
- pip(deps): bump tox from 4.4.7 to 4.4.8 (#312) @dependabot
Published Date: 2023 April 08, 21:45
ChatGPT Driven Improvements (v0.10.0)¶
ChatGPT¶
Using ChatGPT to improve tests, find bugs, and improve performance. Code coverage is at 100% and the code base appears to be performing better than before.
Major changes are in PR #304
What's Changed¶
- latest change fix for regex pattern. (#307) @devsetgo
- Dev (#306) @devsetgo
- Workflow changes (#305) @devsetgo
- ChatGPT Driven Improvements (#304) @devsetgo
- pip(deps): bump pre-commit from 3.0.2 to 3.1.1 (#300) @dependabot
- pip(deps): bump pytest-xdist from 3.1.0 to 3.2.0 (#302) @dependabot
- pip(deps): bump autoflake from 2.0.0 to 2.0.1 (#299) @dependabot
- pip(deps): bump watchdog from 2.1.9 to 2.3.1 (#301) @dependabot
- pip(deps): bump pytest from 7.2.0 to 7.2.1 (#303) @dependabot
- pip(deps): bump pylint from 2.15.7 to 2.16.1 (#298) @dependabot
- pip(deps): bump autopep8 from 2.0.0 to 2.0.1 (#289) @dependabot
- pip(deps): bump pylint from 2.15.7 to 2.15.10 (#295) @dependabot
- pip(deps): bump black from 22.10.0 to 23.1.0 (#294) @dependabot
- pip(deps): bump tox from 3.27.1 to 4.4.4 (#296) @dependabot
- pip(deps): bump pre-commit from 2.20.0 to 3.0.2 (#297) @dependabot
Published Date: 2023 April 01, 00:27