Feature Request: Add a built-in log retention policy for archived logs #3965

smanenti started this conversation in Ideas

Hello,

First, thank you for the great work on SeleniumBase. It's a very powerful and useful tool for web automation and testing.

I regularly use the --archive-logs feature to save logs from failed test runs for later analysis. This is an essential feature for debugging.
However, over time, the archived_logs/ directory can grow very large and consume significant disk space.
Currently, managing this directory requires a separate custom script, run outside of SeleniumBase, to periodically clean up old logs.

It would be very convenient to have a built-in mechanism to automatically manage the retention of these archived logs. I propose adding a new option to control this behavior, for instance a new command-line argument: --log-retention-days

When this new setting is configured (e.g., pytest --archive-logs --log-retention-days 30), SeleniumBase would automatically delete any log folders inside the archived_logs/ directory that are older than the specified number of days.

I can achieve this with a pytest hook (rough sketch below), or by writing my own SeleniumBase teardown, but I'm wondering if this could be part of SB directly.
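
For illustration, here is the kind of conftest.py hook I mean (a rough, untested sketch; the archived_logs location and the 30-day cutoff are just placeholder assumptions):

import os
import shutil
import time

def pytest_sessionfinish(session, exitstatus):
    # Sketch: after the test session ends, prune archived log folders
    # older than 30 days. Path and cutoff are assumptions to adjust.
    archive_dir = os.path.join(os.getcwd(), "archived_logs")
    if not os.path.isdir(archive_dir):
        return
    cutoff = time.time() - 30 * 24 * 60 * 60  # 30 days, in seconds
    for name in os.listdir(archive_dir):
        path = os.path.join(archive_dir, name)
        if os.path.isdir(path) and os.path.getmtime(path) < cutoff:
            shutil.rmtree(path, ignore_errors=True)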

Thanks!


Replies: 1 comment 1 reply


Hello, thank you.

I think it might be better handled as a separate pytest hook, or as a daily cron job that runs the following Python script:

import os
import shutil
from datetime import datetime, timedelta

def delete_old_folders(parent_folder, days_old=30):
    """
    Deletes folders within a specified parent folder that are older
    than a given number of days.

    Args:
        parent_folder (str): The path to the parent folder to scan.
        days_old (int): The number of days after which a folder is considered old.
    """
    if not os.path.isdir(parent_folder):
        print(f"Error: Parent folder '{parent_folder}' does not exist.")
        return
    cutoff_date = datetime.now() - timedelta(days=days_old)
    for item in os.listdir(parent_folder):
        item_path = os.path.join(parent_folder, item)
        if os.path.isdir(item_path):
            try:
                # Get the last modification time of the folder
                mod_timestamp = os.path.getmtime(item_path)
                mod_datetime = datetime.fromtimestamp(mod_timestamp)
                if mod_datetime < cutoff_date:
                    print(f"Deleting old folder: {item_path}")
                    shutil.rmtree(item_path)  # Recursively deletes the folder and its contents
                else:
                    print(f"Keeping folder (not old enough): {item_path}")
            except OSError as e:
                print(f"Error processing folder {item_path}: {e}")

# Example usage:
if __name__ == "__main__":
    target_directory = "/path/to/your/parent_folder"  # Replace with the actual path
    delete_old_folders(target_directory, days_old=30)

(Just replace the path with the path to the archived_logs folder.)

For safety, I would also include a check to make sure that the folder name starts with logs_ before deleting it, just in case somehow the wrong folder got selected during the cleanup process.
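
For example, that check could sit inside the loop, just before the shutil.rmtree() call (a minimal sketch; the logs_ prefix is taken from the naming convention mentioned above):

    # Inside the loop, before calling shutil.rmtree(item_path):
    if not item.startswith("logs_"):
        print(f"Skipping non-log folder: {item_path}")
        continue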

1 reply

Thanks for your answer; we will implement it on our side then.
