Recently in my practice, I faced a significant challenge: extracting a full day of application logs from Loki in a Kubernetes environment. When idle, the application writes about 60 lines per minute, but when someone interacts with it, it can write 2,000–5,000 lines per minute; in total, I needed to retrieve more than 300,000 lines of logs. The setup did not include a configured log export, and the primary log viewing tool was Grafana, which imposes a 5,000-line limit on log retrieval. Raising this limit was not feasible: it would significantly strain our resources, and it was unnecessary for a one-time task. Accessing logs directly from the Kubernetes pod was also not an option due to storage limitations within the pod itself. So I needed to download the logs directly from Loki without changing any configuration.

## Preparation

### Used tools

- LogCLI
- kubectl

### Additional steps

To confirm that the query you will use to search for logs is correct, follow these steps:

1. **Navigate to Grafana Explore:** go to Grafana > Explore.
2. **Set the required label:** apply the label that filters logs by the service.
3. **Filter by date:** use a line-filter operation to display only lines containing the desired date.

Example query:

```
{instance="our-service"} |= `2024-07-12`
```

## Execution

1. **Install LogCLI:** download the LogCLI binary from the Loki releases page.
2. **Set the Loki address:** configure the Loki address for LogCLI using an environment variable:

   ```shell
   export LOKI_ADDR=http://localhost:8000
   ```

3. **Port forwarding:** forward a local port to the Loki service to allow local access:

   ```shell
   kubectl --namespace loki port-forward svc/loki-stack 8000:3100
   ```

4. **Extract logs:** use LogCLI to query and save the logs to a file:

   ```shell
   logcli query '{instance="our-service"} |= `2024-07-12`' --limit=5000000 --since=72h -o raw > our-service-2024-07-12.log
   ```

In this command:

- `--limit` is set to a deliberately high value to ensure all matching logs are captured.
- `--since` is set to 72 hours to cover a sufficiently wide time range.
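Before trusting the dump, it is worth a quick sanity check that the line count looks plausible and that every retrieved line actually belongs to the target date. A minimal sketch — it runs here against a small stand-in file named `sample.log`; in practice, point it at the real `our-service-2024-07-12.log`:

```shell
# Stand-in for the real dump; replace sample.log with our-service-2024-07-12.log.
printf '2024-07-12 10:00:00 ok\n2024-07-12 10:00:01 ok\n' > sample.log

# Total lines retrieved.
total=$(wc -l < sample.log)

# Lines that do NOT contain the target date (should be 0 for a raw |= dump).
off_date=$(grep -cv '2024-07-12' sample.log || true)

echo "total=$((total)) off_date=$off_date"
```

If `off_date` is nonzero, the query matched lines outside the intended day and the filter expression is worth revisiting in Grafana Explore first.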
## Conclusion

This entire process took approximately 10 minutes and resulted in a file with the complete application logs for the specified date. If needed, the process can be further optimized or automated.
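If this extraction becomes a recurring task, the steps above can be wrapped in a small script. A sketch, assuming the same `loki` namespace, `loki-stack` service, and ports as in the manual steps, and that `kubectl` and `logcli` are on `PATH`:

```shell
#!/usr/bin/env bash
# Sketch: automate port-forward + LogCLI extraction for one service and date.
extract_loki_logs() {
  local service="$1" date="$2"
  local out="${service}-${date}.log"

  # Forward the Loki port in the background; stop it when the function returns.
  kubectl --namespace loki port-forward svc/loki-stack 8000:3100 &
  local pf_pid=$!
  trap 'kill "$pf_pid"' RETURN
  sleep 2  # give the port-forward a moment to establish

  # Same query as the manual step, parameterized by service and date.
  LOKI_ADDR=http://localhost:8000 \
    logcli query "{instance=\"${service}\"} |= \`${date}\`" \
      --limit=5000000 --since=72h -o raw > "$out"

  echo "wrote $out"
}
```

Invoked as `extract_loki_logs our-service 2024-07-12`, this produces the same `our-service-2024-07-12.log` as the manual steps.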