Troubleshooting
Common Issues
Configuration Not Found
Error: No [sql_testing] section found in pytest.ini, setup.cfg, or tox.ini
Causes:
- IDE running tests from wrong directory
- Configuration file in wrong location
- Missing configuration file
Solutions:
- Set environment variable:
export SQL_TESTING_PROJECT_ROOT=/path/to/project
- Create a conftest.py in the project root:
import os

def pytest_configure(config):
    if not os.environ.get('SQL_TESTING_PROJECT_ROOT'):
        project_root = os.path.dirname(os.path.abspath(__file__))
        os.environ['SQL_TESTING_PROJECT_ROOT'] = project_root
- Check working directory in IDE settings
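If the configuration file is missing entirely, create one. A minimal pytest.ini sketch, assuming the BigQuery adapter (the adapter key and values here are illustrative; see the database-specific sections below for the options each adapter takes):

[sql_testing]
adapter = bigquery

[sql_testing.bigquery]
credentials_path = /path/to/service-account.json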
Mock Table Not Found
Error: MockTableNotFoundError: Mock table 'users' not found
Causes:
- Table name mismatch
- Database/schema name mismatch
- Case sensitivity issues
Solutions:
- Verify table names match:
# In the mock table
def get_table_name(self) -> str:
    return "users"  # Must match the table name in the SQL query

-- In the SQL query
SELECT * FROM users  -- Table name must match
- Check database context:
# Mock table database
def get_database_name(self) -> str:
    return "test_db"

# TestCase namespace
default_namespace="test_db"  # Must match
Query Size Limit Exceeded
Error: QuerySizeLimitExceeded: Query exceeds 256KB limit
Automatic handling: the library automatically switches to physical tables
Manual solutions:
- Force physical tables:
@sql_test(use_physical_tables=True)
def test_large_dataset():
    return TestCase(...)
- Reduce test data size:
# Instead of 10,000 rows, use a representative sample
mock_data = generate_sample_data(100)
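generate_sample_data above is not a library function; a hypothetical helper that builds a small, deterministic sample might look like:

import random

def generate_sample_data(n: int, seed: int = 42):
    # Hypothetical helper: build n deterministic mock rows
    rng = random.Random(seed)
    return [{"id": i, "amount": rng.randint(1, 100)} for i in range(n)]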
Type Conversion Errors
Error: TypeConversionError: Cannot convert 'invalid_date' to date
Causes:
- Data type mismatch
- Invalid date/time formats
- Null handling issues
Solutions:
- Match Python and SQL types:
from dataclasses import dataclass
from datetime import date, datetime
from decimal import Decimal

@dataclass
class Transaction:
    amount: Decimal          # Not float
    transaction_date: date   # Not string
    created_at: datetime     # With timezone
- Handle nulls properly:
from dataclasses import dataclass
from typing import Optional

@dataclass
class User:
    email: Optional[str]  # Can be None/NULL
Database Connection Issues
BigQuery
Error: google.auth.exceptions.DefaultCredentialsError
Solutions:
- Set credentials path:
[sql_testing.bigquery]
credentials_path = /path/to/service-account.json
- Use application default credentials:
gcloud auth application-default login
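- Or set the standard Google credentials environment variable, which Google client libraries read automatically:
export GOOGLE_APPLICATION_CREDENTIALS=/path/to/service-account.json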
Athena
Error: botocore.exceptions.NoCredentialsError
Solutions:
- Configure AWS credentials:
[sql_testing.athena]
aws_access_key_id = YOUR_KEY
aws_secret_access_key = YOUR_SECRET
- Use AWS CLI:
aws configure
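- Or export the standard AWS environment variables, which botocore reads automatically (values are placeholders):
export AWS_ACCESS_KEY_ID=YOUR_KEY
export AWS_SECRET_ACCESS_KEY=YOUR_SECRET
export AWS_DEFAULT_REGION=us-west-2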
Redshift
Error: psycopg2.OperationalError: FATAL: password authentication failed
Solutions:
- Verify credentials
- Check network/firewall access
- Ensure security group allows connections
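- Test the connection directly with psycopg2 to isolate the problem from the test suite; a minimal sketch with placeholder host and credentials:
import psycopg2

# Verify credentials and network access outside the test suite
conn = psycopg2.connect(
    host="your-cluster.abc123.us-west-2.redshift.amazonaws.com",
    port=5439,
    dbname="dev",
    user="awsuser",
    password="your_password",
)
conn.close()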
Snowflake
Error: snowflake.connector.errors.DatabaseError: Invalid account identifier
Solutions:
- Use the correct account format:
[sql_testing.snowflake]
account = xy12345.us-west-2  # Include the region
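- Verify the identifier outside the test suite with the Snowflake connector directly; a minimal sketch with placeholder credentials:
import snowflake.connector

# Fails fast with the same error if the account identifier is wrong
conn = snowflake.connector.connect(
    account="xy12345.us-west-2",
    user="your_user",
    password="your_password",
)
conn.close()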
Array Type Issues
Issue: Arrays not working as expected
Database-specific solutions:
- BigQuery:
# NULL arrays become empty arrays
tags: List[str] = field(default_factory=list)
- Redshift:
# Arrays via JSON
tags: List[str]  # Stored as a JSON string
- Athena/Trino:
-- Use UNNEST for array operations
SELECT * FROM UNNEST(array_column) AS t(value)
Performance Issues
Slow Test Execution
Causes:
- Large datasets in CTE mode
- Network latency
- Inefficient queries
Solutions:
- Use physical tables for large data:
@sql_test(use_physical_tables=True)
- Optimize test data:
# Generate only necessary data
def create_minimal_test_data():
    return [row for row in data if row.is_relevant]
- Run tests in parallel:
pytest -n auto # Requires pytest-xdist
Memory Issues
Error: MemoryError or slow performance
Solutions:
- Stream large results:
import pandas as pd

# Process results in chunks
for chunk in pd.read_sql(query, con, chunksize=1000):
    process_chunk(chunk)
- Limit result size:
SELECT * FROM large_table LIMIT 1000
Debugging Tips
Use SQL Logging
The SQL Testing Library provides comprehensive SQL logging to help debug test failures:
# Enable logging for all tests
SQL_TEST_LOG_ALL=true pytest tests/
# Or enable for specific tests
@sql_test(log_sql=True)
def test_with_logging():
...
SQL logs are saved to <project_root>/.sql_logs/ and include:
- Complete transformed queries with CTEs or temp table SQL
- Full error messages and stack traces
- Test metadata and execution details
See the SQL Logging documentation for more details.
Enable Verbose Output
# See generated SQL
pytest -v -s test_file.py
# With captured output
pytest --capture=no
Inspect Generated CTE
@sql_test(mock_tables=[...])
def test_debug():
test_case = TestCase(
query="SELECT * FROM users",
default_namespace="test_db"
)
# Print generated query
print(test_case.query)
return test_case
Check Adapter Configuration
from sql_testing_library._pytest_plugin import SQLTestDecorator
decorator = SQLTestDecorator()
config = decorator._get_adapter_config("bigquery")
print(config)
Platform-Specific Issues
GitHub Actions
Issue: Tests pass locally but fail in CI
Solutions:
- Use the same Python version as in local development
- Set timezone:
TZ=UTC
- Check secret/environment variables
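A minimal workflow job sketch covering these points (action versions and Python version are illustrative):

jobs:
  test:
    runs-on: ubuntu-latest
    env:
      TZ: UTC
    steps:
      - uses: actions/checkout@v4
      - uses: actions/setup-python@v5
        with:
          python-version: "3.11"  # Match your local version
      - run: pip install -e .
      - run: pytest tests/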
Docker
Issue: Configuration not found in container
Solutions:
- Copy the config file into the image:
COPY pytest.ini /app/pytest.ini
- Set environment:
ENV SQL_TESTING_PROJECT_ROOT=/app
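Putting both together, a minimal Dockerfile sketch (base image and install command are illustrative):

FROM python:3.11-slim
WORKDIR /app
COPY . /app
ENV SQL_TESTING_PROJECT_ROOT=/app
RUN pip install -e .
CMD ["pytest", "tests/"]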
Getting Help
Debug Information
When reporting issues, include:
import sys
import sql_testing_library
print(f"Python: {sys.version}")
print(f"Library: {sql_testing_library.__version__}")
print(f"Platform: {sys.platform}")
# Adapter versions
try:
import google.cloud.bigquery
print(f"BigQuery: {google.cloud.bigquery.__version__}")
except ImportError:
pass
Minimal Reproducible Example
# test_minimal.py
from sql_testing_library import sql_test, TestCase
from sql_testing_library.mock_table import BaseMockTable
class MinimalMockTable(BaseMockTable):
def get_database_name(self) -> str:
return "test_db"
def get_table_name(self) -> str:
return "test_table"
@sql_test(
mock_tables=[MinimalMockTable([{"id": 1}])],
result_class=dict
)
def test_minimal():
return TestCase(
query="SELECT * FROM test_table",
default_namespace="test_db"
)