2 incoming emails rejected by 2026.4.0-latest (5f0d95e1f1)

Hi, I'm seeing an intermittent failure in incoming email processing on a self-hosted site using mail-receiver.

What happened

Two distinct incoming emails:

  • were delivered to the correct Discourse address
  • show up in /admin/email/received
  • also show up in /admin/email/rejected
  • display the generic rejection text: “There was an unrecognized error while processing your email and it wasn't posted.”

Around the same time, the logs showed ActiveRecord::Deadlocked.

Why I think this may be a bug

I compared the rejected emails against a later, successfully processed email with a very similar format:

  • same sender pattern
  • same Microsoft / Power Automate path
  • same visible To: format
  • same SMTP envelope recipient (the Discourse address)

So this doesn't look like a simple mail-receiver configuration issue.

Evidence

For both rejected emails:

  • they were separate deliveries with different Message-IDs
  • they had different Postfix queue IDs
  • both were delivered to <ppyem3.accomodation@discourse.domain.com>

I also saw ActiveRecord::Deadlocked in the logs during the same period.

Hosting context

  • self-hosted
  • using mail-receiver
  • small IONOS VPS
  • 1 web worker configured

I realize the single web worker may just be background context rather than the cause, since this appears to be incoming/background email processing rather than web request handling.

If helpful, I can provide:

  • anonymized headers from the 2 rejected emails
  • anonymized headers from a comparable successful email
  • the deadlock log entry

Browsing the /logs endpoint, I can see the two errors:

env

hostname	ubuntu-app
process_id	2297
application_version	5f0d95e1f11a9cb5acbe8c7a61142400fc506f06
job	Jobs::ProcessEmail
db	default
time	7:48 pm
backtrace
activesupport-8.0.5/lib/active_support/broadcast_logger.rb:218:in 'block in ActiveSupport::BroadcastLogger#dispatch' 
activesupport-8.0.5/lib/active_support/broadcast_logger.rb:217:in 'Array#map' 
activesupport-8.0.5/lib/active_support/broadcast_logger.rb:217:in 'ActiveSupport::BroadcastLogger#dispatch' 
activesupport-8.0.5/lib/active_support/broadcast_logger.rb:129:in 'ActiveSupport::BroadcastLogger#error' 
/var/www/discourse/lib/email/processor.rb:126:in 'Email::Processor#handle_failure' 
/var/www/discourse/lib/email/processor.rb:31:in 'Email::Processor#process!' 
/var/www/discourse/lib/email/processor.rb:13:in 'Email::Processor.process!' 
/var/www/discourse/app/jobs/regular/process_email.rb:8:in 'Jobs::ProcessEmail#execute' 
/var/www/discourse/app/jobs/base.rb:318:in 'block (2 levels) in Jobs::Base#perform' 
rails_multisite-7.0.0/lib/rails_multisite/connection_management/null_instance.rb:49:in 'RailsMultisite::ConnectionManagement::NullInstance#with_connection'
rails_multisite-7.0.0/lib/rails_multisite/connection_management.rb:17:in 'RailsMultisite::ConnectionManagement.with_connection'
/var/www/discourse/app/jobs/base.rb:305:in 'block in Jobs::Base#perform' 
/var/www/discourse/app/jobs/base.rb:301:in 'Array#each' 
/var/www/discourse/app/jobs/base.rb:301:in 'Jobs::Base#perform' 
sidekiq-7.3.10/lib/sidekiq/processor.rb:220:in 'Sidekiq::Processor#execute_job' 
sidekiq-7.3.10/lib/sidekiq/processor.rb:185:in 'block (4 levels) in Sidekiq::Processor#process' 
sidekiq-7.3.10/lib/sidekiq/middleware/chain.rb:180:in 'Sidekiq::Middleware::Chain#traverse' 
sidekiq-7.3.10/lib/sidekiq/middleware/chain.rb:183:in 'block in Sidekiq::Middleware::Chain#traverse' 
/var/www/discourse/lib/sidekiq/suppress_user_email_errors.rb:6:in 'Sidekiq::SuppressUserEmailErrors#call' 
sidekiq-7.3.10/lib/sidekiq/middleware/chain.rb:182:in 'Sidekiq::Middleware::Chain#traverse' 
sidekiq-7.3.10/lib/sidekiq/middleware/chain.rb:183:in 'block in Sidekiq::Middleware::Chain#traverse' 
/var/www/discourse/lib/sidekiq/discourse_event.rb:6:in 'Sidekiq::DiscourseEvent#call' 
sidekiq-7.3.10/lib/sidekiq/middleware/chain.rb:182:in 'Sidekiq::Middleware::Chain#traverse' 
sidekiq-7.3.10/lib/sidekiq/middleware/chain.rb:183:in 'block in Sidekiq::Middleware::Chain#traverse' 
/var/www/discourse/lib/sidekiq/pausable.rb:131:in 'Sidekiq::Pausable#call' 
sidekiq-7.3.10/lib/sidekiq/middleware/chain.rb:182:in 'Sidekiq::Middleware::Chain#traverse' 
sidekiq-7.3.10/lib/sidekiq/middleware/chain.rb:183:in 'block in Sidekiq::Middleware::Chain#traverse' 
sidekiq-7.3.10/lib/sidekiq/job/interrupt_handler.rb:9:in 'Sidekiq::Job::InterruptHandler#call' 
sidekiq-7.3.10/lib/sidekiq/middleware/chain.rb:182:in 'Sidekiq::Middleware::Chain#traverse' 
sidekiq-7.3.10/lib/sidekiq/middleware/chain.rb:183:in 'block in Sidekiq::Middleware::Chain#traverse' 
sidekiq-7.3.10/lib/sidekiq/metrics/tracking.rb:26:in 'Sidekiq::Metrics::ExecutionTracker#track' 
sidekiq-7.3.10/lib/sidekiq/metrics/tracking.rb:134:in 'Sidekiq::Metrics::Middleware#call' 
sidekiq-7.3.10/lib/sidekiq/middleware/chain.rb:182:in 'Sidekiq::Middleware::Chain#traverse' 
sidekiq-7.3.10/lib/sidekiq/middleware/chain.rb:173:in 'Sidekiq::Middleware::Chain#invoke' 
sidekiq-7.3.10/lib/sidekiq/processor.rb:184:in 'block (3 levels) in Sidekiq::Processor#process' 
sidekiq-7.3.10/lib/sidekiq/processor.rb:145:in 'block (6 levels) in Sidekiq::Processor#dispatch' 
sidekiq-7.3.10/lib/sidekiq/job_retry.rb:118:in 'Sidekiq::JobRetry#local' 
sidekiq-7.3.10/lib/sidekiq/processor.rb:144:in 'block (5 levels) in Sidekiq::Processor#dispatch' 
sidekiq-7.3.10/lib/sidekiq/config.rb:39:in 'block in <class:Config>' 
sidekiq-7.3.10/lib/sidekiq/processor.rb:139:in 'block (4 levels) in Sidekiq::Processor#dispatch' 
sidekiq-7.3.10/lib/sidekiq/processor.rb:281:in 'Sidekiq::Processor#stats' 
sidekiq-7.3.10/lib/sidekiq/processor.rb:134:in 'block (3 levels) in Sidekiq::Processor#dispatch' 
sidekiq-7.3.10/lib/sidekiq/job_logger.rb:15:in 'Sidekiq::JobLogger#call' 
sidekiq-7.3.10/lib/sidekiq/processor.rb:133:in 'block (2 levels) in Sidekiq::Processor#dispatch' 
sidekiq-7.3.10/lib/sidekiq/job_retry.rb:85:in 'Sidekiq::JobRetry#global' 
sidekiq-7.3.10/lib/sidekiq/processor.rb:132:in 'block in Sidekiq::Processor#dispatch' 
sidekiq-7.3.10/lib/sidekiq/job_logger.rb:40:in 'Sidekiq::JobLogger#prepare' 
sidekiq-7.3.10/lib/sidekiq/processor.rb:131:in 'Sidekiq::Processor#dispatch' 
sidekiq-7.3.10/lib/sidekiq/processor.rb:183:in 'block (2 levels) in Sidekiq::Processor#process' 
sidekiq-7.3.10/lib/sidekiq/processor.rb:182:in 'Thread.handle_interrupt' 
sidekiq-7.3.10/lib/sidekiq/processor.rb:182:in 'block in Sidekiq::Processor#process' 
sidekiq-7.3.10/lib/sidekiq/processor.rb:181:in 'Thread.handle_interrupt' 
sidekiq-7.3.10/lib/sidekiq/processor.rb:181:in 'Sidekiq::Processor#process' 
sidekiq-7.3.10/lib/sidekiq/processor.rb:86:in 'Sidekiq::Processor#process_one' 
sidekiq-7.3.10/lib/sidekiq/processor.rb:76:in 'Sidekiq::Processor#run' 
sidekiq-7.3.10/lib/sidekiq/component.rb:10:in 'Sidekiq::Component#watchdog' 
sidekiq-7.3.10/lib/sidekiq/component.rb:19:in 'block in Sidekiq::Component#safe_thread'
info
Unrecognized error type (ActiveRecord::Deadlocked: PG::TRDeadlockDetected: ERROR:  deadlock detected
DETAIL:  Process 1170676 waits for ShareLock on transaction 3318694; blocked by process 1169876.
Process 1169876 waits for ExclusiveLock on tuple (4,27) of relation 74547 of database 16384; blocked by process 1170546.
Process 1170546 waits for ShareLock on transaction 3318690; blocked by process 1170676.
HINT:  See server log for query details.
CONTEXT:  while updating tuple (1,54) in relation "user_stats"
) when processing incoming email

Backtrace:
  /var/www/discourse/vendor/bundle/ruby/3.4.0/gems/rack-mini-profiler-4.0.1/lib/patches/db/pg/alias_method.rb:109:in 'PG::Connection#exec'
  /var/www/discourse/vendor/bundle/ruby/3.4.0/gems/rack-mini-profiler-4.0.1/lib/patches/db/pg/alias_method.rb:109:in 'PG::Connection#async_exec'
  /var/www/discourse/vendor/bundle/ruby/3.4.0/gems/activerecord-8.0.5/lib/active_record/connection_adapters/postgresql/database_statements.rb:167:in 'ActiveRecord::ConnectionAdapters::PostgreSQL::DatabaseStatements#perform_query'
  /var/www/discourse/vendor/bundle/ruby/3.4.0/gems/activerecord-8.0.5/lib/active_record/connection_adapters/abstract/database_statements.rb:556:in 'block (2 levels) in ActiveRecord::ConnectionAdapters::DatabaseStatements#raw_execute'
  /var/www/discourse/vendor/bundle/ruby/3.4.0/gems/activerecord-8.0.5/lib/active_record/connection_adapters/abstract_adapter.rb:1022:in 'block in ActiveRecord::ConnectionAdapters::AbstractAdapter#with_raw_connection'
  /var/www/discourse/vendor/bundle/ruby/3.4.0/gems/activesupport-8.0.5/lib/active_support/concurrency/null_lock.rb:9:in 'ActiveSupport::Concurrency::NullLock#synchronize'
  /var/www/discourse/vendor/bundle/ruby/3.4.0/gems/activerecord-8.0.5/lib/active_record/connection_adapters/abstract_adapter.rb:991:in 'ActiveRecord::ConnectionAdapters::AbstractAdapter#with_raw_connection'
  /var/www/discourse/vendor/bundle/ruby/3.4.0/gems/activerecord-8.0.5/lib/active_record/connection_ad...

env

hostname	ubuntu-app
process_id	2297
application_version	5f0d95e1f11a9cb5acbe8c7a61142400fc506f06
job	Jobs::ProcessEmail
db	default
time	7:48 pm
backtrace
activesupport-8.0.5/lib/active_support/broadcast_logger.rb:218:in 'block in ActiveSupport::BroadcastLogger#dispatch' 
activesupport-8.0.5/lib/active_support/broadcast_logger.rb:217:in 'Array#map' 
activesupport-8.0.5/lib/active_support/broadcast_logger.rb:217:in 'ActiveSupport::BroadcastLogger#dispatch' 
activesupport-8.0.5/lib/active_support/broadcast_logger.rb:129:in 'ActiveSupport::BroadcastLogger#error' 
/var/www/discourse/lib/email/processor.rb:126:in 'Email::Processor#handle_failure' 
/var/www/discourse/lib/email/processor.rb:31:in 'Email::Processor#process!' 
/var/www/discourse/lib/email/processor.rb:13:in 'Email::Processor.process!' 
/var/www/discourse/app/jobs/regular/process_email.rb:8:in 'Jobs::ProcessEmail#execute' 
/var/www/discourse/app/jobs/base.rb:318:in 'block (2 levels) in Jobs::Base#perform' 
rails_multisite-7.0.0/lib/rails_multisite/connection_management/null_instance.rb:49:in 'RailsMultisite::ConnectionManagement::NullInstance#with_connection'
rails_multisite-7.0.0/lib/rails_multisite/connection_management.rb:17:in 'RailsMultisite::ConnectionManagement.with_connection'
/var/www/discourse/app/jobs/base.rb:305:in 'block in Jobs::Base#perform' 
/var/www/discourse/app/jobs/base.rb:301:in 'Array#each' 
/var/www/discourse/app/jobs/base.rb:301:in 'Jobs::Base#perform' 
sidekiq-7.3.10/lib/sidekiq/processor.rb:220:in 'Sidekiq::Processor#execute_job' 
sidekiq-7.3.10/lib/sidekiq/processor.rb:185:in 'block (4 levels) in Sidekiq::Processor#process' 
sidekiq-7.3.10/lib/sidekiq/middleware/chain.rb:180:in 'Sidekiq::Middleware::Chain#traverse' 
sidekiq-7.3.10/lib/sidekiq/middleware/chain.rb:183:in 'block in Sidekiq::Middleware::Chain#traverse' 
/var/www/discourse/lib/sidekiq/suppress_user_email_errors.rb:6:in 'Sidekiq::SuppressUserEmailErrors#call' 
sidekiq-7.3.10/lib/sidekiq/middleware/chain.rb:182:in 'Sidekiq::Middleware::Chain#traverse' 
sidekiq-7.3.10/lib/sidekiq/middleware/chain.rb:183:in 'block in Sidekiq::Middleware::Chain#traverse' 
/var/www/discourse/lib/sidekiq/discourse_event.rb:6:in 'Sidekiq::DiscourseEvent#call' 
sidekiq-7.3.10/lib/sidekiq/middleware/chain.rb:182:in 'Sidekiq::Middleware::Chain#traverse' 
sidekiq-7.3.10/lib/sidekiq/middleware/chain.rb:183:in 'block in Sidekiq::Middleware::Chain#traverse' 
/var/www/discourse/lib/sidekiq/pausable.rb:131:in 'Sidekiq::Pausable#call' 
sidekiq-7.3.10/lib/sidekiq/middleware/chain.rb:182:in 'Sidekiq::Middleware::Chain#traverse' 
sidekiq-7.3.10/lib/sidekiq/middleware/chain.rb:183:in 'block in Sidekiq::Middleware::Chain#traverse' 
sidekiq-7.3.10/lib/sidekiq/job/interrupt_handler.rb:9:in 'Sidekiq::Job::InterruptHandler#call' 
sidekiq-7.3.10/lib/sidekiq/middleware/chain.rb:182:in 'Sidekiq::Middleware::Chain#traverse' 
sidekiq-7.3.10/lib/sidekiq/middleware/chain.rb:183:in 'block in Sidekiq::Middleware::Chain#traverse' 
sidekiq-7.3.10/lib/sidekiq/metrics/tracking.rb:26:in 'Sidekiq::Metrics::ExecutionTracker#track' 
sidekiq-7.3.10/lib/sidekiq/metrics/tracking.rb:134:in 'Sidekiq::Metrics::Middleware#call' 
sidekiq-7.3.10/lib/sidekiq/middleware/chain.rb:182:in 'Sidekiq::Middleware::Chain#traverse' 
sidekiq-7.3.10/lib/sidekiq/middleware/chain.rb:173:in 'Sidekiq::Middleware::Chain#invoke' 
sidekiq-7.3.10/lib/sidekiq/processor.rb:184:in 'block (3 levels) in Sidekiq::Processor#process' 
sidekiq-7.3.10/lib/sidekiq/processor.rb:145:in 'block (6 levels) in Sidekiq::Processor#dispatch' 
sidekiq-7.3.10/lib/sidekiq/job_retry.rb:118:in 'Sidekiq::JobRetry#local' 
sidekiq-7.3.10/lib/sidekiq/processor.rb:144:in 'block (5 levels) in Sidekiq::Processor#dispatch' 
sidekiq-7.3.10/lib/sidekiq/config.rb:39:in 'block in <class:Config>' 
sidekiq-7.3.10/lib/sidekiq/processor.rb:139:in 'block (4 levels) in Sidekiq::Processor#dispatch' 
sidekiq-7.3.10/lib/sidekiq/processor.rb:281:in 'Sidekiq::Processor#stats' 
sidekiq-7.3.10/lib/sidekiq/processor.rb:134:in 'block (3 levels) in Sidekiq::Processor#dispatch' 
sidekiq-7.3.10/lib/sidekiq/job_logger.rb:15:in 'Sidekiq::JobLogger#call' 
sidekiq-7.3.10/lib/sidekiq/processor.rb:133:in 'block (2 levels) in Sidekiq::Processor#dispatch' 
sidekiq-7.3.10/lib/sidekiq/job_retry.rb:85:in 'Sidekiq::JobRetry#global' 
sidekiq-7.3.10/lib/sidekiq/processor.rb:132:in 'block in Sidekiq::Processor#dispatch' 
sidekiq-7.3.10/lib/sidekiq/job_logger.rb:40:in 'Sidekiq::JobLogger#prepare' 
sidekiq-7.3.10/lib/sidekiq/processor.rb:131:in 'Sidekiq::Processor#dispatch' 
sidekiq-7.3.10/lib/sidekiq/processor.rb:183:in 'block (2 levels) in Sidekiq::Processor#process' 
sidekiq-7.3.10/lib/sidekiq/processor.rb:182:in 'Thread.handle_interrupt' 
sidekiq-7.3.10/lib/sidekiq/processor.rb:182:in 'block in Sidekiq::Processor#process' 
sidekiq-7.3.10/lib/sidekiq/processor.rb:181:in 'Thread.handle_interrupt' 
sidekiq-7.3.10/lib/sidekiq/processor.rb:181:in 'Sidekiq::Processor#process' 
sidekiq-7.3.10/lib/sidekiq/processor.rb:86:in 'Sidekiq::Processor#process_one' 
sidekiq-7.3.10/lib/sidekiq/processor.rb:76:in 'Sidekiq::Processor#run' 
sidekiq-7.3.10/lib/sidekiq/component.rb:10:in 'Sidekiq::Component#watchdog' 
sidekiq-7.3.10/lib/sidekiq/component.rb:19:in 'block in Sidekiq::Component#safe_thread' 
info
Unrecognized error type (ActiveRecord::Deadlocked: PG::TRDeadlockDetected: ERROR:  deadlock detected
DETAIL:  Process 1170678 waits for ShareLock on transaction 3318690; blocked by process 1170676.
Process 1170676 waits for ShareLock on transaction 3318694; blocked by process 1169876.
Process 1169876 waits for ExclusiveLock on tuple (4,27) of relation 74547 of database 16384; blocked by process 1170678.
HINT:  See server log for query details.
CONTEXT:  while locking tuple (4,27) in relation "categories"
) when processing incoming email

Backtrace:
  /var/www/discourse/vendor/bundle/ruby/3.4.0/gems/rack-mini-profiler-4.0.1/lib/patches/db/pg/alias_method.rb:109:in 'PG::Connection#exec'
  /var/www/discourse/vendor/bundle/ruby/3.4.0/gems/rack-mini-profiler-4.0.1/lib/patches/db/pg/alias_method.rb:109:in 'PG::Connection#async_exec'
  /var/www/discourse/vendor/bundle/ruby/3.4.0/gems/activerecord-8.0.5/lib/active_record/connection_adapters/postgresql/database_statements.rb:167:in 'ActiveRecord::ConnectionAdapters::PostgreSQL::DatabaseStatements#perform_query'
  /var/www/discourse/vendor/bundle/ruby/3.4.0/gems/activerecord-8.0.5/lib/active_record/connection_adapters/abstract/database_statements.rb:556:in 'block (2 levels) in ActiveRecord::ConnectionAdapters::DatabaseStatements#raw_execute'
  /var/www/discourse/vendor/bundle/ruby/3.4.0/gems/activerecord-8.0.5/lib/active_record/connection_adapters/abstract_adapter.rb:1022:in 'block in ActiveRecord::ConnectionAdapters::AbstractAdapter#with_raw_connection'
  /var/www/discourse/vendor/bundle/ruby/3.4.0/gems/activesupport-8.0.5/lib/active_support/concurrency/null_lock.rb:9:in 'ActiveSupport::Concurrency::NullLock#synchronize'
  /var/www/discourse/vendor/bundle/ruby/3.4.0/gems/activerecord-8.0.5/lib/active_record/connection_adapters/abstract_adapter.rb:991:in 'ActiveRecord::ConnectionAdapters::AbstractAdapter#with_raw_connection'
  /var/www/discourse/vendor/bundle/ruby/3.4.0/gems/activerecord-8.0.5/lib/active_record/connection_ada...

Hi, thanks for taking the time to look into this. I want to make sure I collect the right additional data before running anything that could put load on the server.

Based on the deadlock and on how Jobs::ProcessEmail runs, I believe the following could help narrow the investigation, but I'd like confirmation before proceeding.


1. Email header comparison (failed vs. successful)

I can extract and post anonymized raw headers from:

  • /admin/email/received → “Show original”

For:

  • 1 failed email (or both, if requested)
  • 1 successful email matching the same pattern

This would include (see the extraction sketch after this list):

  • Message-ID
  • From
  • To
  • Delivered-To
  • Return-Path
  • the main Received: lines
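
For reference, a minimal sketch of how I'd pull just those headers out of a saved raw message (rejected-1.eml is a hypothetical filename for a message saved from “Show original”; folded continuation lines of Received: would still need to be copied by hand):

# Print only the header lines of interest from a saved raw email
grep -iE '^(Message-ID|From|To|Delivered-To|Return-Path|Received):' rejected-1.eml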

:backhand_index_pointing_right: Would this comparison be useful here?


2. Sidekiq job outcome (retry / dead set)

Since this is happening inside Jobs::ProcessEmail, I can check whether the jobs:

  • were retried
  • eventually succeeded
  • or ended up in the dead set

Using:

./launcher enter app
rails c
# Any Jobs::ProcessEmail entries currently waiting to be retried
Sidekiq::RetrySet.new.select { |j| j.klass == "Jobs::ProcessEmail" }
# Any Jobs::ProcessEmail entries that exhausted their retries
Sidekiq::DeadSet.new.select { |j| j.klass == "Jobs::ProcessEmail" }
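
If any entries match, I can also print the stored failure details. A minimal sketch, assuming the standard Sidekiq retry payload fields (error_class, error_message, retry_count, failed_at) are present on the matched entries:

matching = Sidekiq::RetrySet.new.select { |j| j.klass == "Jobs::ProcessEmail" }
matching.each do |j|
  # Each retry entry carries the last error and when it failed
  puts [j["error_class"], j["error_message"], j["retry_count"], j["failed_at"]].inspect
end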

:backhand_index_pointing_right: Would confirming the retry/dead-set behaviour help diagnose this?


3. Concurrency configuration

I mentioned I'm running with 1 web worker, but I realize Sidekiq concurrency is probably more relevant to a deadlock.

I can share the relevant values from app.yml, for example:

UNICORN_WORKERS:
SIDEKIQ_CONCURRENCY:

At the moment I don't have SIDEKIQ_CONCURRENCY set explicitly in app.yml, so I believe it's running with the default concurrency (i.e., multiple Sidekiq threads). I'd pull the actual values roughly as sketched below.
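
A minimal sketch, assuming the standard /var/discourse install path and the container definition at containers/app.yml:

cd /var/discourse
# Show any worker / Sidekiq settings defined in the container env section
grep -nE "UNICORN_WORKERS|UNICORN_SIDEKIQS|SIDEKIQ" containers/app.yml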

:backhand_index_pointing_right: Would that be important context for interpreting the deadlock?


4. PostgreSQL version

Since this is a PG::TRDeadlockDetected, I can also confirm the exact PostgreSQL version in use, for example from the Rails console as sketched below.
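
A quick way to read the server version the app is actually connected to (standard Rails API, nothing Discourse-specific):

./launcher enter app
rails c
# Returns the full PostgreSQL version string reported by the server
ActiveRecord::Base.connection.select_value("SELECT version()")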

:backhand_index_pointing_right: Is that worth including?


5. Email arrival times

I noticed the two failed emails were processed very close together in time (within the same minute).

I can pull more precise timestamps from the logs if that's useful, for example as sketched below.
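
A rough sketch, assuming the standard mail-receiver container name and the standalone shared log path; the grep targets are illustrative:

cd /var/discourse
# Postfix delivery lines (queue IDs + timestamps) for the affected address
./launcher logs mail-receiver | grep -i "ppyem3.accomodation"
# When Discourse logged the processing failures
grep -i "Deadlocked" shared/standalone/log/rails/production.log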

:backhand_index_pointing_right: Would a tighter time correlation help confirm whether this is a concurrency-triggered issue?


If helpful, I can gather and post all of this in a follow-up reply; I just wanted to check what would be most useful first.

Thanks again :+1: