Problem Summary
Facebook Sharing Debugger consistently returns a 403 error for URLs on my website, even though our server logs confirm that Facebook's crawler received a 200 OK response with valid OG meta tags.
Environment
Domain: https://www.fullstackfamily.com
What I've Verified
Server Logs (GCP Load Balancer)
- Request URL: https://www.fullstackfamily.com/news/2848
- Status: 200
- Remote IP: 173.252.127.24 (Facebook crawler IP, AS32934)
- User-Agent: facebookexternalhit/1.1
- Timestamp: 2026-01-06T21:31:17Z
Response Content (verified via curl; a reproduction sketch follows this list)
robots.txt
Explicitly allows facebookexternalhit
Other OG validators (opengraph.xyz): Show correct OG data
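For anyone who wants to reproduce the verification, here is a minimal sketch in Python (using the requests library; the curl command we actually ran is equivalent) that fetches the page with the crawler's User-Agent and prints the status plus the OG tags:

```python
# Minimal sketch of the verification above: fetch the page while
# presenting facebookexternalhit's User-Agent, then confirm the status
# code and the OG meta tags. Requires the third-party requests library.
import requests

resp = requests.get(
    "https://www.fullstackfamily.com/news/2848",
    headers={"User-Agent": "facebookexternalhit/1.1"},
    timeout=10,
)
print(resp.status_code)  # our logs and curl both show 200 here
for line in resp.text.splitlines():
    if 'property="og:' in line:  # crude scan for OG tags, illustration only
        print(line.strip())
```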
Facebook Debugger Output
Response Code: 403
og:title: Shows domain name instead of actual title
The Mystery
GCP logs show 200 OK response to Facebook crawler
Tested brand-new URLs that had never been shared before - same 403 error
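To take the Debugger UI out of the equation, here is a hedged sketch of forcing a re-scrape through the Graph API's URL endpoint, the documented programmatic counterpart of the Sharing Debugger (the token value is a placeholder):

```python
# Force Facebook to re-scrape the URL and return what its crawler saw.
# POST /?id={url}&scrape=true is the documented Graph API re-scrape call;
# ACCESS_TOKEN is a placeholder for a valid app or page token.
import requests

ACCESS_TOKEN = "YOUR_APP_OR_PAGE_TOKEN"  # placeholder
resp = requests.post(
    "https://graph.facebook.com/v19.0/",
    params={
        "id": "https://www.fullstackfamily.com/news/2848",
        "scrape": "true",
        "access_token": ACCESS_TOKEN,
    },
)
# In our case this mirrors the Debugger: the response reports a 403
# even though our own logs show a 200 served to the crawler.
print(resp.status_code, resp.json())
```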
Questions
Why does Facebook's Debugger report a 403 when our server demonstrably returned a 200 OK to the crawler? Any help would be greatly appreciated. Thank you!

Same here, with S3 and my website's local store as fallback storage.
Instagram Graph API media creation is failing with code 9004 / subcode 2207052 for publicly accessible S3-hosted JPEG files.
What we tested:
- Same IG user ID and same access token
- Same /{ig-user-id}/media call
- External image URL works
- Our own S3 public image URL fails
- S3 object returns HTTP 200
- Content-Type is image/jpeg
- File is publicly downloadable
- This started suddenly on April 1–2, 2026 with no infrastructure or code changes
- It affects multiple client accounts
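For reference, a quick sketch (Python with requests; the bucket URL is a placeholder) of the S3 checks listed above:

```python
# Sanity-check the S3 object the way we did: public reachability,
# status code, and Content-Type. The URL is a placeholder.
import requests

url = "https://your-bucket.s3.amazonaws.com/path/photo.jpg"  # placeholder
head = requests.head(url, timeout=10)
print(head.status_code)                    # returns 200 for us
print(head.headers.get("Content-Type"))    # image/jpeg for us
print(head.headers.get("Content-Length"))  # sanity-check the size
```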
Observed error:
- code: 9004
- error_subcode: 2207052
- message: Only photo or video can be accepted as media type.
- error_user_title: Media download has failed. The media URI doesn't meet our requirements.
Evidence:
- Same token works with third-party URL
- Public S3 URL fails
- Sharing/Meta debugging shows fetch issues on our origin
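For completeness, a hedged sketch of the failing call, following the documented two-step Content Publishing flow (the IG user ID, token, and image URL are all placeholders):

```python
# Step 1: create a media container from an image URL; step 2: publish it.
# These are the documented /{ig-user-id}/media and /media_publish endpoints.
# IG_USER_ID, ACCESS_TOKEN, and the image URL are placeholders.
import requests

IG_USER_ID = "1784XXXXXXXXXXXX"  # placeholder
ACCESS_TOKEN = "YOUR_TOKEN"      # placeholder
GRAPH = "https://graph.facebook.com/v19.0"

create = requests.post(
    f"{GRAPH}/{IG_USER_ID}/media",
    params={
        "image_url": "https://your-bucket.s3.amazonaws.com/path/photo.jpg",
        "access_token": ACCESS_TOKEN,
    },
).json()

if "id" in create:
    publish = requests.post(
        f"{GRAPH}/{IG_USER_ID}/media_publish",
        params={"creation_id": create["id"], "access_token": ACCESS_TOKEN},
    ).json()
    print(publish)
else:
    # With our S3 URLs this is where we get code 9004 / subcode 2207052;
    # the same token with a third-party image URL succeeds.
    print(create)
```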

Same problem here.
Logs clearly show facebookexternalhit getting a 200 response. We also updated our robots.txt yesterday to explicitly allow full access to facebookexternalhit, but still no luck.


Should clarify: same issue here, but on AWS with a standard load balancer.

After testing many subdomains in the share-debugger (to get fresh robots.txts every time):
Every once in a while Facebook sends a "Range: bytes=0-512000" header, which results in a 206 Partial Content status code, and in that case the whole process works with no 403 error. All other times we send back a 200 status code, which results in failure.
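If you want to check your own setup, here is the quick test I used, as a Python sketch (requests library; swap in your own domain):

```python
# Compare how your server answers robots.txt with and without the
# Range header facebookexternalhit sometimes sends.
import requests

URL = "https://www.example.com/robots.txt"  # replace with your domain
UA = {"User-Agent": "facebookexternalhit/1.1"}

plain = requests.get(URL, headers=UA)
ranged = requests.get(URL, headers={**UA, "Range": "bytes=0-512000"})

print("no Range header :", plain.status_code)   # 200 in the failing case
print("with Range header:", ranged.status_code)  # 206 if ranges are honored
```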

Should clarify: I was observing facebookexternalhit grabbing the robots.txt. There are two scenarios:
- facebookexternalhit sets a "Range: bytes=0-512000" header while requesting robots.txt. We return a 206 status code. The URL being shared gets scraped, and everything works.
- facebookexternalhit does NOT set the Range header while requesting robots.txt. We return a 200 status code. Facebook's crawler claims it received a 403 status code.
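To make the two scenarios concrete, here is a minimal stand-alone sketch (Python standard library only; the port and robots.txt body are placeholders) of a server that answers the Range case with a 206 and the no-Range case with a 200:

```python
# A minimal server that honors the crawler's optional Range header on
# robots.txt: 206 Partial Content when Range is present, 200 otherwise.
# Port 8080 and the robots.txt body are placeholders for illustration.
from http.server import BaseHTTPRequestHandler, HTTPServer

ROBOTS = b"User-agent: facebookexternalhit\nAllow: /\n"  # placeholder body

class RobotsHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        if self.path != "/robots.txt":
            self.send_error(404)
            return
        range_header = self.headers.get("Range")  # e.g. "bytes=0-512000"
        if range_header and range_header.startswith("bytes="):
            start, _, end = range_header[len("bytes="):].partition("-")
            start = int(start or 0)
            end = min(int(end) if end else len(ROBOTS) - 1, len(ROBOTS) - 1)
            body = ROBOTS[start:end + 1]
            self.send_response(206)  # Partial Content: the case that works
            self.send_header(
                "Content-Range", f"bytes {start}-{end}/{len(ROBOTS)}"
            )
        else:
            body = ROBOTS
            self.send_response(200)  # the case where the scrape then fails
        self.send_header("Content-Type", "text/plain")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

HTTPServer(("", 8080), RobotsHandler).serve_forever()
```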
Same problem!
