fixes to BANG to tolerate badly formed XML #935

Merged · 1 commit, merged Sep 23, 2024
26 changes: 20 additions & 6 deletions server/aap/io/feeding_services/bang.py
@@ -7,6 +7,7 @@
 # For the full copyright and license information, please see the
 # AUTHORS and LICENSE files distributed with this source code, or
 # at https://www.sourcefabric.org/superdesk/license
+import logging
 
 import lxml.etree
 import requests
@@ -19,6 +20,8 @@
 MOVIES_ID = "movies_url"
 SHOWBIZ_ID = "showbiz_url"
 
+logger = logging.getLogger(__name__)
+
 
 class BangFeedingService(HTTPFeedingServiceBase):
     NAME = "Bang"
@@ -62,13 +65,24 @@ def _update(self, provider, update):
         for src in self.fields:
             current_url = provider.get("config").get(src.get("id"))
             if current_url:
+                feed_items = None
                 provider["current_id"] = src.get("id")
-                r = self.session.get(current_url)
-                r.raise_for_status()
-                xml = lxml.etree.fromstring(r.content)
-                item = parser.parse(xml, provider=provider)
-
-                items.append(item)
+                try:
+                    r = self.session.get(current_url)
+                    r.raise_for_status()
+
+                    # Set the parser to be more tolerant due to stray quotes we get in attributes at times
+                    xml_parser = lxml.etree.XMLParser(recover=True)
+                    xml = lxml.etree.fromstring(r.content, xml_parser)
+                    feed_items = parser.parse(xml, provider=provider)
+                except lxml.etree.XMLSyntaxError:
+                    logger.exception(f"Syntax error parsing {current_url}")
+                # Anything goes wrong we log it and swallow it, so one bad feed doesn't kill them all!
+                except Exception:
+                    logger.exception(f"Processing url {current_url}")
+
+                if feed_items:
+                    items.append(feed_items)
 
         if self.session:
             self.session.close()
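A minimal standalone sketch of the recovery behaviour the PR relies on: `lxml.etree.XMLParser(recover=True)` parses past markup errors such as a stray quote inside an attribute, where a strict parse would raise `XMLSyntaxError`. The sample XML below is invented for illustration and is not from the BANG feed.

```python
import lxml.etree

# Malformed sample: stray quotes inside the title attribute.
bad_xml = b'<items><item title="say "hello"">text</item></items>'

# A strict parse raises XMLSyntaxError on the stray quotes.
try:
    lxml.etree.fromstring(bad_xml)
    strict_ok = True
except lxml.etree.XMLSyntaxError:
    strict_ok = False

# A recovering parse tolerates the damage and returns a usable tree.
xml_parser = lxml.etree.XMLParser(recover=True)
root = lxml.etree.fromstring(bad_xml, xml_parser)

print(strict_ok)  # strict parse failed
print(root.tag)
```

Note that with `recover=True` the parser silently repairs or drops the broken markup, which is why the PR still keeps an `XMLSyntaxError` handler for input too damaged to recover.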
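The other half of the change is isolating failures per feed: each URL is fetched and parsed inside its own try/except, so one bad feed is logged and skipped rather than aborting the whole update. A minimal sketch of that pattern, using a hypothetical `fetch_and_parse` callable in place of the real session/parser plumbing:

```python
import logging

logger = logging.getLogger(__name__)


def collect(urls, fetch_and_parse):
    """Gather parsed items from each URL, skipping feeds that fail."""
    items = []
    for url in urls:
        feed_items = None
        try:
            feed_items = fetch_and_parse(url)
        except Exception:
            # Log and swallow, so one bad feed doesn't kill them all.
            logger.exception("Processing url %s", url)
        if feed_items:
            items.append(feed_items)
    return items


# Usage: the second URL raises, but the first and third still succeed.
def demo_fetch(url):
    if url == "b":
        raise ValueError("bad feed")
    return [url.upper()]


result = collect(["a", "b", "c"], demo_fetch)
print(result)  # [['A'], ['C']]
```

Initialising `feed_items = None` before the try block matters: if the fetch or parse fails, the `if feed_items:` guard cleanly skips the append instead of raising `NameError` or re-appending a stale value.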