I have a case class that looks like this:
```scala
case class RootElement(
  foo: Option[Foo],
  bar: Option[Bar],
  baz: Option[Baz]
)
```
Foo, Bar, and Baz are all large and deeply nested types. While writing a parser test, I noticed that my document was being parsed as RootElement(None, None, None). Eventually I realized that validation errors deep in the child elements were being turned into None.
I was surprised by the implementation of the default option reader, and am curious how it came about. Would you be open to an alternate implementation? I am thinking of something like the following, which would convert a ParseFailure containing only EmptyErrors into a None:
```scala
implicit def optionReader[A](implicit reader: XmlReader[A]): XmlReader[Option[A]] = XmlReader { xml =>
  reader.read(xml) match {
    case ParseSuccess(s)           => ParseSuccess(Option(s))
    case PartialParseSuccess(v, e) => PartialParseSuccess(Option(v), e)
    case ParseFailure(f) =>
      if (f.forall {
        case _: EmptyError => true
        case _             => false
      }) ParseSuccess(None)
      else ParseFailure(f)
  }
}
```
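To illustrate the intended semantics without pulling in the library, here is a minimal self-contained sketch that models the ParseResult hierarchy (the EmptyError/TypeError classes and toOption helper here are simplified stand-ins, not xtract's actual API): a failure consisting solely of EmptyErrors becomes a successful None, while any other failure is preserved instead of being swallowed.

```scala
// Hypothetical stand-ins for xtract's error and result types.
sealed trait ParseError
case class EmptyError(path: String) extends ParseError
case class TypeError(path: String) extends ParseError

sealed trait ParseResult[+A]
case class ParseSuccess[A](value: A) extends ParseResult[A]
case class PartialParseSuccess[A](value: A, errors: Seq[ParseError]) extends ParseResult[A]
case class ParseFailure(errors: Seq[ParseError]) extends ParseResult[Nothing]

// Proposed option semantics: only an all-EmptyError failure maps to None;
// real validation errors propagate rather than collapsing to None.
def toOption[A](result: ParseResult[A]): ParseResult[Option[A]] = result match {
  case ParseSuccess(s)           => ParseSuccess(Option(s))
  case PartialParseSuccess(v, e) => PartialParseSuccess(Option(v), e)
  case ParseFailure(errors) =>
    if (errors.forall(_.isInstanceOf[EmptyError])) ParseSuccess(None)
    else ParseFailure(errors)
}
```

Under these semantics, a genuinely absent element still reads as None, but a present-yet-invalid child surfaces its errors instead of disappearing.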
I realize there's a significant amount of live code using xtract, and this is almost certainly a breaking change. Perhaps it would be easier to remove the implicit modifier from the provided reader, or to provide both implementations side by side so users can choose?
Thanks
Joe