All Comments
TopTalkedBooks posted at August 20, 2017

This is always a struggle for me -- do I defy proper object-oriented design (e.g. option 1) or do I use an implementation that seems counterintuitive to the real world (e.g. option 2)?

Reality may be a good starting point for molding or evolving a design, but it is always a mistake to model an OO design directly on reality.

OO design is about interfaces, the objects that implement them, and the interaction between those objects (the messages they pass between them). Interfaces are contractual agreements between two components, modules, or software sub-systems. There are many qualities to an OO design but the most important quality to me is substitution. If I have an interface then the implementing code better adhere to it. But more importantly, if the implementation is swapped then the new implementation better adhere to it. Lastly, if the implementation is meant to be polymorphic then the various strategies and states of the polymorphic implementation better adhere to it.

Example 1

In mathematics a square is a rectangle. Sounds like a good idea to inherit class Square from class Rectangle. You do it and it leads to ruin. Why? Because the client's expectation or belief was violated. Width and height can vary independently, but Square violates that contract. I had a rectangle of dimension (10, 10) and I set the width to 20. Now I think I have a rectangle of dimension (20, 10), but the actual instance is a Square instance with dimensions (20, 20), and I, the client, am in for a real big surprise. So now we have a violation of the Principle of Least Surprise.

Now you have buggy behavior, which leads to client code becoming complex as if statements are needed to work around the buggy behavior. You may also find your client code requiring RTTI to work around it by testing for concrete types (I have a reference to Rectangle but I have to check whether it is really a Square instance).
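As a minimal sketch of the trap (in Python, with made-up class and method names, not code from the original question):

    class Rectangle:
        def __init__(self, width, height):
            self._width = width
            self._height = height

        def set_width(self, width):
            self._width = width

        def set_height(self, height):
            self._height = height

        def area(self):
            return self._width * self._height


    class Square(Rectangle):
        """Keeps width == height, silently breaking the client's contract."""

        def set_width(self, width):
            self._width = width
            self._height = width  # surprise: height changes too

        def set_height(self, height):
            self._width = height
            self._height = height


    def stretch(rect: Rectangle):
        # The client believes width and height vary independently.
        rect.set_width(20)  # expects (20, 10) afterwards
        assert rect.area() == 200, f"expected 200, got {rect.area()}"


    stretch(Rectangle(10, 10))  # fine
    stretch(Square(10, 10))     # AssertionError: expected 200, got 400

The client code compiles and reads correctly against the Rectangle interface; it only breaks when a Square is substituted in, which is exactly the substitution failure described above.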

Example 2

In real life animals can be carnivores or herbivores. In real life meat and vegetables are food types. So you might think it is a good idea to have class Animal as a parent class for different animal types. You also think it is a good idea to have a FoodType parent class for class Meat and class Vegetable. Finally, you have class Animal sport a method called eat(), which accepts a FoodType as a formal argument.

Everything compiles, passes static analysis, and links. You run your program. What happens at runtime when a subtype of Animal, say a herbivore, receives a FoodType that is an instance of the Meat class? Welcome to the world of covariance and contravariance. This is a problem for many programming languages. It's also an interesting and challenging problem for language designers.
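A sketch of that situation (again with hypothetical names): the eat() signature promises to accept any FoodType, so everything looks fine statically, and the mismatch only surfaces when the program runs.

    class FoodType: pass
    class Meat(FoodType): pass
    class Vegetable(FoodType): pass


    class Animal:
        def eat(self, food: FoodType):
            raise NotImplementedError


    class Herbivore(Animal):
        def eat(self, food: FoodType):
            # The signature promises any FoodType, but only Vegetable is safe.
            if not isinstance(food, Vegetable):
                raise TypeError("a herbivore cannot eat " + type(food).__name__)
            print("munching", type(food).__name__)


    def feed(animal: Animal, food: FoodType):
        animal.eat(food)  # looks correct everywhere, fails only at runtime


    feed(Herbivore(), Vegetable())  # fine
    feed(Herbivore(), Meat())       # TypeError, discovered only at runtime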

In Conclusion...

So what do you do? You start with your problem domain, your user stories, your use cases, and your requirements. Let them drive the design. Let them help you discover the entities you need to model into classes and interfaces. When you do, you'll find that the end result isn't based on reality.

Check out Analysis Patterns by Martin Fowler. In there you'll see what drives his object-oriented designs. They are based mainly on how his clients (medical people, financial people, etc.) perform their daily tasks. That overlaps with reality, but it isn't based on or driven by reality.

TopTalkedBooks posted at August 20, 2017

From what I understand after our exchange, I would model this (at the DB level) as follows:

Every Item record has a "valid-from" field expressed as a date. The value may be null (this would indicate that the specific Item has not yet been validated by a user and is therefore "pending").

At any given point T in time, the "snapshot" is identified by querying the items with max(valid-from) <= current-date.
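As an illustrative sketch of that snapshot rule (the data, field names, and function below are assumptions for the example, not a prescribed schema), the selection amounts to keeping, per item, the record with the latest valid-from that is not after the reference date:

    from datetime import date

    # Hypothetical in-memory items: (item_id, value, valid_from).
    # valid_from is None while the item is still "pending".
    items = [
        ("A", "old value", date(2011, 5, 1)),
        ("A", "new value", date(2011, 6, 7)),
        ("B", "only value", date(2011, 5, 1)),
        ("C", "not validated yet", None),
    ]

    def snapshot(as_of: date):
        """For each item, keep the record with the latest valid_from <= as_of."""
        current = {}
        for item_id, value, valid_from in items:
            if valid_from is None or valid_from > as_of:
                continue  # pending, or not yet effective at this date
            best = current.get(item_id)
            if best is None or valid_from > best[1]:
                current[item_id] = (value, valid_from)
        return {item_id: v[0] for item_id, v in current.items()}

    print(snapshot(date(2011, 6, 1)))  # {'A': 'old value', 'B': 'only value'}
    print(snapshot(date(2011, 7, 1)))  # {'A': 'new value', 'B': 'only value'}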

If you want to maintain the concept of a "release" (maybe the actual release date is meaningful for your specific scenario, or you just want to reduce the possibility of wrong inputs), you could create a releases table with the following structure:

Release Name | Release date
-------------|-------------
Labor Day    | 2011-05-01
Summer Camp  | 2011-06-07
2012 preview | 2011-12-10

So the user will be able to refer to a given "release" in terms of some event or codename (while still being able to see the actual date, of course) and the corresponding date will be copied to the "valid-from" field of the validated item.

Personally I'd prefer this approach to having release names or codes at the item level, but of course my solution may be a problem if you have two distinct "releases" on the same date and later decide to change one of the two (e.g. postpone it).

You asked for a pattern, but I am not aware of a pattern for this kind of situation. Maybe you can find something in Fowler's works (I don't remember anything for this in his book, though - check the site, too).

Rechecked his site after writing this - maybe this chapter could be of some interest.

TopTalkedBooks posted at August 20, 2017

Martin Fowler has a very well-thought-out model for measurements, conversions, and the like in Analysis Patterns. It is worth reviewing. I believe he recommended a Conversion Ratio object that would handle converting from one unit to another.
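As a rough sketch of that idea (not Fowler's exact model; the class and unit names here are made up), a Conversion Ratio can be as small as a named factor between two units:

    class ConversionRatio:
        """A factor that converts an amount from one unit to another."""

        def __init__(self, from_unit: str, to_unit: str, factor: float):
            self.from_unit = from_unit
            self.to_unit = to_unit
            self.factor = factor

        def convert(self, amount: float) -> float:
            return amount * self.factor

        def inverse(self) -> "ConversionRatio":
            # Converting back is just the reciprocal factor.
            return ConversionRatio(self.to_unit, self.from_unit, 1 / self.factor)


    cm_per_inch = ConversionRatio("inch", "cm", 2.54)
    print(cm_per_inch.convert(10))              # 25.4
    print(cm_per_inch.inverse().convert(25.4))  # ~10.0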
