
Many professionals see the word “informatics” and think one of two things. First, what is it, exactly? Second, isn’t that just computer science? While the two fields are certainly similar and often used interchangeably, they are quite different. Let’s take a deeper dive into what the field of informatics entails, how it relates to computer science and business, and why it’s worth considering for your organization.

Defining Informatics

Pinning down informatics is a bit tricky, as the term is most commonly used in reference to healthcare. In the context of medical informatics, Merriam-Webster defines it as “the collection, classification, storage, retrieval, and dissemination of recorded knowledge.” Now, we know what you’re thinking: wouldn’t that definition apply in just about any other context? Well, you’re not the first to read the definition that broadly, and it has shifted over the years to reflect this more general line of thought. Broadly speaking, informatics is the study of any system, artificial or natural, and how it shares or processes information. If we zoom out a bit with our definition, you can see how informatics...