I finished creating an interface class to interact with a 3rd party gem. It is only used by a single class for now; however, establishing a good pattern early on is critical. Having a single class that interacts with the 3rd party gem allows me to create an integration test that actually makes network calls. Since it makes network calls to an external service, the integration spec only runs if you manually specify it on the command line. I do not include it in the continuous integration suite because I do not want to overload the external service. It is useful during development: after completing a feature, I can invoke the automated integration test. Right now, I have to run the commands and test manually anyway; this integration test automates that.
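As a rough sketch, the single-interface pattern might look like this in Ruby. The class and method names below are hypothetical stand-ins, not the real gem’s API; `FakeGemClient` simulates the gem so the example is self-contained:

```ruby
# Stand-in for the third-party gem's client (illustrative, not a real gem).
class FakeGemClient
  def fetch_account(id)
    { id: id, name: "Acme" } # stands in for a real network call
  end
end

# The one class allowed to talk to the gem; everything else goes through it.
class ExternalServiceAdapter
  def initialize(client: FakeGemClient.new)
    @client = client # injected so specs can swap in a test double
  end

  def account(id)
    @client.fetch_account(id)
  end
end

adapter = ExternalServiceAdapter.new
puts adapter.account(42)[:name] # => Acme
```

For the command-line gating, RSpec supports tags: mark the network-calling spec with a tag such as `:external`, exclude it by default in the spec configuration, and run it explicitly with `rspec --tag external`.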
One issue I noticed is that proper encapsulation is hard. I want ALL interaction with the 3rd party gem to go through this interface class, but it is hard to enforce that fully. For example, the interface makes a request to an external service and returns the result as an object. The class using the interface now has a handle on that object and may invoke method calls on it or modify it. I need to freeze the returned object so the caller cannot modify it (freezing prevents mutation, though it does not stop read-only method calls).
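A minimal sketch of the freeze approach, assuming plain Ruby objects (the `Account` struct and its fields are illustrative). `Object#freeze` raises a `FrozenError` on any mutation attempt, while readers still work:

```ruby
# Illustrative value object; the real interface would return its own type.
Account = Struct.new(:name, :plan)

def fetch_account
  Account.new("Acme", "pro").freeze # freeze before handing it to the caller
end

account = fetch_account
puts account.name # reading still works => Acme

begin
  account.plan = "free" # mutation attempt on a frozen object
rescue FrozenError
  puts "mutation blocked"
end
```

This catches accidental writes at runtime; callers can still read the data, which is usually what you want from a response object.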
A 3rd party integration spec is useful because it lets you run a suite of automated tests when you finish a feature.
Proper encapsulation of the interface class is difficult.
What surprised me is that Looker is not on the graph. “Looker is a business intelligence software and big data analytics platform that helps you explore, analyze and share real-time business analytics easily.”
ELT is also not represented in the graph. ELT stands for Extract, Load, Transform: instead of having a data engineering team extract the data, transform it, and load it into a data warehouse, the team extracts and loads the raw data into the warehouse. Then, an analytics tool such as Looker can transform the data on the fly and analyze it straight from the data warehouse. I deal with complex data transformations, so I am most curious to hear how Looker or other tools handle complex transformations.
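The ELT idea can be shown with a toy Ruby example, using an in-memory array as a stand-in for the data warehouse (the data and field names are made up). The raw rows are loaded untouched; the transformation happens at query time:

```ruby
# "Extract": raw events as they arrive from the source system.
raw_events = [
  { ts: "2017-01-01T10:00:00Z", amount_cents: 1250 },
  { ts: "2017-01-02T11:30:00Z", amount_cents: 300 },
]

# "Load": store the raw data as-is (no transformation step here).
warehouse = raw_events.dup

# "Transform" on the fly at analysis time, the way a tool like Looker
# would run the transformation inside the warehouse:
daily_totals = warehouse
  .group_by { |e| e[:ts][0, 10] }                # bucket by date
  .transform_values { |rows| rows.sum { |r| r[:amount_cents] } / 100.0 }

puts daily_totals # => {"2017-01-01"=>12.5, "2017-01-02"=>3.0}
```

The upside is that the raw data stays available for new transformations later; the open question for me is how well this scales once the transformations get genuinely complex.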
Finally, with so many tools and terminologies available, it is crucial to know when to use what. Factors include data size, team skills, security, infrastructure, etc. There are so many exciting developments in this field!
Generally speaking, data science is a way of extracting value and insights from data using the powers of computer science and statistics applied to a specific field of study – Business over Broadway
I studied Biomedical Engineering and I work with data for my research. What makes data science different is the increased size of the data sets and the variable structure of that data. Today, we have data from servers, logs, mobile, IoT, and 3rd party integrations such as Salesforce, chat, Zendesk, etc. To complicate matters, some of the data are not structured.
I’m still not sure how you can analyze data that are not structured. The data I have worked with have all been structured. Sometimes, I need to transform that data so it can be easily analyzed.
The sheer volume of recorded data is called Big Data since there is a lot of it; hence, Big. Better technology enables us to store and process Big Data. Businesses want to understand and analyze this massive data to gain competitive advantage.
At my work, security is another factor. Massive amounts of data are collected, and security measures have to be taken so analysis does not reveal sensitive information. The article did not mention security, but it should be taken seriously in any organization working with sensitive data.
I am surprised to see IBM listed as the leader in Data Science Platforms. Living in SF, I thought it would be Google or Amazon. Both have the expertise in house, so they should be able to monetize it. Perhaps it is more valuable to apply that expertise to their core businesses than to provide services to other companies.
My interest in Big Data is automation: how can someone take all this data and generate something useful with the least amount of resources? That’s where technology and computer science come in.
I started looking for practical data-driven sites to follow. It’s tough. There are ‘professional bloggers’ who use buzzwords such as big data, machine learning, and AI to generate content. When I read that content, I’m nodding my head in agreement, but it lacks practical advice. It’s good for water cooler talk; nothing else.
There are some potentially good ones, and I am listing them below. I have not fully vetted them, but they show promise. Let me know your thoughts on these sites and whether you have any good ones to share.