Categories
Uncategorised

VTiger vs Odoo

Open Source CRMs for 2020


Two Open Source Customer Relationship Management (CRM) Systems to consider in 2020

Over the years I’ve had the privilege of working with (and customizing) numerous CRM systems for a variety of businesses. A client recently asked me which I would currently recommend, so I thought it was high time I reviewed the pros and cons of two of my favorites and laid them out for my clients and affiliates.

Both packages are ‘Open Source’, meaning the ‘community versions’ of their source code are freely available for anyone to use and modify. However, both companies also provide the software as a service (SaaS), encompassing hosting and support on their own servers for a monthly or yearly subscription. Another revenue stream for both is additional paid modules that add functionality to the base systems.

Both packages therefore run remotely on a server (in the cloud), with your data and files stored in remote databases and distributed storage clusters. Users access the software via their browser or mobile devices.

Realistically, the target audience for both systems is small to medium sized businesses (SME/SMB) with between 5 and 50 users, although Odoo is especially capable of scaling well.

As it stands at the end of 2019, Odoo and VTiger are rated fairly similarly on various rating websites.
Comparing the specifications of a 2018 iPad with its 2019 counterpart is one thing, but in my opinion comparison websites that attempt to place complex software side by side end up comparing apples with pears, and the results can often be misleading.

External Reviews

 
TrustRadius.com

https://www.trustradius.com/compare-products/odoo-vs-vtiger

Odoo  (8.1 / 10)

VTiger (7.2 / 10)


SoftwareAdvice.com

https://www.softwareadvice.com/crm/odoo-profile/vs/vtiger/

Odoo  (4.23 / 5)

VTiger (4.35 / 5)

User Interface

Personally, I prefer the user interface of Odoo: it feels more modern, and certain features such as the Kanban pipeline work really well, blending simplicity with just the right level of functionality for most users.

Development Potential

Extending the system for client needs

As a software developer, I was originally swayed towards VTiger by its programming language (PHP) rather than Odoo’s (Python). This was a simple business decision based on the availability of PHP developers and on my own skill set and experience in PHP, although in recent years I would say that is less of a concern in the market.
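To give a flavour of what extending each system involves, in Odoo a client-specific field is typically added through a small Python addon that inherits an existing model. The sketch below is illustrative only; the field name and choices are hypothetical, and it assumes a reasonably recent Odoo version (12 or later) where addons import from the odoo package and are loaded from the server’s addons path.

# models/res_partner.py -- a minimal, hypothetical Odoo addon sketch.
# Only the model file is shown; a real addon also needs a manifest.
from odoo import fields, models


class ResPartner(models.Model):
    # Inherit the built-in contact model rather than replacing it,
    # so the extra field appears alongside the standard ones.
    _inherit = "res.partner"

    # A client-specific field; the name and options are made up for illustration.
    preferred_channel = fields.Selection(
        [("email", "Email"), ("phone", "Phone"), ("wechat", "WeChat")],
        string="Preferred Contact Channel",
    )

A comparable customization in VTiger would normally be written in PHP against its module framework; the point is simply that both systems are designed to be extended rather than used purely out of the box.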


The nature of the product being sold will drive how the system is used. Certain clients have a real interest in the reporting functionality and expect to generate a great deal of management information from sales figures; both Odoo and VTiger boast excellent reporting tools and allow filters to be applied to the sales data across a wide range of dimensions. Other clients are interested in linking their website to the CRM and creating a pipeline of leads from external inbound marketing platforms.
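As a concrete illustration of the website-to-CRM link, Odoo exposes an XML-RPC external API through which a website form handler can push new leads straight into the pipeline. The snippet below is a minimal sketch; the URL, database name, credentials and lead details are placeholders for whatever your own instance uses, and it assumes the CRM app is installed.

# Hypothetical example: pushing a website enquiry into Odoo's CRM via XML-RPC.
# The URL, database and credentials below are placeholders.
import xmlrpc.client

URL = "https://crm.example.com"
DB, USER, PASSWORD = "example-db", "api@example.com", "secret"

# Authenticate against the instance and obtain a user id.
common = xmlrpc.client.ServerProxy(f"{URL}/xmlrpc/2/common")
uid = common.authenticate(DB, USER, PASSWORD, {})

# Create a lead record; it will appear at the top of the sales pipeline.
models = xmlrpc.client.ServerProxy(f"{URL}/xmlrpc/2/object")
lead_id = models.execute_kw(DB, uid, PASSWORD, "crm.lead", "create", [{
    "name": "Website enquiry: pricing",
    "contact_name": "Jane Doe",
    "email_from": "jane.doe@example.com",
}])
print("Created lead", lead_id)

VTiger offers a comparable web services API, so either system can sit behind a website contact form or an inbound marketing platform.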

Sharing data and collaborating with colleagues is important in businesses that have larger sales teams or more hierarchical internal staffing structures.


The end result will vary for each client.  There really is no such thing as a standard CRM system – every company will have a different set of priorities and the final implementation will function very differently. 

Categories
Corporate Change

Leading the Charge

Rather than leading the charge with IT systems, large global corporates have for many years been falling further and further behind.

Photo by Pixabay from Pexels

This is not just in contrast to tech startups and their ability to compete in developing cutting edge technology. What I’m referring to here is the corporate’s increasing inability to support, modify or even understand the legacy technologies it has itself built over the last two decades. Every aspect of IT is developing on exponential time frames, which demands ever faster responses, and yet we sit at a juncture of conflicting trends that render the corporate ever more useless in a contemporary environment.

Yes, IT systems have been present in businesses for many decades, but their role has only gradually become mission critical. Systems that were trusted to run entire global organisations at the turn of the century are now legacy systems and, guess what, the people who built the technology are long gone. This might seem extreme, but the corporate memory is indeed getting shorter as employee turnover becomes ever more rapid.

Today, audacious IT initiatives within large organisations fail at an incredibly high rate.

https://www.consultancy.uk/news/18243/lidl-cancels-sap-introduction-having-sunk-500-million-into-it

The interesting question here is why corporates were so successful at building systems 20 years ago but not today, and I believe the reason is simple: corporates didn’t build software, motivated people did. In those days company software initiatives were far less audacious. No one set out to build software that ran the organisation; systems grew organically to fill this role, and it happened over many years. Again, I would reiterate that in most cases this was not conducted under the watchful initiative of the corporate brain but by innovative individuals.


Today’s corporates feel the pressure to deliver new software and attempt to replace these systems through large scale IT projects orchestrated at a global level (in contrast to the maverick developer).
However, you cannot simply demand a mighty oak; you must grow it from a seed, and you might as well forget it if you don’t really understand what a tree is.

Categories
Corporate Change

Guardians of Logic

When I first began writing this article I considered the title “Corporate Memory Loss”, but in the spirit of positivity I decided a more constructive slant was in order. So what exactly is the issue I’m referring to?

Article by Simon Challinor
Photo by Chinmay Singh from Pexels

Corporate Memory Loss

Society has changed, and today’s employees have a progressively different view of employment than previous generations. It’s common for employees to want to change roles and companies far more frequently, whether motivated by new challenges, more money or a prestigious new job title on LinkedIn.

From the perspective of a business this presents numerous challenges. The cost of repeatedly hiring and re-hiring for the same positions, coupled with repeated re-training, is an obvious financial burden, but perhaps the more pressing issues relate to knowledge transfer and business continuity.

In severe cases companies face major knowledge loss with each generation, causing valuable systems to become legacy prematurely. We see the following characteristics:

  • A constant need to “reinvent the wheel”
  • Software becoming legacy too quickly
  • Recreating the same systems (software or infrastructure) in new technologies without full justification
  • Repetition in “discovering” or “re-discovering” business logic
  • Key processes being forgotten or ignored
  • Constant “Fire-Fighting”/ Troubleshooting of older systems

If these things seem familiar, your company may be suffering from Corporate Amnesia.

However, this isn’t a new phenomenon; the software industry has been (or should be) aware of the problem for decades, but the big banks, insurance companies, supermarkets … they aren’t software companies. If you want to know how the Netscape browser finally came to an end, there’s an interesting read from Joel Spolsky written all the way back in 2000:

The idea that new code is better than old is patently absurd. Old code has been used. It has been tested. Lots of bugs have been found, and they’ve been fixed. There’s nothing wrong with it. It doesn’t acquire bugs just by sitting around on your hard drive. Au contraire, baby! Is software supposed to be like an old Dodge Dart, that rusts just sitting in the garage? Is software like a teddy bear that’s kind of gross if it’s not made out of all new material?

APRIL 6, 2000 by JOEL SPOLSKY

TDR-encompass focuses on helping companies retain business logic by using technology wisely, documenting processes and avoiding premature legacy.

Categories
Infrastructure

Virtual Computers

What is a virtual computer and what role does it play?


Virtual hardware utilizes a collection of computing resources such as the memory, processor and storage of a real physical computer. Such a bundle is often referred to as a virtual machine (VM), and many of them can run on the same physical computer, sharing its resources.

Article by Simon Challinor
Photo by Bradley Hook from Pexels

Each VM sees itself as a separate, isolated computer with its own operating system and file system, and potentially its own IP address on a network or the Internet.

Whilst VMs are a relatively mature area of computing, in recent years a new type of technology, “containers”, which uses similar concepts to VMs, has arrived. Docker is one such example.
Historically VMs have been quite heavy in terms of their resource requirements, but with the introduction of Docker containers we can run many of these virtual computers on the same server or even on your MacBook.

What does that really mean for us?

An example would be running two or three other operating systems (perhaps different versions of Linux) simultaneously inside macOS on your laptop. Each container is a separate, isolated computer environment, and each container installs just the software it needs for its own task. One container may be a web server designed to serve web pages, one may be optimised to run a particular database, and one may be needed to run regular background processes such as backups.
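As a rough sketch of what this looks like in practice, the Python Docker SDK (the docker package) can start those three containers from a few lines of code. The image names, ports and passwords here are illustrative only, and the sketch assumes Docker is installed and running locally.

# Illustrative sketch using the Python Docker SDK (pip install docker).
# Image names, ports and passwords are examples only.
import docker

client = docker.from_env()  # talk to the local Docker daemon

# 1. A web server container, exposing its port 80 on the host's port 8080.
web = client.containers.run("nginx:alpine", detach=True,
                            name="demo-web", ports={"80/tcp": 8080})

# 2. A database container with its own isolated filesystem and configuration.
db = client.containers.run("postgres:12-alpine", detach=True,
                           name="demo-db",
                           environment={"POSTGRES_PASSWORD": "example"})

# 3. A short-lived container for a background job, removed when it finishes.
client.containers.run("alpine:3.11", "echo 'nightly backup placeholder'",
                      remove=True)

# Each container reports itself as a separate, machine-like environment.
for container in client.containers.list():
    print(container.name, container.status)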

Why can’t we do that all on just one computer? 

This can of course all be done on a single, regular (physical) computer, but the advantage of virtual containers is that each one is a virtual snapshot of a computer (stored as a binary file) that can be started, stopped or moved to another computer as and when we need it. In other words, the computer and all of its setup can be running (or not) anywhere we wish.

The entire concept of a computer is now abstract and separate from any one physical device.

The physical hardware is not a limitation. At 12pm it’s running on my MacBook Pro in Hong Kong, but five minutes later it can be running somewhere in the cloud on a physical server based in London. Later we may need more power, so three new instances of the same computer could be spawned in other locations around the world.

When the virtual computer is not running, the sum total of its existence is effectively just binary (zeros and ones), so it can be copied, transferred and restarted elsewhere.
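To make that concrete, the same Python SDK can serialise an image to a plain file of bytes and load it again elsewhere. Again, this is just a sketch: the image name is an example, and it assumes the docker package and a running Docker daemon on both machines.

# Sketch: turning a container image into "just zeros and ones" and back again.
import docker

client = docker.from_env()

# Serialise an example image to a tar archive on disk.
image = client.images.get("nginx:alpine")
with open("nginx-snapshot.tar", "wb") as archive:
    for chunk in image.save(named=True):
        archive.write(chunk)

# On another machine (or later on this one), the same bytes can be loaded
# and an identical container started from them.
with open("nginx-snapshot.tar", "rb") as archive:
    loaded_images = client.images.load(archive.read())
print("Restored:", [img.tags for img in loaded_images])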

We can be confident that there will be absolutely no difference in the way the container operates.
The concept of a computer has itself become virtual in nature, independent of and indifferent to the physical memory or CPUs it uses. Hardware is disposable and abstracted away from the logic and purpose of the computer.