Learning - True Sales Results
https://truesalesresults.com

Continuous Sales Learning
https://truesalesresults.com/continuous-sales-learning/ (Thu, 22 Jun 2023)

What is Continuous Sales Learning, and why is it important? After 30+ years in Technology Sales, there is an opus piece of work that I still need to complete. It will be the legacy I leave behind for the Complex B2B SaaS Sales community at large, including my two young adult children, who have decided to follow in my crazy career footsteps as Software Sales professionals.

You’re not supposed to have favorite customers as a Consultant, but I’d be lying if I did not admit that I absolutely do. Two of my all-time favorite customers share something in common: they are both perpetual students of the game (i.e., the Complex B2B Sales profession), yet both are world-class masters of the game from a leadership and execution perspective.

That is extremely rare: to be so accomplished at generating massive amounts of revenue and shareholder/investor wealth over the course of their respective careers, yet still retain that elusive humility to acknowledge there is always room to learn new and better ways and to improve their execution as executive (C-Suite) leaders in Technology companies.

Our shared professional passion is building and leading a Continuous Sales Learning culture in their sales organizations. We have collaborated on designing and building a Continuous Sales Learning platform and program to deliver this. This goes beyond sales enablement and an onboarding program, and it is different from just experiential Sales Bootcamp Training and great Sales Playbooks.

This is all about institutionalizing Continuous Sales Learning into the very fabric of your sales culture. This is the foundational pillar to building and scaling a world class, high performing sales organization that attracts, develops and retains the top sales and sales leadership talent out there. Sales becomes your distinct competitive advantage.

This is the first in a series of posts that I’ll be writing on this topic that is so near and dear to me. Imagine a world where any one of your sales team members, at any time, could respond to a bot prompt asking, “What do you want to learn today?” How impactful would that be?

Lessons in Competitive Positioning
https://truesalesresults.com/lessons-in-competitive-positioning/ (Thu, 13 Apr 2023)

Many years ago, I learned an absolutely invaluable lesson in competition. I was employee #10 at a Boston-based software company that developed the first Applicant Tracking System (ATS) to scan resumes and create a digital, searchable version of each one. The company was Restrac.

This allowed recruiters and hiring managers to build searches against their open job requisitions to find the best candidates based on the skill keywords, educational background, job titles and experience in the digital resumes.

Our company co-founder was the guy who pioneered the concept of Contract Recruiting. He built out a contract recruiting agency across the US, staffed with contract recruiters who had specialized experience (i.e., in all the various subcategories of engineering, such as electrical, mechanical, civil, software development, etc.).

They won multiple multi-year contract recruiting deals with the big defense contractors, who had landed huge US defense contracts and needed to staff up with thousands of specialized engineers in a very short period of time. They were dealing with hundreds of thousands of resumes per year, per contract customer.

Our founder thought there had to be a better way than manually coding each resume for skill words, educational background and job experience in a homegrown database, which was considered cutting edge at the time.

Being first to market does have its advantages, and we grew fast over the first couple of years competing against some low-end competitors. Then a new, formidable competitor arrived on the scene: a Silicon Valley company backed by Kleiner Perkins.

Their founder was a technologist who, as an engineering hiring manager, had found the entire recruiting process highly manual and grossly inefficient. They developed an algorithm that automatically matched scanned digital resumes against open requisitions. The company was Resumix.

They touted their “Patent Pending Artificial Intelligence”. Remember, this was back in the mid-1990s, when AI was not even a commonly used term. Their value prop and competitive positioning against us was speed: they could scan resumes faster than we could, and they could automatically match the best resumes against an open requisition faster than we could.

We would battle head-to-head with Resumix in all of the enterprise deals with F1000 companies that were receiving hundreds of thousands of resumes per year and filling thousands of highly technical jobs per year. We were really the only two options for these enterprise customers to consider and use.

Both companies were growing and prospering, and I was learning how to sell enterprise-class software deals. Then Resumix started to beat us. They were throwing down the gauntlet, challenging us to head-to-head bake-offs at competitive customers’ headquarters. They did an excellent job of influencing the decision criteria (the Decision Criteria in MEDDIC) and convincing customers that speed and time-to-fill were the most important factors they needed to consider.

For the first time, we were losing to Resumix more than we were beating them in the largest competitive enterprise deals. It was painful, and I’ve always been a really sore loser. We needed to find a way to turn the tables and seed/influence the enterprise deal decision criteria toward our solution’s unique differentiator. I learned how to crack that code through a brutally competitive sales cycle against Resumix at Intel.

This was your classic enterprise sales cycle: over a year in length, with an RFP that was ridiculously onerous and long. Intel is an engineering-centric organization, so suffice it to say this was an excruciatingly detail-oriented, data-driven evaluation and decision process. It all culminated in a two-week, head-to-head bake-off at Intel’s corporate headquarters in Santa Clara, CA.

The rules and scoring rubric document given to both vendors was quite explicit and long. Intel wanted the bake-off evaluation to be 100% objective and to provide a level playing field. Neither vendor’s sales team was allowed to touch the computer keyboards the Intel HR employees were using during the bake-off to simulate how they would do their jobs in each of our software systems.

Intel would scan the same resumes into both software systems. Then the Intel recruiters would perform searches and matches against their most difficult-to-fill engineering job requisitions. Finally, Intel would compare the search/matching results against the short list of candidates they had actually interviewed for those jobs and the people they had actually hired.

Needless to say, we decisively won the bake-off against Resumix. What we learned is that search accuracy and the quality of search results were far more important than speed, and our search engine did a much better job of finding the new and esoteric skills they wanted to hire for in their next-generation semiconductor engineering jobs.

This exposed the fatal flaw in Resumix’s proprietary algorithm and so-called competitive advantage. We indexed, and were able to search, the entire text of the scanned digital resume, whereas Resumix had a black-box skill lexicon table that they used to tag and match resumes against open job requisitions.

What we learned is that the Resumix approach was fundamentally flawed: if a skill word on a resume did not exist in their black-box lexicon at the time the resume was scanned into the system, there was no way to retrieve or match on it, because in the interest of speed they only searched/matched against a small subset of the entire resume.
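
As a minimal sketch of the difference, here are the two approaches side by side in Python. The resumes, names and lexicon here are entirely hypothetical illustrations, not Restrac’s or Resumix’s actual code:

```python
def full_text_index(resumes):
    """Map every word in every resume to the set of resumes containing it."""
    index = {}
    for name, text in resumes.items():
        for word in set(text.lower().split()):
            index.setdefault(word, set()).add(name)
    return index

# A fixed skill table, frozen at scan time (the "black box" lexicon approach).
LEXICON = {"java", "sql", "c++"}

def lexicon_index(resumes):
    """Tag resumes only with words that already exist in the lexicon."""
    index = {}
    for name, text in resumes.items():
        for word in set(text.lower().split()) & LEXICON:
            index.setdefault(word, set()).add(name)
    return index

resumes = {"alice": "java sql verilog", "bob": "c++ sql"}

full = full_text_index(resumes)
lex = lexicon_index(resumes)

# "verilog" was not in the lexicon when Alice's resume was scanned, so the
# lexicon-based system can never find her by that skill; full-text can.
print(full.get("verilog", set()))  # {'alice'}
print(lex.get("verilog", set()))   # set()
```

The design point is the one the bake-off exposed: a full-text index can retrieve on any term that ever appeared in a document, while a fixed lexicon silently discards everything it did not anticipate at scan time.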

We now had a competitive blueprint for success against Resumix. We became experts at seeding/influencing the decision criteria on all of the enterprise deals over the next few years. We would ask questions such as:

  • “Do you ever anticipate having a need to search for skill words on a resume that were not considered important at the time it was scanned?”
  • “How important is the ability to search for any keyword on a resume at any time and be able to find that candidate?”
  • “Do you know, at the point of scanning in resumes, all of the keywords that you might use to search for and retrieve that resume at a later date?”
  • “How would that impact your confidence that you are getting the best, most accurate resume search results?”
  • “Is search accuracy more important than how fast you get search results?”

This culminated in us crushing Resumix in head-to-head bake-offs and ultimately allowed us to IPO successfully.

Good selling!

Flawed Lecturers (aka: Beware the Charlatan)
https://truesalesresults.com/flawed-lecturers-aka-beware-the-charlatan/ (Wed, 21 Aug 2019)

The great thing about Social Media is that everyone can easily publish their thoughts and opinions. And the worst thing about Social Media is that everyone can easily publish their thoughts and opinions. Anyone can now conduct “research” and publish their “findings” on the internet. It’s ubiquitous for the masses. Yahoo!

Open disclaimer: I’m a research nerd. I absolutely love it and always have. I consider myself a perpetual student of “the game”. In my case, “the game” is complex B2B technology sales. At the risk of dating myself, we used to refer to it as good old-fashioned enterprise selling: selling high-priced software solutions to a group of stakeholders or a buying committee in a large enterprise company.

What is the intersection of my love for research and learning with the fact that anyone can publish anything, at any time, from anywhere? Simply put, there is a new class of lecturers out there on the internet and Social Media. Millions, if not tens of millions, of them. And it’s growing every day. They are the self-proclaimed thought leaders. They heap unwarranted praise on each other and refer to themselves as “rock stars”.

They tag scores of their rock star friends in every post and invite their opinions. They hashtag the shit out of Silicon Valley’s obnoxious acronym and buzzword vernacular. But that’s not what annoys me the most; that’s just mildly irritating. What actually pisses me off is that they are factually wrong. The insights, advice and research findings they share are actually subjective opinions. They are unproven hypotheses. And more often than not, they are simply wrong.

This takes me back to my college days at Northeastern University in Boston, Massachusetts, where I learned Statistics I and Statistics II under the great Professor Chakraborty. I’m not quite sure how he did it, but Professor Chakraborty made learning Statistics fun. He used to tell us that for any research study’s findings to be considered statistically significant, they had to pass specific mathematical criteria.

For those inclined to geek out on this, it’s called statistical hypothesis testing, and it works as follows: a statistical hypothesis test determines whether the result observed in a data set is statistically significant. The test produces a p-value, representing the probability that random chance alone could explain the result. In general, a p-value of 5% or lower is considered statistically significant.

The sample size and target population used in the research directly determine whether your findings are statistically significant or merely a hypothesis. That was a long-winded way of getting to the crux of what I mean by a “flawed lecturer”. A flawed lecturer either does not understand the math behind statistics or simply ignores it, and publishes their hypotheses as facts.

And whenever there is an underlying error in your formula, any answer that comes out of it is inherently flawed. My admonition to everyone: challenge the underlying math in any research findings published on the web or Social Media. Question the author and validate that they used proper math in determining the sample size and target population used to test their hypothesis.

I worked for a Marketing Research firm in Boston for a couple of years while in college. We ran extensive focus groups and conducted large research studies for big corporations that sold to consumers. What I learned is that it is quite easy to reverse engineer a survey result or outcome that your customer wants to see. In fact, it is quite common. When you see a TV commercial saying that 4 out of 5 dentists recommend a particular brand of toothpaste, do you ask yourself how many dentists were surveyed? Do you ask whether the dentists surveyed were given free samples to hand out to their patients by that same toothpaste manufacturer? I could go on all day here, but you probably get the point by now.

I don’t believe in “trolling” or publicly shaming anyone, either in person or hiding like a coward behind anonymous web trolling. So allow me to share a generic example of flawed lecturers. There are a number of sales technology tool companies out there that cite “data findings” from their platforms as factual. One such blog post cited as fact that “sales discovery” can be counter-productive in certain situations. It went on to say that, based on hundreds of thousands of sales calls analyzed, there was an optimal number of discovery questions that resulted in a successful sales call outcome. Additional color commentary stated as fact that on unsuccessful sales calls, reps asked twice as many discovery questions as on successful ones.

Where is the proverbial fly in the ointment here? No one is challenging the sample size or the sample population. Because of the scale of the data (hundreds of thousands of calls), people simply accept the findings as facts. Now let me pick apart this flawed hypothesis. The reality is that their tool is almost exclusively used by very inexperienced sales reps (junior inside sales reps). In fact, many of their users are brand new to sales (Sales Development Reps (SDRs) or Business Development Reps (BDRs)).

People who are new to sales, or very inexperienced at it, have not had much sales training, by definition. Inexperienced sales reps commonly struggle to formulate and ask good discovery questions. As a result, they also tend to overcompensate by asking too many discovery questions, or they simply ask bad ones that lead to unsuccessful sales call outcomes. It’s easy to turn off a customer by asking what they consider to be a stupid question.

Customers become offended when they realize, through your discovery questioning, that you didn’t do your research prior to the sales call. A rep who didn’t even bother to review the customer’s web site to learn what they do ends up asking too many discovery questions, which reveals that lack of preparation to the customer and results in an unsuccessful sales call outcome.

Are you sensing a trend here? All of their data findings presented as facts can be attributed to other inherent factors in their sample population. The hypothesis should be about the right sales discovery questions to ask, not the right number. What correlation does the research and preparation a sales rep does prior to the call have with the number of discovery questions they ask and the corresponding outcome? What correlation do the rep’s experience and sales training have with the number of discovery questions they ask and the outcome of the call?

Remember, given how new to sales this company’s users typically are, a good portion of these folks will fail and were never meant to be in sales in the first place. The root causes that should be examined are:

  • Why did we hire these folks who fail in the first place?
  • What can we do from a sales training and sales coaching perspective to prevent them from failing?
  • How can we teach newer sales reps to ask smarter and more effective sales discovery questions?
  • How can we help newer sales reps learn the importance of doing their research and preparation prior to a sales call, so they don’t struggle and ask the wrong discovery questions, ask too many of them, or offend the customer?
  • How can we help newer sales reps learn the importance of deeply understanding the different stakeholder types we sell to and how to tailor their discovery questions to be relevant to each type?

In closing, just because a flawed lecturer cites hundreds of thousands or millions of data points in their research findings does not mean it’s the right data set or that they have drawn factual conclusions from it.

The Context Behind the Data
https://truesalesresults.com/the-context-behind-the-data/ (Sun, 29 Mar 2015)

Metrics are good. They can be valuable and you can glean insight from them. But my admonition to all is that you proceed with caution when it comes to using metrics and data to make business decisions. All too frequently, I see business leaders completely absorbed in spreadsheets and reports. Percentages, conversion ratios and growth rates are bandied about liberally. Unfortunately, what’s commonly lost in translation in this exercise is the context behind the data.

Context doesn’t exist in a spreadsheet cell. It doesn’t appear as a footnote in a Salesforce.com report. Context is nowhere to be found in benchmarking metrics. Yet context is the most vital factor in every formula, because without it, you are operating in a vacuum. You are making decisions devoid of the most important dimension, which is what I refer to as ‘business reality’.

Harkening back to my college days, I was struck by what my Statistics II professor would always tell us: “You can make statistics say whatever you want them to. You can make statistics lie.” I recall being very confused by that notion and approached him after class to ask what he meant. He told me that all statistics are predicated on samples, and that it’s quite easy, and common practice, to manipulate the samples to support whatever assertion you are trying to prove. He further explained that statistics and metrics are cited as the basis for virtually every decision made in the world, yet no one bothers to question the samples used to generate the statistics in the first place. He ended the lesson by saying that the vast majority of statistics are not accurate because the samples used were not statistically valid in the first place.

This lesson in business reality was further driven home when I worked for a large, prestigious market research firm in Boston during college. I ended up becoming a manager and got involved in creating the surveys and focus groups we ran on behalf of our clients, mostly blue-chip Fortune 1000 companies that paid us big bucks for supposedly objective and accurate market research. Here’s the dirty little secret: we tailored the surveys to elicit the responses most favorable to our clients. We doctored the results by ‘selectively’ including the responses our clients were looking for, rather than all of the responses, which might have skewed the outcome and shown our client in an unfavorable light with their customers or markets. This was done intentionally and, as I learned, was completely common in the industry.
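
Here is a small simulation of how that kind of doctoring works statistically. The data and the filtering rule are synthetic, invented purely for illustration, not the firm’s actual method:

```python
import random

random.seed(7)  # fixed seed so the sketch is reproducible

# Simulated survey: True = "would recommend", False = "would not".
# The honest recommendation rate in the full sample is about 50%.
responses = [random.random() < 0.5 for _ in range(1000)]

full_rate = sum(responses) / len(responses)

# "Doctored" result: keep every favorable response, but keep each
# unfavorable response only about 1 time in 3, mimicking selective
# inclusion of the answers the client wants to see.
kept = [r for r in responses if r or random.random() < 1 / 3]
doctored_rate = sum(kept) / len(kept)

print(f"honest rate:   {full_rate:.0%}")
print(f"doctored rate: {doctored_rate:.0%}")
```

Both numbers come from the same underlying survey, yet the filtered one lands around 75% instead of roughly 50%, which is exactly why the sample behind any published statistic deserves scrutiny.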

It would be analogous to looking for a doctor who would only tell you that you look great, don’t need to lose any weight and shouldn’t cut back on your caffeine or alcohol consumption. Everyone laughs at how absurd that sounds, but most people prefer the Hollywood ending to the reality of the world. It’s no different in the business world. I’ve been in sales and sales management for over 20 years, and it never ceases to amaze me how many people don’t look for the context or meaning behind the data and metrics. Most seasoned sales managers simply have a visceral sense for when the data or metrics are wrong, and they tend to proactively get the context before citing or using any data for decision purposes.

How do you get the context behind the data? You ask smart questions of the people who are creating the data in your systems of record, like Salesforce.com. Is ‘all’ the data being entered into your systems of record? Is it being entered objectively or subjectively? Are there things happening in the real world that explain vitally important trends but simply aren’t captured in the data or metrics? Are the system-of-record rules of engagement well understood by everyone who uses them, and are they applied consistently? My experience shows that most sales teams and companies do a poor job in this area, yet they are the first ones to cite reports, metrics, stats and spreadsheet formulas as evidence of how their business is doing.

It’s like the old United Airlines commercial where the company has lost touch with its market and customers. The grey-haired CEO passes out airline tickets to all of the execs around the conference room table and tells them to get out there and visit their customers. Don’t lose touch with the real world; you need to talk to your field sales people and customers to find out what is really happening out there. Don’t get lost in spreadsheets and metrics, because without the context or meaning behind the data, you are most likely basing your business decisions on flawed assumptions.

We help companies improve their sales conversion rates and effectiveness. As part of any audit engagement, we look at both the data and the context behind the data. Typically, we glean far more insight from conversations with sales reps and customers than we do from the Salesforce.com reports or spreadsheets. In fact, through that context we commonly find embarrassing flaws in the metrics being used to make business decisions. So get out there and find the context behind the data; you’ll be making informed decisions when you do!
