PHILOSOPHY of SACRIFICE (Motivational Video) – How Far Is It WORTH It to KEEP FIGHTING?

"Philosophy of Sacrifice" is a motivational video that aims to equip the viewer with the economic, psychological, and philosophical competencies needed to take on projects that require giving up present gratification in order to reach a more valuable goal in the future. Whether you are an elite athlete, a medical student, or a digital entrepreneur, this talk will be a great help to your progress.
🎙️ Support the channel ► https://www.patreon.com/ramtalks
🤝 You can also support the channel via:
Paypal: paypal.me/ramtalks
Bitcoin: 1J8tdVXCNGRmhX1AXE43ELrocN3f41kd47
Ethereum: 1LuaK9b8hRf9w4a8dwgYxhStvH9rWX6g2v
📖 Recommended books on personality ►
https://amzn.to/2FUTqtC
https://amzn.to/34noPOT
Hey! I do a weekly livestream on my IG page:
💣 Instagram ► https://www.instagram.com/ramtalks/
👨🏻‍💻 Editing ► https://www.instagram.com/nejemiprodu…
Don't miss my other social networks:
🎱 Tik Tok ► @ramtalks
💬 Twitter ► https://twitter.com/RamTalksYT
♣️ Facebook ► https://www.facebook.com/RamTalksYT
This is the gear I recommend for recording videos:
📹 Camera: https://amzn.to/32fu0h0
📽️ Lens: https://amzn.to/2PgfGzy
🎙️ Microphone: https://amzn.to/37JTOmE
🖥️ Computer: https://amzn.to/3bWPsvK

Willpower, a predisposition toward emotional stability, task planning, and passion for the subject are highly relevant when it comes to predicting residents' behavior, especially when these factors are evaluated together and carry a considerable genetic component. Broadly speaking, though, the mechanics of self-control and goal pursuit are also sustained by a sound system of thought that equips the individual with resources that are useful in practice. In any case, my ultimate aim in this short documentary is to examine the philosophy of sacrifice from a cognitive-behavioral and ethical standpoint, calling into question dogmas such as "all effort has its reward" or "pain is temporary, but satisfaction is permanent."

As the eminent Daniel Kahneman noted in "Thinking, Fast and Slow": "To those who wish to embark on a particular undertaking, I would recommend keeping certain important details in view: Am I being too optimistic about the results? If so, how could I design measures to mitigate possible overestimations in my predictions? Are there random events that could undermine my plan? Do I really need so many resources to start, or am I instead falling short? What is the status quo I am starting from? Is this initial situation favorable or unfavorable? If I am going through some personal conflict, how might it be reflected in some aspect of the design of my plan? Would it affect its execution at all?"

How To Create An Online Course That SELLS (From A 9 Figure Course Creator!)

Mindvalley founder Vishen Lakhiani reveals the 5 elements of creating online courses that transform people's lives, and how you can generate over 6 figures in your coaching business.

Key Highlights:
0:00 The Power of Transformation
2:22 How To Generate 6 Figures
7:58 Element #1
10:27 Element #2
13:51 Element #3
17:42 Element #4
20:44 Element #5

Sign up for your Free Trial of our highly-rated program Coaching Mastery here: http://go.evercoach.com/uedYFaU2

Love this video? Subscribe to our channel for your weekly dose of learning, growth, and fun! We release new videos like this every Thursday; you won't want to miss them 😉

#OnlineCourse #VishenLakhiani #Transformation

The 10 Most Annoying Phrases Said During Video Calls

Love them or loathe them, with the work-from-home era showing no sign of abating, conference calls are here to stay.

Video calls have become increasingly prevalent throughout the pandemic, enabling teams to conduct business remotely. It's important that meeting attendees present themselves in as professional a light as possible during remote meetings.

Despite the need for professionalism on conference calls, an awkward moment is never far away.



Most Annoying Phrases on Video Calls

Easy Offices, providers of flexible, serviced office space, researched the most annoying phrases said during video calls.

The survey asked 1,000 remote workers their biggest dislikes and most irritating phrases when participating in conference calls.

Check out our top ten. You never know, it might help you avoid annoying colleagues or clients by uttering such phrases yourself!

“I need to jump on another call.”

It stands to reason that someone stating they need to "jump on another call" might be deemed rude.

“You’re on mute!”

If you're on mute, you almost certainly know it already and don't need to be told.

“We lost you for a minute there.”

Again, stating the obvious on remote calls is a tad patronizing and irritating.

“Do we have everyone here?”

A little reminiscent of a teacher addressing a kindergarten class…

“Can you see my screen?”

Another question that's commonly used and equally annoying on video calls!

“Can everyone mute themselves please?”

Another one that’s a little on the patronizing side!

“Let’s take this offline.”

Or couldn’t we just extend the video meeting and get the issue done and dusted so we don’t need to take it offline?

“Conscious there’s only x minutes left …”

If only a few minutes remain, why not wrap up the meeting proactively rather than dwell on the fact?

“I’m getting really bad feedback.”

Disruptions caused by feedback are annoying, so there's no need to spell it out to the rest of the meeting.

“I’ve got a hard stop at x o’clock”

In other words, we need to end the meeting whether we’ve finalized details or not. Wouldn’t it be more valuable to extend the meeting or arrange another one to ensure nothing has been left out?



How to Handle the “Living at Work” Lifestyle

For small businesses, the COVID pandemic is now in its sixth month with no signs of abating. Even before the pandemic, small business employees and employers were dealing with the effects of an increasingly complex work-life reality; COVID-19 has driven a "living at work" lifestyle that makes this balance even more difficult.

On the Small Business Radio Show this week, Jessica Moser, Senior Vice President of Small Business Solutions at MetLife, discusses this paradox and insights from MetLife's monthly Small Business Coronavirus Impact Poll.

Jessica Moser of MetLife on Living at Work

Jessica discusses how, during the pandemic, work/life boundaries have become blurred because it is hard to tell when work starts and stops. She adds that "many employees feel they are on all the time." In MetLife's monthly poll, they found that 42% of small business employers and employees say they struggle to navigate the demands of an "always-on" work/life world. In addition, 49% of employers cited burnout as a top concern (up significantly from 37% in 2019).

According to Jessica, the best way to navigate the always-on culture is to listen to your employees' specific pain points: "Is it too many emails or Zoom calls? Are they feeling too isolated? This is how you start to create solutions. Remember, flexibility and support have always been a key differentiator for working at a small business."

In the latest poll, MetLife found that small business employees are more worried about their financial health (55%) than their physical health (44%), a startling statistic in the midst of a global pandemic. Two-thirds of employees are concerned their business will have to close again, and 55% of small business leaders think it will take six more months to get back to normal.

As Jessica points out, “if they are financially worried it’s hard to be productive.” She suggests that small businesses look at voluntary benefit options that provide security and flexibility to show your company cares about its staff.

According to the poll, employees value voluntary benefits even if they have to pay for them. 46% of small business employees are interested in a wider array of benefits like dental, basic life insurance, and legal plans. Jessica believes it is critical for small business owners to communicate the value of all employee benefits since, according to their survey, people who understand their benefits feel better holistically.

Listen to the entire interview on the Small Business Radio Show.


The Enterprise Systems That Companies Need to Create

Building a Winning Data Strategy

Building a winning data strategy requires bold moves and new ideas. Creating a strong data foundation within the organization and putting a premium on nontechnical factors such as analytical agility and culture can help companies stay ahead. This MIT SMR Executive Guide, published as a series over three weeks, offers insights on how companies can move forward with data in an era of constant change.


The swiftness of technology’s progress in the past decade has convinced legions of companies that their survival depends on jettisoning their legacy systems as soon as budgets permit such an overhaul. Computing power has surged, storage costs have plummeted, and networking speeds have approached theoretical limits. All the while, companies and consumers are generating ever-growing floods of data packed with clues on how individuals behave and how products perform. Many companies thus risk upgrading technology purely for its own sake. In doing so, they overlook what may be the greatest opportunity presented by the modern technology stack: the chance to mobilize new tools in a way that empowers managers and technologists alike to make fundamentally better business decisions.

To illustrate, consider the curiously old-fashioned approach companies typically take to upgrading their legacy systems. It starts when something old stops working. Perhaps an aging mainframe fails often, resulting in seemingly never-ending maintenance costs, or an outage destroys transaction data. A decade ago, the natural response was to check for an updated version of yesterday’s software that could stamp out the bugs — particularly if paired with newer hardware. Now, companies look to the cloud for the latest collection of computing services, storage technologies, and performance guarantees.

Both scenarios share the same mentality and prevailing aim: to replicate yesterday’s functionality at today’s prices. “I want what I have today, only faster, or cheaper, or simpler.” The habit of colossal, periodic technology projects persists, justified by often-strained business cases that hinge on cost improvements or risk reductions spread across a wide swath of deeply entrenched systems.

There is a better way. Tech upgrades can be revenue generators, not just cost sinks, and they need not saddle you with soon-to-be legacy burdens. Our experience suggests that three strategies can position companies to carry out technology transformations that can create value and enable continuous innovation.

Redefine success. Companies that reap the greatest rewards from technical improvements recognize that it’s not only technology that changes: It’s also their leaders’ minds, priorities, and circumstances. Legacy systems aren’t bad because they’re outmoded — they’re bad because they’re almost invariably hard to deprecate.

To skillfully keep pace with technology, companies therefore need to develop what we call second derivative thinking: They must work to increase the rate of change of change. To build systems that improve the velocity of change in practice, companies need to identify the structural impediments that act as brakes on their ability to deliver technically, and then insist that each change project aims to remove at least one of those obstacles. In addition to achieving the project’s immediate goals, the effort also clears roadblocks that would otherwise bog down future efforts. With fewer impediments, subsequent projects automatically accelerate. And because those individual solutions are delivered in concert with existing work, teams don’t need to contrive laborious business cases to address them in isolation.

Take the example of a bank that wanted to grow by expanding its geographic footprint. Many aspects of banking vary from one country to the next — regulations and requirements, consumer habits and preferences. Idiosyncrasies aside, however, much remains the same about the core proposition and the span of products and services; in every nation, people save, spend, and borrow. Rather than inventing new infrastructure to conform to the nuances of a given location or region, the bank instead sought to leverage a common systems core where possible and then engineered custom services that could be “swapped in” to meet the particular needs of any given area.

Because the bank set the goal of being able to rapidly deploy operations in a new geography, it had no choice but to engineer the ability to rapidly adapt its systems. If a new regulation in, say, Singapore alters local banks’ identity verification requirements, the bank can update that single, isolated service rather than try to retool core banking applications in their entirety.
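The "common core plus swappable local service" pattern can be sketched as a registry of pluggable, country-specific services behind an unchanging core. This is an illustrative sketch, not the bank's actual design; the country rules and field names below are invented.

```python
# Hypothetical sketch of a common banking core with swappable local
# services; the country rules and field names are invented.

def verify_identity_sg(customer):
    # Invented stand-in for Singapore's identity-verification rules.
    return "national_id" in customer

def verify_identity_us(customer):
    # Invented stand-in for US rules.
    return "ssn" in customer

# The core stays identical everywhere; only the plugged-in service varies.
VERIFIERS = {"SG": verify_identity_sg, "US": verify_identity_us}

def open_account(customer, country):
    if not VERIFIERS[country](customer):
        raise ValueError("identity verification failed")
    return {"country": country, "holder": customer["name"]}

print(open_account({"name": "Tan", "national_id": "S1234567A"}, "SG"))
# -> {'country': 'SG', 'holder': 'Tan'}
```

If Singapore changes its verification requirements, only `verify_identity_sg` needs updating; `open_account` and every other country's service are untouched.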

Of course, not every company has the time or wherewithal to build cloud-native applications from scratch. Many sit in a technical limbo, unable to reach for state-of-the-art tech but unwilling to hobble along with turn-of-the-millennium tools. To resolve that dilemma, companies often try to build data lakes as a stepping stone to more comprehensive system upgrades. They reason that if they can systematically hoover up the information stored in those source systems and grant more widespread access to it, they can create more modern applications while letting older systems lumber along in the background. This tactic can work, but it comes with a cost.

To pipe data out of legacy systems reliably, you have to build and maintain reliable pipelines. With a handful of systems and sources, this isn’t hard. But multiply that across the vast array of systems inside most large organizations and you get what has been called pipeline jungles — thickets of expensive data-integration jobs that no one really owns. Anyone who’s ever written or read a service-level agreement knows that a collection of individually strong components can produce a brittle system.

A novel solution to this problem is emerging in the form of data virtualization: a logical data layer that unifies data siloed in disparate systems without actually physically integrating it. At first, the idea sounds a bit silly. Don’t actually pull all of your data together into a communal, capacious tank. Instead, use technology that lets you pretend that you did. Rather than yanking data out of systems, you reach into them and fetch information when you need to use it.

Accessing data in situ, rather than creating infrastructure to shuffle it around, offers a few benefits. You'll reduce unneeded copying and the wasteful expense of storing duplicate data. You'll use tools that give you a single route to reach upstream data, with no pipeline jungles to wade through. And by combining these tools with gateways and intelligently constructed interfaces, you can implement detailed permissions and security protocols far more easily.
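One way to picture the in-situ idea is a single query interface that reaches into separate stores at read time instead of copying them into a lake. A minimal sketch using Python's built-in sqlite3 and its ATTACH mechanism; the two "legacy" databases and their contents are invented for illustration, and real data-virtualization products are far more capable.

```python
import sqlite3

# Two stand-in "legacy" stores, created here only so the sketch runs
# end to end; names and contents are invented.
sales = sqlite3.connect("sales.db")
sales.execute("CREATE TABLE IF NOT EXISTS orders (customer_id INTEGER, amount REAL)")
sales.execute("DELETE FROM orders")
sales.executemany("INSERT INTO orders VALUES (?, ?)", [(1, 50.0), (2, 75.0), (1, 25.0)])
sales.commit()
sales.close()

crm = sqlite3.connect("crm.db")
crm.execute("CREATE TABLE IF NOT EXISTS customers (id INTEGER, name TEXT)")
crm.execute("DELETE FROM customers")
crm.executemany("INSERT INTO customers VALUES (?, ?)", [(1, "Acme"), (2, "Globex")])
crm.commit()
crm.close()

# The "virtual" layer: one connection that reaches into both stores at
# query time instead of copying their contents into a central lake.
hub = sqlite3.connect(":memory:")
hub.execute("ATTACH DATABASE 'sales.db' AS sales")
hub.execute("ATTACH DATABASE 'crm.db' AS crm")

rows = hub.execute(
    """
    SELECT c.name, SUM(o.amount) AS total
    FROM crm.customers AS c
    JOIN sales.orders AS o ON o.customer_id = c.id
    GROUP BY c.name
    ORDER BY c.name
    """
).fetchall()
print(rows)  # [('Acme', 75.0), ('Globex', 75.0)]
hub.close()
```

The join runs against both stores in place: no pipeline copied the orders or customers anywhere, yet the caller sees one logical schema.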

Orient technology around decision-making. To create value from investing in technology, companies need to be clear on where the value lies in the first place. Companies can’t function without any technology, but surely the goal of better systems should be to function more effectively. In business, this imperative boils down to the goal of making better decisions. Generally, businesses don’t make money by chance but by choice. Behind every value-creating action, there is a long string of decisions about how, when, and by whom related tasks need to get done. For instance, long before a bank collects payments on a loan, it first needs to identify target customers, devise ways to solicit them, estimate their creditworthiness, create the product, set a rate, originate the loan, and ensure that the systems are ready to service it.

In general, companies should strive to make high-stakes decisions more effective and low-stakes ones more efficient. One national retailer noticed that when its senior-most executives gathered for quarterly planning meetings, they spent nearly all of their time scrutinizing historical sales reports and almost none of it making strategic choices. Why? Culture played a part — dwelling on anomalies and nitpicking was a well-known habit. (“Retail is detail,” as many say.) But there was a deeper, technological failing too.

Over time, the company had steadily improved its reporting capabilities to let users see fresher, more granular information. In contrast, it hadn’t invested in technologies to make predictions or to model scenarios. Users could pinpoint the store that sold the most blueberry yogurt yesterday. But ask them which flavors customers would buy if the blueberry were removed, and they’d have no idea. Ask them whether they should invest to build more stores or to reduce prices, and you’d hear a similar silence.

The executives hatched a bold plan to reorient themselves. They resolved to create what they called “living, 18-month plans” that codified their strategic choices — for instance, levels of discounting or footprint expansion — as well as robust forecasts of expected performance. During quarterly meetings, leaders updated the plan. What’s more, they insisted on the ability to quantify the business impacts of varying those choices as interactively as they could. The company needed to redirect investments away from reporting and toward modeling — fewer dashboards and more decision-support tools. But tailoring their technology to better suit the company’s key decisions meant that the leadership team could focus its collective attention on the choices that mattered most.

Enhancing low-stakes decisions can generate sizable profits too, if done frequently or on a large scale. A grocery store chain spent years agonizing over an elaborate, item-level pricing system. To work, it needed lots of data. Some inputs should have been simple to obtain, like statistics on past prices and sales, but analysts were stymied by having to collate data from different legacy systems. Other inputs, like competitive price data, were costly to buy and error-prone. To function, the system needed to build, store, and update tens of thousands of narrow statistical models, each one of which could fail from an input glitch or return nonsensical results because of an outlier. It was, in other words, a mess.

Dismayed by what he saw, the CEO asked a provocative question: Was “the juice worth the squeeze”? In other words, what if, instead of updating the complicated system, the retailer simply stopped setting item prices itself?

Instead of deploying large teams to chase down tiny details, the retailer could ask small teams of buyers and category managers to apply the same margin to all items in a category, effectively shifting the burden of setting product pricing back to the vendors. If a brand wished to undercut its rivals, it could do so by lowering its cost to the retailer — but that dynamic would require zero effort from the retailer. The retailer, in turn, could adjust its overall level of price competitiveness by altering margin targets across categories.
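The mechanics reduce to a one-line markup rule per category in place of tens of thousands of item-level models. A toy sketch of the idea; the category names and margin targets below are invented, not the retailer's actual figures.

```python
# Toy sketch of category-level margin pricing; the categories and
# margin targets are invented for illustration.

CATEGORY_MARGINS = {"canned goods": 0.30, "dairy": 0.20}

def shelf_price(vendor_cost, category):
    """Shelf price = vendor cost marked up by the category's margin."""
    return round(vendor_cost * (1 + CATEGORY_MARGINS[category]), 2)

# If a brand cuts its cost to the retailer, its shelf price falls
# automatically, with zero effort from the retailer's pricing team.
print(shelf_price(1.00, "canned goods"))  # 1.3
print(shelf_price(0.90, "canned goods"))  # 1.17
```

Adjusting overall price competitiveness then means editing a handful of numbers in `CATEGORY_MARGINS` rather than re-fitting item-level statistical models.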

Though the idea struck some as heresy, it gained currency as its assumptions and implications became clear. It might be possible to measure differences in price sensitivities between two similar cans of beans, as pricing folklore suggests, but the evidence is weak. Hence, this kind of decision is likely to be a low-stakes one where the best way to win is to be more efficient. Automating such decisions often makes them more efficient; offloading them entirely always does so.

Reduce your delivery “chunk size.” One final strategy that companies can use to good effect when revamping enterprise systems is to deliver smaller, self-contained, complete units of work. As a thought experiment, think of how much you’re willing to spend on your next tech transformation and how long you expect it to take. Divide both numbers by 50. You should aim to organize your work so that if you spend that diminished sum over that brief time interval, you get an entirely functional systems component, like a Lego brick that’s yours to build with. Maybe it’s a service encapsulated in a container, or a set of interfaces for accessing data programmatically — but whatever you build, you want it to be feature-complete and immediately reusable.

Operating in this way embeds the kind of second derivative thinking described above. Your thinking may evolve substantially as you upgrade your systems, but working in small chunks makes pivoting far easier. You have room to change your mind.

Furthermore, teams can use and derive value from individual components as they come online, rather than waiting for the entire edifice to be built. And giving teams new technical possibilities is a great way to unleash their creativity and empower them to solve problems you may not have considered.

The experience of a large commercial insurer provides an example of how to put these ideas into practice. Like many of its peers, this company felt both the drag of legacy systems and the fear of taking them off life support. Rather than rebuilding them in one go, the company chose to isolate its internal systems by putting legacy applications in a distinct service tier. Then it built a separate services layer on top of those applications. Those services provided users with access to the data and functions of the underlying legacy systems while abstracting away the need to interface with them directly. Legacy systems were in effect shrink-wrapped for a longer shelf life, and more legacy systems could be moved into the first, internal tier one by one.

To move a system behind this services layer, teams needed to think carefully about which of its data elements and functions were critical to the company’s operations. Not only did the exercise allow them to incrementally disentangle their tangled legacy setup; it also gave them a road map for progressively building more modern applications.
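The shrink-wrap idea can be sketched as a thin service that translates a legacy system's raw records into a stable, modern interface. This is an illustrative sketch only; the legacy function, record shape, and field names are invented, not the insurer's actual systems.

```python
# Hypothetical sketch of a services layer over a legacy application;
# the legacy function and record shapes are invented.

def legacy_policy_lookup(policy_num):
    # Stand-in for an old system that returns cryptic positional records.
    records = {"P-100": ("SMITH,J", "ACTIVE", 125000)}
    return records[policy_num]

class PolicyService:
    """Service tier that shrink-wraps the legacy application: callers
    use this stable interface and never touch the legacy system."""

    def get_policy(self, policy_id):
        holder, status, coverage = legacy_policy_lookup(policy_id)
        # Translate the legacy record into a documented, stable shape.
        return {
            "id": policy_id,
            "holder": holder,
            "active": status == "ACTIVE",
            "coverage_usd": coverage,
        }

svc = PolicyService()
print(svc.get_policy("P-100"))
```

Because callers depend only on `PolicyService`, the legacy lookup behind it can later be replaced with a modern store without changing any consumer, which is the point of moving systems behind the layer one by one.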

Leave an Agile Legacy

You don’t have to be a technical expert to be astonished by the increasing potential for technologies to transform organizations. Fifty-one years ago, the United States piloted a spaceship to the surface of the moon using a 70-pound computer that could perform 14,245 calculations per second. In September of this year, Nvidia introduced a graphics processor that is more than 2.5 billion times as fast, weighing little more than a book.

Currently, high-capacity hard drives can store data at a cost of just over $15 per terabyte. When we first walked on the moon, storing that terabyte would have cost $1.7 billion in today’s dollars. This August, researchers pushed data through a single fiber-optic cable at a rate of 178 terabits per second — enough to transfer about 1,500 4K movies in the time it takes to say “one, Mississippi.” When the Apollo 11 astronauts splashed back down to Earth, the first computer-to-computer link hadn’t yet been invented. (It would come three months later, with Arpanet.)

Given the giant leaps forward that technology continually makes, it is now possible for companies to replace their legacy systems and aging computing platforms with systems that enable breakthroughs not only in efficiency but also in product and service offerings. The key is to recognize how your legacy systems must adapt in a world where accelerating change is the only constant.
