PHILOSOPHY of SACRIFICE (Motivation Video) – How Far Is It WORTH It to KEEP FIGHTING?

"Filosofía del sacrificio" es un vídeo de motivación que pretende dotar al espectador de aquellas competencias económicas, psicológicas y filosóficas necesarias para incurrir en proyectos en los que es necesario renunciar a la gratificación presente para alcanzar una meta de mayor valor en el futuro. Ya seas un deportista de élite, un estudiante de medicina o un emprendedor digital, este discurso será de gran ayuda en tu progreso.
🎙️ Support the channel ► https://www.patreon.com/ramtalks
🤝 You can also support the channel via:
Paypal: paypal.me/ramtalks
Bitcoin: 1J8tdVXCNGRmhX1AXE43ELrocN3f41kd47
Ethereum: 1LuaK9b8hRf9w4a8dwgYxhStvH9rWX6g2v
📖 Recommended books on personality ►
https://amzn.to/2FUTqtC
https://amzn.to/34noPOT
Hey! I do a weekly livestream on my IG page:
💣 Instagram ► https://www.instagram.com/ramtalks/
👨🏻‍💻 Editing ► https://www.instagram.com/nejemiprodu…
Don't miss my other social networks:
🎱 Tik Tok ► @ramtalks
💬 Twitter ► https://twitter.com/RamTalksYT
♣️ Facebook ► https://www.facebook.com/RamTalksYT
This is the gear I recommend for recording videos:
📹 Camera: https://amzn.to/32fu0h0
📽️ Lens: https://amzn.to/2PgfGzy
🎙️ Microphone: https://amzn.to/37JTOmE
🖥️ Computer: https://amzn.to/3bWPsvK

Willpower, a predisposition toward emotional stability, task planning, and passion for the subject matter are highly relevant when it comes to predicting the behavior of residents, especially when these factors are evaluated together and carry a considerable genetic component. Broadly speaking, though, the mechanics of self-control and goal pursuit also rest on a sound system of thought, one that equips the individual with resources that are useful in practice. In any case, my ultimate aim in this short documentary is to examine the philosophy of sacrifice from a cognitive-behavioral and ethical standpoint, calling into question dogmas of the kind “all effort has its reward” or “pain is temporary, but satisfaction is permanent.”

As the eminent Daniel Kahneman remarked in “Thinking, Fast and Slow”: “To anyone wishing to embark on a particular undertaking, I would recommend not losing sight of certain important details: Am I being too optimistic about the outcomes? If so, how could I design measures to mitigate possible overestimations in my predictions? Are there chance events that could undermine my plan? Do I really need this many resources to get started, or am I actually falling short? What is the status quo I am starting from? Is this initial situation favorable or unfavorable? If I am going through some personal conflict, in what way might it be showing up in some aspect of how I draw up my plan? Would it affect its execution at all?”

How To Create An Online Course That SELLS (From A 9 Figure Course Creator!)

Founder of Mindvalley, Vishen Lakhiani, reveals the five elements of creating online courses that transform people's lives, and how you can generate over six figures in your coaching business.

Key Highlights:
0:00 The Power of Transformation
2:22 How To Generate 6 Figures
7:58 Element #1
10:27 Element #2
13:51 Element #3
17:42 Element #4
20:44 Element #5

Sign up for your Free Trial of our highly-rated program Coaching Mastery here: http://go.evercoach.com/uedYFaU2

Love this video? Subscribe to our channel for your weekly dose of learning, growth, and fun! We release new videos like this every Thursday; you won't want to miss them 😉

#OnlineCourse #VishenLakhiani #Transformation

The Enterprise Systems That Companies Need to Create

The swiftness of technology’s progress in the past decade has convinced legions of companies that their survival depends on jettisoning their legacy systems as soon as budgets permit such an overhaul. Computing power has surged, storage costs have plummeted, and networking speeds have approached theoretical limits. All the while, companies and consumers are generating ever-growing floods of data packed with clues on how individuals behave and how products perform. Many companies thus risk upgrading technology purely for its own sake. In doing so, they overlook what may be the greatest opportunity presented by the modern technology stack: the chance to mobilize new tools in a way that empowers managers and technologists alike to make fundamentally better business decisions.

To illustrate, consider the curiously old-fashioned approach companies typically take to upgrading their legacy systems. It starts when something old stops working. Perhaps an aging mainframe fails often, resulting in seemingly never-ending maintenance costs, or an outage destroys transaction data. A decade ago, the natural response was to check for an updated version of yesterday’s software that could stamp out the bugs — particularly if paired with newer hardware. Now, companies look to the cloud for the latest collection of computing services, storage technologies, and performance guarantees.

Both scenarios share the same mentality and prevailing aim: to replicate yesterday’s functionality at today’s prices. “I want what I have today, only faster, or cheaper, or simpler.” The habit of colossal, periodic technology projects persists, justified by often-strained business cases that hinge on cost improvements or risk reductions spread across a wide swath of deeply entrenched systems.

There is a better way. Tech upgrades can be revenue generators, not just cost sinks, and they need not saddle you with soon-to-be legacy burdens. Our experience suggests that three strategies can position companies to carry out technology transformations that can create value and enable continuous innovation.

Redefine success. Companies that reap the greatest rewards from technical improvements recognize that it’s not only technology that changes: It’s also their leaders’ minds, priorities, and circumstances. Legacy systems aren’t bad because they’re outmoded — they’re bad because they’re almost invariably hard to deprecate.

To skillfully keep pace with technology, companies therefore need to develop what we call second derivative thinking: They must work to increase the rate of change of change. To build systems that improve the velocity of change in practice, companies need to identify the structural impediments that act as brakes on their ability to deliver technically, and then insist that each change project aims to remove at least one of those obstacles. In addition to achieving the project’s immediate goals, the effort also clears roadblocks that would otherwise bog down future efforts. With fewer impediments, subsequent projects automatically accelerate. And because those individual solutions are delivered in concert with existing work, teams don’t need to contrive laborious business cases to address them in isolation.

Take the example of a bank that wanted to grow by expanding its geographic footprint. Many aspects of banking vary from one country to the next — regulations and requirements, consumer habits and preferences. Idiosyncrasies aside, however, much remains the same about the core proposition and the span of products and services; in every nation, people save, spend, and borrow. Rather than inventing new infrastructure to conform to the nuances of a given location or region, the bank instead sought to leverage a common systems core where possible and then engineered custom services that could be “swapped in” to meet the particular needs of any given area.

Because the bank set the goal of being able to rapidly deploy operations in a new geography, it had no choice but to engineer the ability to rapidly adapt its systems. If a new regulation in, say, Singapore alters local banks’ identity verification requirements, the bank can update that single, isolated service rather than try to retool core banking applications in their entirety.
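To make the swappable-service idea concrete, here is a minimal sketch in Python. The names (CoreBank, IdentityCheck, SingaporeIdentityCheck) are hypothetical illustrations, not the bank's actual architecture:

```python
# Hypothetical sketch of the "common core + swappable services" idea.
from typing import Protocol


class IdentityCheck(Protocol):
    """Regional identity-verification rules, swappable per market."""

    def verify(self, customer_id: str) -> bool: ...


class SingaporeIdentityCheck:
    def verify(self, customer_id: str) -> bool:
        # A local regulatory change touches only this class,
        # never the core banking logic.
        return customer_id.startswith("SG")


class CoreBank:
    """Shared core: in every market, people save, spend, and borrow."""

    def __init__(self, identity_check: IdentityCheck) -> None:
        self.identity_check = identity_check

    def open_account(self, customer_id: str) -> str:
        if not self.identity_check.verify(customer_id):
            raise ValueError("identity verification failed")
        return f"account opened for {customer_id}"


# Deploying in a new geography means swapping in one service:
bank_sg = CoreBank(identity_check=SingaporeIdentityCheck())
print(bank_sg.open_account("SG-12345"))
```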

Of course, not every company has the time or wherewithal to build cloud-native applications from scratch. Many sit in a technical limbo, unable to reach for state-of-the-art tech but unwilling to hobble along with turn-of-the-millennium tools. To resolve that dilemma, companies often try to build data lakes as a stepping stone to more comprehensive system upgrades. They reason that if they can systematically hoover up the information stored in those source systems and grant more widespread access to it, they can create more modern applications while letting older systems lumber along in the background. This tactic can work, but it comes with a cost.

To pipe data out of legacy systems reliably, you have to build and maintain reliable pipelines. With a handful of systems and sources, this isn’t hard. But multiply that across the vast array of systems inside most large organizations and you get what has been called pipeline jungles — thickets of expensive data-integration jobs that no one really owns. Anyone who’s ever written or read a service-level agreement knows that a collection of individually strong components can produce a brittle system.
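A quick back-of-the-envelope calculation shows why: if each pipeline in a chain independently meets a 99% uptime target, the end-to-end reliability of anything that depends on all of them decays multiplicatively. The figures below are illustrative, not drawn from any particular organization:

```python
# Why individually strong components can yield a brittle system:
# a report that depends on a chain of pipelines is only as reliable
# as the product of their individual uptimes.
per_pipeline_uptime = 0.99
for n in (1, 10, 30, 50):
    print(f"{n:2d} pipelines: {per_pipeline_uptime ** n:.1%} end-to-end")
# Output:
#  1 pipelines: 99.0% end-to-end
# 10 pipelines: 90.4% end-to-end
# 30 pipelines: 74.0% end-to-end
# 50 pipelines: 60.5% end-to-end
```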

A novel solution to this problem is emerging in the form of data virtualization: a logical data layer that unifies data siloed in disparate systems without actually physically integrating it. At first, the idea sounds a bit silly. Don’t actually pull all of your data together into a communal, capacious tank. Instead, use technology that lets you pretend that you did. Rather than yanking data out of systems, you reach into them and fetch information when you need to use it.

Accessing data in situ, rather than creating infrastructure to shuffle it around, offers a few benefits. You'll reduce unneeded copying and the wasteful expense of storing duplicate data. You'll use tools that give you a single route to reach upstream data, with no pipeline jungles to wade through. And by combining these tools with gateways and intelligently constructed interfaces, you can implement detailed permissions and security protocols far more easily.
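As a rough sketch of what such a layer looks like, the Python below unifies two hypothetical source systems behind one fetch-on-demand gateway with per-user permissions; the class and source names are invented for illustration:

```python
# Minimal sketch of a data-virtualization layer: data stays in place,
# and the layer fetches it on demand, enforcing permissions centrally.
from dataclasses import dataclass


@dataclass
class Source:
    name: str
    records: dict  # stands in for a live connection to the system


class VirtualLayer:
    def __init__(self, sources: list[Source], permissions: dict[str, set[str]]):
        self.sources = {s.name: s for s in sources}
        self.permissions = permissions  # user -> sources they may read

    def fetch(self, user: str, source: str, key: str):
        # Single route to upstream data: permission checks and auditing
        # happen here, not in N bespoke pipelines.
        if source not in self.permissions.get(user, set()):
            raise PermissionError(f"{user} may not read {source}")
        return self.sources[source].records.get(key)


crm = Source("crm", {"cust-1": {"name": "Acme"}})
billing = Source("billing", {"cust-1": {"balance": 420.0}})
layer = VirtualLayer([crm, billing], {"analyst": {"crm", "billing"}})
print(layer.fetch("analyst", "billing", "cust-1"))  # fetched in situ, not copied
```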

Orient technology around decision-making. To create value from investing in technology, companies need to be clear on where the value lies in the first place. Companies can’t function without any technology, but surely the goal of better systems should be to function more effectively. In business, this imperative boils down to the goal of making better decisions. Generally, businesses don’t make money by chance but by choice. Behind every value-creating action, there is a long string of decisions about how, when, and by whom related tasks need to get done. For instance, long before a bank collects payments on a loan, it first needs to identify target customers, devise ways to solicit them, estimate their creditworthiness, create the product, set a rate, originate the loan, and ensure that the systems are ready to service it.

In general, companies should strive to make high-stakes decisions more effective and low-stakes ones more efficient. One national retailer noticed that when its senior-most executives gathered for quarterly planning meetings, they spent nearly all of their time scrutinizing historical sales reports and almost none of it making strategic choices. Why? Culture played a part — dwelling on anomalies and nitpicking was a well-known habit. (“Retail is detail,” as many say.) But there was a deeper, technological failing too.

Over time, the company had steadily improved its reporting capabilities to let users see fresher, more granular information. In contrast, it hadn’t invested in technologies to make predictions or to model scenarios. Users could pinpoint the store that sold the most blueberry yogurt yesterday. But ask them which flavors customers would buy if the blueberry were removed, and they’d have no idea. Ask them whether they should invest to build more stores or to reduce prices, and you’d hear a similar silence.

The executives hatched a bold plan to reorient themselves. They resolved to create what they called “living, 18-month plans” that codified their strategic choices — for instance, levels of discounting or footprint expansion — as well as robust forecasts of expected performance. During quarterly meetings, leaders updated the plan. What’s more, they insisted on the ability to quantify the business impacts of varying those choices as interactively as they could. The company needed to redirect investments away from reporting and toward modeling — fewer dashboards and more decision-support tools. But tailoring their technology to better suit the company’s key decisions meant that the leadership team could focus its collective attention on the choices that mattered most.
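The difference between reporting and decision support can be made concrete with a toy what-if model. The baseline volume and price elasticity below are made-up assumptions, not the retailer's numbers; the point is that leaders can vary a choice and immediately see its projected impact:

```python
# Illustrative decision-support sketch: instead of a dashboard reporting
# last quarter's sales, a small model lets leaders vary a strategic
# choice (discount level) and see the projected revenue impact.
baseline_units = 1_000_000
baseline_price = 10.0
elasticity = -1.5  # assumption: a 1% price cut lifts volume ~1.5%

for discount in (0.00, 0.05, 0.10, 0.15):
    price = baseline_price * (1 - discount)
    units = baseline_units * (1 + (-discount) * elasticity)
    revenue = price * units
    print(f"discount {discount:4.0%}: projected revenue ${revenue:,.0f}")
```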

Enhancing low-stakes decisions can generate sizable profits too, if done frequently or on a large scale. A grocery store chain spent years agonizing over an elaborate, item-level pricing system. To work, it needed lots of data. Some inputs should have been simple to obtain, like statistics on past prices and sales, but analysts were stymied by having to collate data from different legacy systems. Other inputs, like competitive price data, were costly to buy and error-prone. To function, the system needed to build, store, and update tens of thousands of narrow statistical models, each one of which could fail from an input glitch or return nonsensical results because of an outlier. It was, in other words, a mess.

Dismayed by what he saw, the CEO asked a provocative question: Was “the juice worth the squeeze”? In other words, what if, instead of updating the complicated system, the retailer simply stopped setting item prices itself?

Instead of deploying large teams to chase down tiny details, the retailer could ask small teams of buyers and category managers to apply the same margin to all items in a category, effectively shifting the burden of setting product pricing back to the vendors. If a brand wished to undercut its rivals, it could do so by lowering its cost to the retailer — but that dynamic would require zero effort from the retailer. The retailer, in turn, could adjust its overall level of price competitiveness by altering margin targets across categories.
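A sketch of the rule, with illustrative vendor costs and an assumed 30% category margin target:

```python
# "Same margin across the category": shelf price follows mechanically
# from vendor cost, so undercutting rivals becomes the vendor's job.
category_margin = 0.30  # retailer keeps 30% of the shelf price

vendor_costs = {"brand_a_beans": 0.70, "brand_b_beans": 0.63}
for item, cost in vendor_costs.items():
    shelf_price = cost / (1 - category_margin)
    print(f"{item}: cost ${cost:.2f} -> shelf ${shelf_price:.2f}")
# A brand that lowers its cost to the retailer automatically undercuts
# its rival on the shelf, with zero effort from the retailer.
```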

Though the idea struck some as heresy, it gained currency as its assumptions and implications became clear. It might be possible to measure differences in price sensitivities between two similar cans of beans, as pricing folklore suggests, but the evidence is weak. Hence, this kind of decision is likely to be a low-stakes one where the best way to win is to be more efficient. Automating such decisions often makes them more efficient; offloading them entirely always does so.

Reduce your delivery “chunk size.” One final strategy that companies can use to good effect when revamping enterprise systems is to deliver smaller, self-contained, complete units of work. As a thought experiment, think of how much you’re willing to spend on your next tech transformation and how long you expect it to take. Divide both numbers by 50. You should aim to organize your work so that if you spend that diminished sum over that brief time interval, you get an entirely functional systems component, like a Lego brick that’s yours to build with. Maybe it’s a service encapsulated in a container, or a set of interfaces for accessing data programmatically — but whatever you build, you want it to be feature-complete and immediately reusable.

Operating in this way embeds the kind of second derivative thinking described above. Your thinking may evolve substantially as you upgrade your systems, but working in small chunks makes pivoting far easier. You have room to change your mind.

Furthermore, teams can use and derive value from individual components as they come online, rather than waiting for the entire edifice to be built. And giving teams new technical possibilities is a great way to unleash their creativity and empower them to solve problems you may not have considered.

The experience of a large commercial insurer provides an example of how to put these ideas into practice. Like many of its peers, this company felt both the drag of legacy systems and the fear of taking them off life support. Rather than rebuilding them in one go, the company chose to isolate its internal systems by putting legacy applications in a distinct service tier. Then it built a separate services layer on top of those applications. Those services provided users with access to the data and functions of the underlying legacy systems while abstracting away the need to interface with them directly. Legacy systems were in effect shrink-wrapped for a longer shelf life, and more legacy systems could be moved into the first, internal tier one by one.

To move a system behind this services layer, teams needed to think carefully about which of its data elements and functions were critical to the company’s operations. Not only did the exercise allow them to incrementally disentangle their tangled legacy setup; it also gave them a road map for progressively building more modern applications.
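In code, the services layer amounts to a facade over each legacy application, a pattern often described as a "strangler fig." The sketch below uses a hypothetical legacy policy system and record format, not the insurer's actual interfaces:

```python
# Sketch of the services layer: a modern facade shrink-wraps a legacy
# application so new code never touches the legacy interface directly.
class LegacyPolicySystem:
    """Old application: awkward interface, but still authoritative."""

    def QRY_POL_REC(self, pol_no: str) -> str:
        return f"POL|{pol_no}|ACTIVE|1999-IMPL"


class PolicyService:
    """Exposes only the critical data and functions, abstracting away
    the need to interface with the legacy system directly."""

    def __init__(self, legacy: LegacyPolicySystem) -> None:
        self._legacy = legacy

    def get_policy_status(self, policy_id: str) -> str:
        raw = self._legacy.QRY_POL_REC(policy_id)
        return raw.split("|")[2]  # new apps never parse legacy formats


service = PolicyService(LegacyPolicySystem())
print(service.get_policy_status("ABC-123"))  # -> "ACTIVE"
```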

Leave an Agile Legacy

You don’t have to be a technical expert to be astonished by the increasing potential for technologies to transform organizations. Fifty-one years ago, the United States piloted a spaceship to the surface of the moon using a 70-pound computer that could perform 14,245 calculations per second. In September of this year, Nvidia introduced a graphics processor that is more than 2.5 billion times as fast, weighing little more than a book.

Currently, high-capacity hard drives can store data at a cost of just over $15 per terabyte. When we first walked on the moon, storing that terabyte would have cost $1.7 billion in today’s dollars. This August, researchers pushed data through a single fiber-optic cable at a rate of 178 terabits per second — enough to transfer about 1,500 4K movies in the time it takes to say “one, Mississippi.” When the Apollo 11 astronauts splashed back down to Earth, the first computer-to-computer link hadn’t yet been invented. (It would come three months later, with Arpanet.)

Given the giant leaps forward that technology continually makes, it's now possible for companies to replace their legacy systems and aging computing platforms with systems that enable breakthroughs, not only in efficiency but also in expanded product and service offerings. The key is to recognize how your legacy systems must adapt in a world where accelerating change is the only constant.

How Organizations Can Build Analytics Agility

In an era of constant change, companies’ data and analytics capabilities must rapidly adapt to ensure that the business survives, never mind competes. Organizations seek insights from their data to inform strategic priorities in real time, yet much of the historical data and modeling formerly applied to predict future behavior and guide actions are proving to be far less predictive, or even irrelevant, in our current normal with COVID-19.

In order to survive through crises, proactively detect trends, and respond to new challenges, companies need to develop greater analytical agility. This agility comes from three areas: improving the quality and connections of the data itself, augmenting analytical “horsepower” at the organization level, and leveraging talent that is capable of bridging business needs with analytics to find opportunity in the data.

The Answers Are in the Data

The quest for better data is not new, but the cost of not having it is easier to substantiate and understand in a time of crisis. Gaps in data quality — whether it’s time-lagged, disconnected, insufficient in granularity, or poorly curated (rendering analysis slow or impossible) — become intolerable amid chaos when companies must act quickly. Crises can be opportunities to augment data quality and further enrich the data to better serve customers and the company.

Making the business case for data investments suddenly makes sense as business leaders live through data gap implications in real time. Monetizing data typically comes from four sources:

  • Connecting data with other data differently than before.
  • Getting new data sources, or more specific levels of the same data you already had.
  • Putting data to better or faster use than the competition.
  • Getting data faster.

Data and analytics leaders must frame investments in the current context and prioritize data investments wisely by taking a complete view of what is happening to the business across a number of functions. For example, customers bank very differently in a time of crisis, and this requires banks to change how they operate in order to accommodate them. The COVID-19 pandemic forced banks to take another look at the multiple channels their customers traverse — branches, mobile, online banking, ATMs — and how their comfort levels with each shifted. How customers bank, and what journeys they engage in at what times and in what sequence, are all highly relevant to helping them achieve their financial goals. The rapid collection and analysis of data from across channels, paired with key economic factors, provided context that allowed banks to better serve customers in the moment. New and different sources of information — be it transaction-level data, payment behaviors, or real-time credit bureau information — can help ensure that customer credit is protected and that fraudulent activity is kept at bay.

Every data and analytics team has a roster of data demands that outpace budgets. Foundational data investments often languish and are perpetually underfunded because their value is difficult to isolate and describe to others. Events such as COVID-19 pinpoint investments that are likely already on the list but lack sufficient organizational buy-in to propel them forward. As the adage goes, never let a good crisis go to waste — use it to enrich your data and customer understanding.

Augment Analytical Horsepower With Business Parameters

Experienced analysts recognize that all analysis requires a blend of art and science. However, the nature of crises and unusual events means that analysts simply do not have the standard observation windows on which to build their typical projections and formulate baselines. History is no longer as useful. Moreover, rapidly changing market conditions require teams to constantly readjust models and analytical approaches to stay current. This requires adequate datasense (a blend of analytical facts and business intuition) in order to set the right parameters and optimize the business.

We often think of business rules as the antithesis of data-driven decisions or models, because they introduce a layer of subjectivity into a world that thrives on objectivity. In an unrecognizable crisis, however, establishing these parameters is important for a business to function practically and make basic analytical decisions. In our banking example, consider the customer conversations happening within a branch network throughout COVID-19. Historically, there has been a set capacity available to handle and fulfill requests, and the nature of conversations — whether financial check-ins, mortgage renewals, or efforts to ensure that customers have the right financial solutions to meet their immediate needs based on emerging life events — has been known and model-driven.

In the current reality, that constraint-based optimization problem is turned on its head because customer needs are vastly different, staff availability is in flux, branch hours are changing, and orchestrating effective customer conversations requires new analytics. Augmenting the process with business rules helps define the parameters of what can be done. Next, the whole approach to customer conversation optimization needs to be recalibrated (across branches and every channel) based on circumstances that are changing daily because what you are optimizing for has changed. And although machine learning and automation can help, the vast aberration in the data from the pandemic means that it will take time for such solutions to adapt and become relevant.
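One minimal way to picture "augmenting the process with business rules": let a model score candidate conversations, then let rule-based parameters (today's staff capacity, a minimum-urgency threshold) cap what actually gets scheduled. All names and numbers below are invented for illustration:

```python
# Illustrative sketch: rank candidate customer conversations by model
# score, then apply business rules that encode the constraints the
# model cannot learn quickly while conditions change daily.
conversations = [  # (customer, model_score)
    ("cust-1", 0.91), ("cust-2", 0.84), ("cust-3", 0.77), ("cust-4", 0.60),
]

daily_capacity = 2     # business rule: staff availability today
min_score_rule = 0.70  # business rule: skip low-urgency outreach for now

eligible = [c for c in conversations if c[1] >= min_score_rule]
scheduled = sorted(eligible, key=lambda c: -c[1])[:daily_capacity]
print(scheduled)  # [('cust-1', 0.91), ('cust-2', 0.84)]
```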

This reveals a need for a different and augmented approach to analytics that suits the time. It necessitates a greater blend of “art and science,” of “(wo)man and machine,” and of business rules and models in order to navigate gray areas.

Business-Analytics Hybrids

Navigating uncertainty and responding to change requires an exceptional translation layer — a team of individuals whose skill sets blend a superior understanding of the business with the technical acumen to transform data into insight. The world is always hungry for these skills, and in the future they will make the difference between brands that are nimble and thrive and those that languish.

The COVID-19 pandemic has been characterized by rapid unpredictability and never-before-seen trends. For businesses to adapt and make key strategic changes in short order, they need teams with hybrid skills, capable of both finding opportunity in data and executing quickly and accurately when the business knows what it wants to do.

In a crisis, data and analytics can overwhelm leaders and prevent them from acting quickly on account of the sheer volume, pace, and continuous shifting of data. In order to break down emergent trends and properly contextualize them, organizational functions must come together in new ways. While few businesses ever anticipated needing to operate as fully remote workforces, many are seeing that the intense collaboration and connectedness of their people have formed strong virtual networks that are at the heart of their survival.

A Culture of Analytics and Business Collaboration

When teams come together to interrogate new, shifting data from multiple perspectives, they begin to gain comfort in establishing more “knowns” in an unknown world. The result is that leaders can make the best possible decisions in the most rapid time frame. Done well, this will help companies thrive in disruption and gain competitive advantage, but it requires a high level of analytical literacy throughout the business and, most important, a culture of collaboration.

Facilitating these ongoing exchanges can happen in a variety of forums, ranging from the formal (such as customer optimization forums and risk exchanges) to the informal (such as real-time dashboards and views of computations on the fly). It’s important that this collaboration is continuous, interactive, and inclusive, with both business and analytical teams present so that the data is properly interpreted and all stakeholders understand any actions that are required.

In order to detect and respond to disruptive events with agility, companies must increase their analytical fitness and develop strong muscle memory for when they are put to the test. Navigating the pandemic often feels like running a marathon at sprint speed into the dark, and an agile approach to data and analytics will be the headlamp that companies cannot do without.

“Requiem For Alonzo: The Human Toll of Police Brutality” | Theo E.J. Wilson | TEDxMileHigh

Alonzo Ashley was just 29 years old when he was killed at the hands of the Denver Police. In this heart-wrenching tribute, poet Theo E.J. Wilson explores the meaning of Black Lives Matter and imagines another world — one where the victims of police brutality live on. Theo E.J. Wilson (aka Lucifury) is a founding member of the Denver Slam Nuba team, which won the National Poetry Slam in 2011. He began his speaking career in the NAACP at the age of 15 and has always had a passion for social justice. Wilson is executive director of Shop Talk Live, Inc., an organization that uses the barbershop as a staging ground for community dialogue and healing. In 2017, he published his first book, The Law of Action. His TEDxMileHigh talk, “A black man goes undercover in the alt-right,” has over 10 million views online. This talk was given at a TEDx event using the TED conference format but independently organized by a local community. Learn more at https://www.ted.com/tedx