Search results for: marketing theory and applications
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 11349

10419 Application of Nanoparticles in Biomedical and MRI

Authors: Raziyeh Mohammadi

Abstract:

At present, nanoparticles are used for various biomedical applications where they facilitate laboratory diagnostics and therapeutics. The performance of nanoparticles for biomedical applications is often assessed by their narrow size distribution, suitable magnetic saturation, and low toxicity effects. Superparamagnetic iron oxide nanoparticles (SPIONs) have received great attention due to their applications as contrast agents for magnetic resonance imaging (MRI). Processes in tissue where the blood-brain barrier is intact are in this way shielded from contact with this conventional contrast agent and will only reveal changes in the tissue if they involve an alteration in the vasculature. This technique is very useful for detecting tumors and can even be used for detecting metabolic functional alterations in the brain, such as epileptic activity. SPIONs have found application in MRI and magnetic hyperthermia. Unlike bulk iron, SPIONs do not have remanent magnetization in the absence of an external magnetic field; therefore, a precise remote control over their action is possible.

Keywords: nanoparticles, MRI, biomedical, iron oxide, SPIONs

Procedia PDF Downloads 193
10418 A Collective Intelligence Approach to Safe Artificial General Intelligence

Authors: Craig A. Kaplan

Abstract:

If AGI proves to be a “winner-take-all” scenario where the first company or country to develop AGI dominates, then the first AGI must also be the safest. The safest, and fastest, path to Artificial General Intelligence (AGI) may be to harness the collective intelligence of multiple AI and human agents in an AGI network. This approach has roots in seminal ideas from four of the scientists who founded the field of Artificial Intelligence: Allen Newell, Marvin Minsky, Claude Shannon, and Herbert Simon. Extrapolating key insights from these founders of AI, and combining them with the work of modern researchers, results in a fast and safe path to AGI. The seminal ideas discussed are: 1) Society of Mind (Minsky), 2) Information Theory (Shannon), 3) Problem Solving Theory (Newell & Simon), and 4) Bounded Rationality (Simon). Society of Mind describes a collective intelligence approach that can be used with AI and human agents to create an AGI network. Information theory helps address the critical issue of how an AGI system will increase its intelligence over time. Problem Solving Theory provides a universal framework that AI and human agents can use to communicate efficiently, effectively, and safely. Bounded Rationality helps us better understand not only the capabilities of SuperIntelligent AGI but also how humans can remain relevant in a world where the intelligence of AGI vastly exceeds that of its human creators. Each key idea can be combined with recent work in the fields of Artificial Intelligence, Machine Learning, and Large Language Models to accelerate the development of a working, safe, AGI system.

Keywords: AI Agents, Collective Intelligence, Minsky, Newell, Shannon, Simon, AGI, AGI Safety

Procedia PDF Downloads 69
10417 A Theory-Based Analysis on Implications of Democracy in Cambodia

Authors: Puthsodary Tat

Abstract:

Democracy has been categorically accepted and used as a foreign and domestic policy agenda in the hope of peace, economic growth and prosperity for more than 25 years in Cambodia. However, the country is now in the grip of dictatorship, human rights violations, and prospective economic sanctions. This paper examines different perceptions and experiences of democratic assistance. In this study, the author employs discourse theory, idealism and realism as a theory-based methodology for debating and assessing the implications of democratization. Discourse theory is used to establish a platform for understanding discursive formations, bodies of knowledge and the games of truth of democracy. Idealist approaches give rational arguments for adopting key tenets that work well on the ground. In contrast, realism allows for some sweeping critiques of the utopian ideal and offers particular views on why Western hegemonic missions do not work well. From an idealist view, the research finds that Cambodian people still believe that democracy is a prima facie universality for peace, growth and prosperity. From a realist view, democratization is on the brink of death for three reasons. Firstly, there are tensions between Western and local discourses about democratic values and norms. Secondly, democratic tenets have been undermined by ruling party-controlled courts, corruption, structural oppression and political patronage-based institutions. The third pitfall is partly associated with foreign aid dependency and geopolitical power struggles in the region. Finally, the study offers a precise mosaic of democratic principles that may be used to avoid a future geopolitical and economic crisis.

Keywords: corruption, democracy, democratic principles, discourse theory, discursive formations, foreign aid dependency, games of truth, geopolitical and economic crisis, geopolitical power struggle, hegemonic mission, idealism, realism, utopian ideal

Procedia PDF Downloads 187
10416 Elastic and Plastic Collision Comparison Using Finite Element Method

Authors: Gustavo Rodrigues, Hans Weber, Larissa Driemeier

Abstract:

The prediction of post-impact conditions and of the behavior of the bodies during impact has been the object of several collision models. The formulation generally used is that of Hertz's theory, which dates from the 19th century. These models consider the repulsive force as proportional to the deformation of the bodies in contact and may also consider it proportional to the rate of deformation. The objective of the present work is to analyze the behavior of the bodies during impact using the Finite Element Method (FEM) with elastic and plastic material models. The main parameters to evaluate are the contact force, the time of contact and the deformation of the bodies. An advantage of using the FEM approach is the possibility of applying a plastic deformation to the model according to the material definition: the Johnson–Cook plasticity model is used, whose parameters are obtained through empirical tests on real materials. This model allows analyzing the permanent deformation caused by impact, a phenomenon observed in the real world depending on the forces applied to the body. These results are compared with each other and with the Hertz-theory-based model.
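
For orientation, the two constitutive relations named above take the following standard textbook forms (a reference sketch only; the exact expressions and parameter values used in the paper are not given in the abstract):

```latex
F_{\text{contact}} = k\,\delta^{3/2} + c\,\dot{\delta}
\qquad \text{(Hertz-type contact law, with an optional deformation-rate term)}

\sigma_y = \left(A + B\,\varepsilon_p^{\,n}\right)
           \left(1 + C\,\ln\dot{\varepsilon}^{*}\right)
           \left(1 - {T^{*}}^{\,m}\right)
\qquad \text{(Johnson--Cook flow stress)}
```

Here $\delta$ is the indentation depth, $\varepsilon_p$ the equivalent plastic strain, $\dot{\varepsilon}^{*}$ the normalized strain rate, and $T^{*}$ the homologous temperature; $A$, $B$, $C$, $n$, $m$, $k$ and $c$ are material-dependent constants.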

Keywords: collision, impact models, finite element method, Hertz Theory

Procedia PDF Downloads 158
10415 Modeling and Simulation of the Structural, Electronic and Magnetic Properties of Fe-Ni Based Nanoalloys

Authors: Ece A. Irmak, Amdulla O. Mekhrabov, M. Vedat Akdeniz

Abstract:

There is a growing interest in the modeling and simulation of magnetic nanoalloys by various computational methods. Magnetic crystalline/amorphous nanoparticles (NPs) are interesting materials from both the applied and fundamental points of view, as their properties differ from those of bulk materials and are essential for advanced applications such as high-performance permanent magnets, high-density magnetic recording media, drug carriers, sensors in biomedical technology, etc. As an important magnetic material, Fe-Ni based nanoalloys have promising applications in the chemical industry (catalysis, battery), aerospace and stealth industry (radar absorbing material, jet engine alloys), magnetic biomedical applications (drug delivery, magnetic resonance imaging, biosensor) and computer hardware industry (data storage). The physical and chemical properties of the nanoalloys depend not only on the particle or crystallite size but also on composition and atomic ordering. Therefore, computer modeling is an essential tool to predict structural, electronic, magnetic and optical behavior at atomistic levels and consequently reduce the time for the design and development of new materials with novel/enhanced properties. Although first-principles quantum mechanical methods provide the most accurate results, they require huge computational effort to solve the Schrödinger equation for only a few tens of atoms. On the other hand, the molecular dynamics method with appropriate empirical or semi-empirical inter-atomic potentials can give accurate results for the static and dynamic properties of larger systems in a short span of time. In this study, structural evolutions, magnetic and electronic properties of Fe-Ni based nanoalloys have been studied by using the molecular dynamics (MD) method in the Large-scale Atomic/Molecular Massively Parallel Simulator (LAMMPS) and Density Functional Theory (DFT) in the Vienna Ab initio Simulation Package (VASP). The effects of particle size (in the 2-10 nm particle size range) and temperature (300-1500 K) on the stability and structural evolutions of amorphous and crystalline Fe-Ni bulk/nanoalloys have been investigated by combining the MD simulation method with the Embedded Atom Model (EAM). EAM is applicable for Fe-Ni based bimetallic systems because it considers both the pairwise interatomic interaction potentials and electron densities. The structural evolution of Fe-Ni bulk and nanoparticles (NPs) has been studied by the calculation of radial distribution functions (RDF), interatomic distances, coordination numbers and core-to-surface concentration profiles, as well as Voronoi analysis and surface energy dependences on temperature and particle size. Moreover, spin-polarized DFT calculations were performed by using a plane-wave basis set with generalized gradient approximation (GGA) exchange and correlation effects in the VASP-MedeA package to predict magnetic and electronic properties of the Fe-Ni based alloys in bulk and nanostructured phases. The results of the theoretical modeling and simulations for the structural evolutions, magnetic and electronic properties of Fe-Ni based nanostructured alloys were compared with experimental and other theoretical results published in the literature.
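
As an illustration of one of the analyses mentioned above, here is a minimal Python sketch (not taken from the paper; the array layout, cubic periodic box and normalization convention are assumptions) of how a radial distribution function g(r) might be estimated from a single MD snapshot:

```python
import numpy as np

def radial_distribution(positions, box_length, r_max, n_bins=200):
    """Estimate g(r) for a cubic periodic box from one MD snapshot.

    positions: (N, 3) array of atomic coordinates
    box_length: edge length of the cubic simulation box
    """
    n_atoms = len(positions)
    # Minimum-image pairwise separation vectors
    diff = positions[:, None, :] - positions[None, :, :]
    diff -= box_length * np.round(diff / box_length)
    dist = np.linalg.norm(diff, axis=-1)
    dist = dist[np.triu_indices(n_atoms, k=1)]          # unique pairs only

    counts, edges = np.histogram(dist, bins=n_bins, range=(0.0, r_max))
    r = 0.5 * (edges[:-1] + edges[1:])
    shell_vol = 4.0 / 3.0 * np.pi * (edges[1:]**3 - edges[:-1]**3)
    density = n_atoms / box_length**3
    # Normalize by the ideal-gas expectation for each spherical shell
    g_r = counts / (shell_vol * density * n_atoms / 2.0)
    return r, g_r
```

In practice, g(r) would be averaged over many snapshots and split into Fe-Fe, Fe-Ni and Ni-Ni partial functions to follow the amorphous-to-crystalline evolution with temperature and particle size.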

Keywords: density functional theory, embedded atom model, Fe-Ni systems, molecular dynamics, nanoalloys

Procedia PDF Downloads 229
10414 Save Balance of Power: Can We?

Authors: Swati Arun

Abstract:

The present paper argues that Balance of Power (BOP) theory needs to be conjugated with certain contingencies like geography. It is evident that sea powers ('insular' for better clarity) are not balanced (if at all) in the same way as land powers. It is apparent that the artificial insularity the US has achieved reduces the chances of balancing (constant) and helps it maintain preponderance (variable). But how precise is this approach in assessing the dynamics between China's rise and the reaction of other powers and the US? The 'evolved' theory can be validated by putting China and the US in the equation. The systemic relation between nations was explained through Balance of Power theory well before systems theory was propounded. The BOP is the crux of the functionality of 'power relation' dynamics, which has played its role in the most astounding ways, leading to situations of war and peace. Whimsical but true, the BOP has remained a complicated and indefinable concept from Hans Morgenthau to Kenneth Waltz. A challenge of the BOP, however, remains: 'that it has too many meanings'. In recent times it has become evident that the myriad of expectations generated by BOP has not met the practicality of current world politics. It is for this reason that the BOP has been replaced by Preponderance Theory (PT) to explain the prevailing power situation. PT does provide empirical reasoning for its success but fails in the abstract logical reasoning required for making a theory universal. Unipolarity clarifies the current system as one where the balance of power has become redundant. It seems to reach beyond the contours of BOP, where a superpower does what it must to remain one. The centrality of this argument pivots around an exception: every time BOP fails to operate, preponderance of power emerges. PT does not sit well with the primary logic of a theory because it works on an exception. The evolution of a pattern and system in which BOP fails and preponderance emerges is absent. The puzzle here is whether BOP really has become redundant or whether it needs polishing. The international power structure changed from multipolar to bipolar to unipolar. BOP was looked to for the inevitable logic behind such changes and to answer the dilemma we see today: why is the US unchecked and unbalanced? But why was Britain unchecked in the 19th century, and why was China unbalanced in the 13th century? It is the insularity of the state that makes BOP reproduce an 'imbalance of power', going a level up from the off-shore balancer. This luxury of a state to maintain imbalance in the region of competition or threat is the causal relation between BOP and geography. America has applied imbalancing, meaning disequilibrium (in its favor), to maintain the regional balance so that over time the weaker does not get stronger and pose a competition. It could do so due to the significant disparity present between the US and the rest.

Keywords: balance of power, China, preponderance of power, US

Procedia PDF Downloads 264
10413 Algorithms for Run-Time Task Mapping in NoC-Based Heterogeneous MPSoCs

Authors: M. K. Benhaoua, A. K. Singh, A. E. Benyamina, P. Boulet

Abstract:

Mapping the parallelized tasks of applications onto MPSoCs can be done either at design time (static) or at run-time (dynamic). Static mapping strategies find the best placement of tasks at design time; hence, they are not suitable for dynamic workloads and are incapable of run-time resource management. The number of tasks or applications executing on an MPSoC platform can exceed the available resources, requiring efficient run-time mapping strategies to meet these constraints. This paper describes a new Spiral Dynamic Task Mapping heuristic for mapping applications onto NoC-based heterogeneous MPSoCs. The heuristic is based on a packing strategy and a routing algorithm, both also proposed in this paper. It tries to map the tasks of an application into a clustered region to reduce the communication overhead between communicating tasks; tasks that are most related to each other are placed in a spiral manner, and the best possible path, in terms of load, that minimizes the communication overhead is sought. In this context, we have realized a simulation environment for experimental evaluations, mapping applications with varying numbers of tasks onto an 8x8 NoC-based heterogeneous MPSoC platform. We demonstrate that the new mapping heuristic, together with the proposed modified Dijkstra routing algorithm, is capable of reducing the total execution time and energy consumption of applications when compared to state-of-the-art run-time mapping heuristics reported in the literature.
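
The abstract does not spell out the heuristic's details; as a rough Python illustration only (the tile ordering, start tile and task ordering are assumptions, not the authors' algorithm), the core idea of placing the communicating tasks of one application on a compact cluster of mesh tiles might look like this:

```python
def spiral_order(grid_w, grid_h, start_x, start_y):
    """Enumerate mesh tiles ring by ring around a start tile.

    Tiles are ordered by Manhattan (hop) distance to the start tile, so that
    the tasks of one application end up on a compact cluster of cores.
    """
    tiles = [(x, y) for x in range(grid_w) for y in range(grid_h)]
    # Sort by hop distance to the start tile, breaking ties deterministically.
    return sorted(tiles, key=lambda t: (abs(t[0] - start_x) + abs(t[1] - start_y), t))

def map_tasks(tasks, grid_w=8, grid_h=8, start=(3, 3)):
    """Greedy placement: tasks (assumed ordered most-communicating first)
    are assigned to the nearest free tiles around the chosen start tile."""
    order = spiral_order(grid_w, grid_h, *start)
    return {task: tile for task, tile in zip(tasks, order)}

# Example: ten communicating tasks of one application on an 8x8 mesh
print(map_tasks([f"t{i}" for i in range(10)]))
```

A real run-time mapper would additionally check tile occupancy and link loads before committing each placement, which is where the paper's packing strategy and modified Dijkstra routing come in.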

Keywords: multiprocessor system on chip, MPSoC, network on chip, NoC, heterogeneous architectures, run-time mapping heuristics, routing algorithm

Procedia PDF Downloads 475
10412 Frequency Interpretation of a Wave Function, and a Vertical Waveform Treated as A 'Quantum Leap'

Authors: Anthony Coogan

Abstract:

Born's probability interpretation of wave functions would have led to nearly identical results had he chosen a frequency interpretation instead. Logically, Born may have assumed that only one electron was under consideration, making it nonsensical to propose a frequency wave. Author's suggestion: the actual experimental results were not of a single electron; rather, they were groups of reflected x-ray photons. The vertical waveform used by Schrödinger in his Particle in the Box theory makes sense if it was intended to represent a quantum leap. The author extended the single vertical panel to form a bar chart: separate panels would represent different energy levels. The proposed bar chart would be populated by reflected photons. Expansion of basic ideas: Part of Schrödinger's 'Particle in the Box' theory may be valid despite negative criticism. The waveform used in the diagram is vertical, which may seem absurd because real waves decay at a measurable rate, rather than instantaneously. However, there may be one notable exception. Supposedly, following from the theory, the Uncertainty Principle was derived – may a quantum leap not be represented as an instantaneous waveform? The great Schrödinger must have had some reason to suggest a vertical waveform if the prevalent belief was that they did not exist. Complex wave forms representing a particle are usually assumed to be continuous. The actual observations made were x-ray photons, some of which had struck an electron, been reflected, and then moved toward a detector. From Born's perspective, doing similar work in the years in question, 1926-7, he would also have considered a single electron – leading him to choose a probability distribution. Probability distributions appear very similar to frequency distributions, but the former are considered to represent the likelihood of future events. Born's interpretation of the results of quantum experiments led (or perhaps misled) many researchers into claiming that humans can influence events just by looking at them, e.g. collapsing complex wave functions by 'looking at the electron to see which slit it emerged from', while in reality light reflected from the electron moved in the observer's direction after the electron had moved away. Astronomers may say that they 'look out into the universe' but are actually using logic opposed to the views of Newton and Hooke and many observers such as Romer, in that light carries information from a source or reflector to an observer, rather than the reverse. Conclusion: Due to the controversial nature of these ideas, especially their implications about the nature of complex numbers used in applications in science and engineering, some time may pass before any consensus is reached.

Keywords: complex wave functions not necessary, frequency distributions instead of wave functions, information carried by light, sketch graph of uncertainty principle

Procedia PDF Downloads 188
10411 Improved Simultaneous Performance in the Time Domain and in the Frequency Domain

Authors: Azeddine Ghodbane, David Bensoussan, Maher Hammami

Abstract:

An innovative approach for controlling unstable and invertible systems has demonstrated superior performance compared to conventional controllers. It has been successfully applied to a levitation system and to drone control. Simulations have yielded satisfactory performance when applied to a satellite antenna controller. This design method, based on sensitivity analysis, has also been extended to handle multivariable unstable and invertible systems that exhibit dominant diagonal characteristics at high frequencies, enabling decentralized control. Furthermore, this control method has been expanded to the realm of adaptive control. In this study, we introduce an alternative adaptive architecture that enhances both time- and frequency-domain performance, mitigating the effects of disturbances at the plant input and of external disturbances affecting the output. To facilitate superior performance in both the time and frequency domains, we have developed user-friendly interactive design methods using the GeoGebra platform.

Keywords: control theory, decentralized control, sensitivity theory, input-output stability theory, robust multivariable feedback control design

Procedia PDF Downloads 95
10410 Second Time’s a Charm: The Intervention of the European Patent Office on the Strategic Use of Divisional Applications

Authors: Alissa Lefebre

Abstract:

It might seem intuitive to hope for a fast decision on the patent grant. After all, a granted patent provides you with a monopoly position, which allows you to obstruct others from using your technology. However, this does not take into account the strategic advantages one can obtain from keeping patent applications pending. First, there is the financial advantage of postponing certain fees, although many applicants would probably agree that this is not the main benefit. As the scope of the patent protection is only decided upon at the grant, the pendency period introduces uncertainty amongst rivals. This uncertainty entails not knowing whether the patent will actually be granted and what the scope of protection will be. Consequently, rivals can only depend upon limited and uncertain information when deciding what technology is worth pursuing. One way to keep patent applications pending is the use of divisional applications. These applications can be filed out of a parent application as long as that parent application is still pending. This allows the applicant to pursue (part of) the content of the parent application in another application, as the divisional application cannot exceed the scope of the parent application. In a fast-moving and complex market such as tele- and digital communications, this might allow applicants to obtain an actual monopoly position, as competitors are discouraged from pursuing a certain technology. Nevertheless, this practice also has downsides. First of all, it has an impact on the workload of the examiners at the patent office. As the number of patent filings has been increasing over the last decades, using strategies that increase this number even more is not desirable from the patent examiners' point of view. Secondly, a pending patent does not provide you with the protection of a granted patent, thus creating uncertainty not only for the rivals but also for the applicant. Consequently, the European Patent Office (EPO) has come up with a 'raising the bar' initiative in which it has decided to tackle the strategic use of divisional applications. Over the past years, two rules have been implemented. The first rule, in 2010, introduced a time limit, under which divisional applications could only be filed within 24 months after the first communication with the patent office. However, after carrying out a user feedback survey, the EPO abolished the rule again in 2014 and replaced it with a fee mechanism. The fee mechanism is still in place today, which might be an indication of a better result compared to the first rule change. This study tests the impact of these rules on the strategic use of divisional applications in the tele- and digital communication industry and provides empirical evidence on their success. Using three different survival models, we find overall evidence that divisional applications prolong the pendency time and that only the second rule is able to tackle strategic patenting and thus decrease the pendency time.
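
The abstract does not name the three survival models used; purely as an illustration of the general approach, a minimal sketch of relating pendency time to the two EPO rule regimes with a Cox proportional hazards model might look like this in Python (the file name, column names and covariate coding are assumptions, not the authors' data):

```python
import pandas as pd
from lifelines import CoxPHFitter

# Hypothetical dataset: one row per patent application.
# pendency_months : time from filing to grant (or to censoring)
# granted         : 1 if granted, 0 if still pending (right-censored)
# is_divisional   : 1 for divisional applications
# rule_2010/rule_2014 : 1 if filed under the time-limit / fee regime
df = pd.read_csv("epo_applications.csv")

cph = CoxPHFitter()
cph.fit(
    df[["pendency_months", "granted", "is_divisional", "rule_2010", "rule_2014"]],
    duration_col="pendency_months",
    event_col="granted",
)
cph.print_summary()  # hazard ratios below 1 for a covariate indicate longer pendency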

Keywords: divisional applications, regulatory changes, strategic patenting, EPO

Procedia PDF Downloads 113
10409 Media Engagement and Ethnic Identity: The Case of the Aeta Ambala of Pastolan Village

Authors: Kriztine R. Viray, Chona Rita R. Cruz

Abstract:

The paper explores the engagement of the indigenous group Aeta Ambala with different media and how this engagement affects their perception of their own ethnic identity. The researchers employed qualitative research as their approach and the descriptive research method as their design. The paper integrates two theories: the communication theory of identity by Michael Hecht and the uses and gratifications theory of Katz, Blumler, and Gurevitch. Among others, the paper shows that the engagement of the Aeta Ambala with the various forms of media certainly affected the way they perceived the outside world and their own ethnic group.

Keywords: Aeta Ambala, culture, ethnic, media engagement, Philippines

Procedia PDF Downloads 474
10408 Healthcare Social Entrepreneurship: A Positive Theory Applied to the Case of YOU Foundation in Nepal

Authors: Simone Rondelli, Damiano Rondelli, Bishesh Poudyal, Juan Jose Cabrera-Lazarini

Abstract:

One of the main obstacles for social entrepreneurship is to find a business model that is financially sustainable. In other words, the captured value must generate enough cash flow to ensure business continuity and reinvestment for growth. Providing health services in poor countries to uninsured populations affected by a high-cost chronic disease is no exception to this challenge. As a prime example, cancer has become a high-impact global disease not only because of its high morbidity but also because of its financial impact on both the patient's family and health services in underdeveloped countries. Therefore, it is relevant to find a social entrepreneurship model that provides affordable treatment for this disease while maintaining healthy finances not only for the patient but also for the organization providing the treatment. Using the methodology of constructive research, this paper applied a positive theory and four business models of social entrepreneurship to the case of a private foundation whose mission is to address the challenge previously described. It was found that the foundation analyzed in this case is organized as an embedded business model and complies with the four propositions of the positive theory considered. It is recommended that this private foundation explore implementing the integrated business model to ensure more robust sustainability in the long term, as it can evolve into a scalable model that attracts investors interested in contributing to expanding this initiative globally.

Keywords: affordable treatment, global healthcare, social entrepreneurship theory, sustainable business model

Procedia PDF Downloads 122
10407 A Comparative Study of Generalized Autoregressive Conditional Heteroskedasticity (GARCH) and Extreme Value Theory (EVT) Model in Modeling Value-at-Risk (VaR)

Authors: Longqing Li

Abstract:

The paper addresses the inefficiency of the classical model in measuring Value-at-Risk (VaR) using a normal distribution or a Student's t distribution. Specifically, the paper focuses on the one-day-ahead Value-at-Risk (VaR) of major stock markets' daily returns in the US, UK, China and Hong Kong over the most recent ten years, at the 95% confidence level. To improve the predictive power and search for the best-performing model, the paper proposes using two leading alternatives, Extreme Value Theory (EVT) and a family of GARCH models, and compares their relative performance. The main contribution can be summarized in two aspects. First, the paper extends the GARCH family by incorporating EGARCH and TGARCH to shed light on the differences between them in estimating one-day-ahead Value-at-Risk (VaR). Second, to account for the non-normality in the distribution of financial markets, the paper applies the Generalized Error Distribution (GED), instead of the normal distribution, to govern the innovation term. A dynamic back-testing procedure is employed to assess the performance of each model, the GARCH family and the conditional EVT. The conclusion is that Exponential GARCH yields the best estimate in out-of-sample one-day-ahead Value-at-Risk (VaR) forecasting, while the performance difference between the GARCH models and the conditional EVT is indistinguishable.
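
As a minimal illustration of the kind of model described above (not the paper's actual code; the data file, column name and percent scaling are assumptions), an EGARCH(1,1) with GED innovations and a one-day-ahead 95% VaR could be sketched in Python with the `arch` package as follows:

```python
import numpy as np
import pandas as pd
from arch import arch_model

# Daily log-returns of an index, in percent (data source assumed)
returns = pd.read_csv("index_returns.csv", index_col=0, parse_dates=True)["ret"] * 100

# EGARCH(1,1) with a Generalized Error Distribution for the innovations
am = arch_model(returns, mean="Constant", vol="EGARCH", p=1, o=1, q=1, dist="ged")
res = am.fit(disp="off")

# One-day-ahead variance forecast
fcast = res.forecast(horizon=1)
sigma_next = np.sqrt(fcast.variance.values[-1, 0])
mu = res.params["mu"]

# 95% one-day-ahead VaR, using the empirical 5% quantile of the
# standardized residuals (filtered historical simulation)
q05 = np.quantile(res.std_resid.dropna(), 0.05)
var_95 = -(mu + sigma_next * q05)
print(f"One-day 95% VaR: {var_95:.2f}%")
```

Rolling this forecast through the sample day by day and counting VaR exceedances is the essence of the dynamic back-testing procedure mentioned in the abstract.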

Keywords: Value-at-Risk, Extreme Value Theory, conditional EVT, backtesting

Procedia PDF Downloads 305
10406 Affordances in Boating Performative Practices: The Case of Leisure Boating from the Swedish West Coast

Authors: Neva Leposa

Abstract:

While environmental policy makers are trying to increase pro-environmental behavior among tourists and outdoor recreation users by changing users' attitudes, the focus of this paper turns to the importance of a so-far marginalized factor: materiality in users' practices. The case study of leisure boating in Sweden used in this paper demonstrates how, through the change of materiality (i.e., the equipment and physical size of the leisure boats), emergent affordances in materially bound practices are transformed and the boater-boat-sea nexus is redefined. Participatory observation and in-depth interviewing of Swedish West Coast visitors reveal two stories: the first points to the fact that sail-boating practice is becoming increasingly motorized, and the second describes how leisure boats are becoming increasingly perceived and used as mobile summer houses. Hence, such practice increases energy and matter consumption. This paper describes how that change happens through practice theory and affordance theory, thus pointing to the visibility and importance of materiality in shaping the human-nature nexus. Boating practice changes through the change of the materiality of the boats; in particular, energy consumption increases through the change of engagement with matter. This study calls attitude-focused environmental strivings into question, as they are too individual-centered, lack a contextual understanding of materially bound practices, and may fail at the very thing they aim to do: reduce environmental impacts.

Keywords: practice theory, affordance theory, leisure boating, materiality

Procedia PDF Downloads 254
10405 Theory of Mind and Its Brain Distribution in Patients with Temporal Lobe Epilepsy

Authors: Wei-Han Wang, Hsiang-Yu Yu, Mau-Sun Hua

Abstract:

Theory of Mind (ToM) refers to the ability to infer another’s mental state. With appropriate ToM, one can behave well in social interactions. A growing body of evidence has demonstrated that patients with temporal lobe epilepsy (TLE) may have damaged ToM due to impact on regions of the underlying neural network of ToM. However, the question of whether there is cerebral laterality for ToM functions remains open. This study aimed to examine whether there is cerebral lateralization for ToM abilities in TLE patients. Sixty-seven adult TLE patients and 30 matched healthy controls (HC) were recruited. Patients were classified into right (RTLE), left (LTLE), and bilateral (BTLE) TLE groups on the basis of a consensus panel review of their seizure semiology, EEG findings, and brain imaging results. All participants completed an intellectual test and four tasks measuring basic and advanced ToM. The results showed that, on all ToM tasks: (1) each patient group performed worse than HC; (2) there were no significant differences between LTLE and RTLE groups; (3) the BTLE group performed the worst. It appears that the neural network responsible for ToM is distributed evenly between the cerebral hemispheres.

Keywords: cerebral lateralization, social cognition, temporal lobe epilepsy, theory of mind

Procedia PDF Downloads 407
10404 Romantic Theory in Comparative Perspective: Schlegel’s Philosophy of History and the Spanish Question

Authors: Geena Kim

Abstract:

The Romantic movements in Spain and Germany served as turning points in European literary history, advancing cognitive-emotional ideals of the essential unity between literature, life, and the natural world in reaction against the rising tide of mechanization, urban growth, and industrial progress. This paper offers a comparative study of the literary-theoretic underpinnings of the Romantic movements in Spain and Germany, particularly with regard to the reception history of Schlegel's Romantic philosophy of history. By far one of the better-known figures of the period, Schlegel has traditionally been considered one of the principal theorists of German Romanticism, one of the first to embrace and acknowledge the more radical changes that the movement brought forth. His well-studied contributions to German Romanticism were certainly significant domestically, but their impact on comparatively less industrialized Spain has been largely neglected, a puzzling oversight in light of Schlegel's extensive efforts in advocating for the dissemination of Spanish literature under the guise of a kind of pan-European Romanticism. Indeed, Schlegel's somewhat problematically exoticizing view of Spain as the quintessential embodiment of the spirit of Romanticism was itself enormously influential on the genesis and growth of Spanish Romantic theory. This was especially significant considering earlier, pre-Romantic tropes of the 'black legend,' by which Spain was demonized with even cruder essentializing, nationalistic language. By comparing Schlegel's theorizing around Spain with contributions to Romantic theory by Hispanophone writers, this paper sheds light on questions of linguistic identity and national influence from two, alas, infrequently compared contexts of European Romanticism.

Keywords: Schlegel, Spanish Romantic theory, German Romanticism, Romantic philosophy

Procedia PDF Downloads 180
10403 Clash of Civilizations without Civilizational Groups: Revisiting Samuel P. Huntington´s Clash of Civilizations Theory

Authors: Jamal Abdi

Abstract:

This paper is largely a response to and critique of Samuel P. Huntington's Clash of Civilizations thesis. The overriding argument is that Huntington's thesis is characterized by a failure to distinguish between 'groups' and 'categories'. Multinational civilizations overcoming their internal collective action problems, which would enable them to pursue a unified strategy vis-à-vis the West, is a rather foundational assumption in his theory. Without assigning sufficient intellectual attention to the processes through which multinational civilizations may gain the capacity for concerted action, i.e., become a group, he contended that the post-cold-war world would be shaped in large measure by interactions among seven or eight major civilizations. Thus, failure to provide a convincing analysis of multinational civilizations' transition from categories to groups is a significant weakness in Huntington's clash theory. It is also suggested that so-called Islamic terrorism and the war on terror are not to be taken as an expression of the presence of a clash between a Western and an Islamic civilization, as terrorist organizations would be superfluous in a world characterized by a clash of civilizations. The consequences of multinational civilizations becoming a group are discussed in relation to contemporary Western superiority.

Keywords: categories, civilizations, clash, groups, groupness

Procedia PDF Downloads 157
10402 Localising the Alien: Language, Literature and Theory in the Indian Classroom

Authors: Asima Ranjan Parhi

Abstract:

English language teaching and learning in higher education departments in Indian and Asian contexts needs to be a matter of innovation and experimentation rather than rigid prescription. Communicative language teaching has proposed that context is of primary importance in this process. Today, English print and electronic media have flooded the market with plenty of material suitable to the classroom context. The entries are poetic, catchy and contain a deliberate method in them which could be utilized to teach not only the English language but also literature, literary terms and the theory of literature. Bollywood movies, especially through their songs, have been propagating a package which may be useful for teaching language and even theory in the subcontinent. While investigating, one may be fascinated to see how such material in the body of media (print and electronic), movies and popular songs generates data for our classroom in our context, thereby developing a mass language with huge pedagogical implications. Harping on the four skills of teaching and learning of a language in general, and of English in particular, appears stale and mechanical in a decontextualised, matter-of-fact classroom. So this discussion visualizes a model beyond these skills, as well as beyond conventional theory, literature and language classroom practices, in order to build up a systematic pattern stressing the factors responsible in the particular context, that of a specific language, society and culture in tune with language-literature teaching. This study intends to examine certain catchy language entries in the mass media, which could invite more such investigations in the Asian context in order to develop a common platform of decolonized pedagogy.

Keywords: pedagogy, electronic media, Bollywood, decolonized, mass media

Procedia PDF Downloads 260
10401 Dual Language Immersion Models in Theory and Practice

Authors: S. Gordon

Abstract:

Dual language immersion is growing fast in language teaching today. This study provides an overview and evaluation of the different models of dual language immersion programs in US K-12 schools. First, the paper provides a brief review of the current literature on the theory of Dual Language Immersion (DLI) in Second Language Acquisition (SLA) studies. Second, examples of several types of DLI language teaching models in US K-12 public schools are presented (including 50/50 models, 90/10 models, etc.). Third, we focus on the unique example of DLI education in the state of Utah, a successful, growing program in K-12 schools that includes French, Chinese, Spanish, and Portuguese. The project investigates the theory and practice, particularly the case of public elementary and secondary school children who study half their school day in the L1 and the other half in the chosen L2, from kindergarten (age 5-6) through high school (age 17-18). Finally, the project takes observations of Utah French DLI elementary through secondary programs as a case study. To conclude, we look at the principal challenges, pedagogical objectives and outcomes, and important implications for other US states and other countries (such as France currently) that are in the process of developing similar language learning programs.

Keywords: dual language immersion, second language acquisition, language teaching, pedagogy, teaching, French

Procedia PDF Downloads 155
10400 Quantum Dot – DNA Conjugates for Biological Applications

Authors: A. Banerjee, C. Grazon, B. Nadal, T. Pons, Y. Krishnan, B. Dubertret

Abstract:

Quantum Dots (QDs) have emerged as novel fluorescent probes for biomedical applications. The photophysical properties of QDs, such as broad absorption, a narrow emission spectrum, reduced blinking, and enhanced photostability, make them advantageous over organic fluorophores. However, for some biological applications, QDs need to be first targeted to specific intracellular locations. In parallel, the base-pairing properties and biocompatibility of DNA have been extensively used for biosensing, targeting and intracellular delivery of numerous bioactive agents. The combination of the photophysical properties of QDs and the targetability of DNA has yielded fluorescent, stable and targetable nanosensors. QD-DNA conjugates have been used in drug delivery, siRNA delivery, intracellular pH sensing and several other applications, and continue to be an active area of research. In this project, a novel method to synthesize QD-DNA conjugates and their applications in bioimaging are investigated. QDs are first solubilized in water using a thiol-based amphiphilic co-polymer and then conjugated to amine-functionalized DNA using a heterobifunctional linker. The conjugates are purified by size exclusion chromatography and characterized by UV-Vis absorption and fluorescence spectroscopy, electrophoresis and microscopy. Parameters that influence the conjugation yield, such as reducing agents, salt excess and pH, have been investigated in detail. Under optimized reaction conditions, up to 12 single-stranded DNA molecules (15-mer length) can be conjugated per QD. After conjugation, the QDs retain their colloidal stability and high quantum yield, and the DNA is available for hybridization. The reaction has also been successfully tested on QDs emitting different colors and on gold nanoparticles, and is therefore highly generalizable. After extensive characterization and robust synthesis of QD-DNA conjugates in vitro, the physical properties of these conjugates in the cellular milieu are being investigated. Modification of the QD surface with DNA appears to remarkably alter the fate of QDs inside cells and can have potential implications for therapeutic applications.

Keywords: bioimaging, cellular targeting, drug delivery, photostability

Procedia PDF Downloads 409
10399 The Use of TRIZ to Map the Evolutive Pattern of Products

Authors: Fernando C. Labouriau, Ricardo M. Naveiro

Abstract:

This paper presents a model for mapping the evolutive pattern of products in order to generate new ideas, to perceive emerging technologies and to manage product portfolios in new product development (NPD). According to the proposed model, the information extracted from the patent system is filtered and analyzed with TRIZ tools to produce the input information for the NPD process. The authors acknowledge that the NPD process is well integrated within enterprises' strategic business planning and that new products are vital in today's competitive market. On the other hand, the proactive use of patent information has been observed in some methodologies for selecting projects, mapping technological change and generating product concepts. One of these methodologies is TRIZ, a theory created to favor innovation and to improve product design, which provided the analytical framework for the model. Initially, an introduction to TRIZ is presented, mainly focused on the patterns of evolution of technical systems and their strategic uses; this is a brief and absolutely non-comprehensive description, as the theory has several other tools that are widely employed in technical and business applications. Then, the model for mapping the products' evolutive pattern is introduced, with its three basic pillars, namely patent information, TRIZ and NPD, and the methodology for implementation. Following this, a case study of a Brazilian bicycle manufacturer is presented, carrying out the mapping of a product's evolutive pattern by decomposing and analyzing one of its assemblies along ten evolution lines in order to envision opportunities for further product development. Some of these lines are illustrated in more detail to evaluate the features of the product in relation to the TRIZ concepts, using a comparative perspective with patents in the state of the art to validate the product's evolutionary potential. As a result, the case study provided several opportunities for a product improvement development program in different project categories, identifying technical and business impacts as well as indicating the lines of evolution that can most benefit from each opportunity.

Keywords: product development, patents, product strategy, systems evolution

Procedia PDF Downloads 481
10398 Construction Contractor Pre-Qualification Using Multi-Attribute Utility Theory: A Multiplicative Approach

Authors: B. Vikram, Y. Anu Leena, Y. Anu Neena, M. V. Krishna Rao, V. S. S. Kumar

Abstract:

The construction industry is often criticized for inefficiencies in outcomes such as time and cost overruns, low productivity, poor quality and inadequate customer satisfaction. To enhance the chances for construction projects to be successful, selecting an able contractor is one of the fundamental decisions to be made by clients. The selection of the most appropriate contractor is a multi-criteria decision making (MCDM) process. In this paper, multi-attribute utility theory (MAUT) is employed, utilizing the multiplicative form of the utility function, for ranking the prequalified contractors. Performance assessment criteria covering contracting company attributes, experience record, past performance, performance potential, financial stability and project-specific criteria are considered for contractor evaluation. A case study of a multistoried building for which four contractors submitted bids is considered to illustrate the applicability of the multiplicative approach of MAUT to rank the prequalified contractors. The proposed MAUT decision-making methodology can also be employed in other decision-making situations.
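
The abstract does not reproduce the multiplicative utility function itself; for orientation, the standard Keeney-Raiffa multiplicative form usually meant by this approach is (a reference sketch, not necessarily the exact formulation used in the paper):

```latex
U(x) = \frac{1}{K}\left[\prod_{i=1}^{n}\bigl(1 + K\,k_i\,u_i(x_i)\bigr) - 1\right],
\qquad
1 + K = \prod_{i=1}^{n}\bigl(1 + K\,k_i\bigr),
```

where the $u_i$ are single-attribute utilities scored for each contractor on each criterion, the $k_i$ are their scaling constants, and $K$ is the master scaling constant obtained from the consistency condition on the right; contractors are then ranked by $U(x)$.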

Keywords: multi-attribute utility theory, construction industry, prequalification, contractor

Procedia PDF Downloads 421
10397 Foundations for Global Interactions: The Theoretical Underpinnings of Understanding Others

Authors: Randall E. Osborne

Abstract:

In a course on International Psychology, 8 theoretical perspectives (Critical Psychology, Liberation Psychology, Post-Modernism, Social Constructivism, Social Identity Theory, Social Reduction Theory, Symbolic Interactionism, and Vygotsky's Sociocultural Theory) are used as a framework for getting students to understand the concept of and need for globalization. One of critical psychology's main criticisms of conventional psychology is that it fails to consider or deliberately ignores the way power differences between social classes and groups can impact the mental and physical well-being of individuals or groups of people. Liberation psychology, also known as liberation social psychology or psicología social de la liberación, is an approach to psychological science that aims to understand the psychology of oppressed and impoverished communities by addressing the oppressive sociopolitical structure in which they exist. Postmodernism is largely a reaction to the assumed certainty of scientific, or objective, efforts to explain reality. It stems from a recognition that reality is not simply mirrored in human understanding of it, but rather is constructed as the mind tries to understand its own particular and personal reality. Lev Vygotsky argued that all cognitive functions originate in, and must therefore be explained as products of, social interactions, and that learning is not simply the assimilation and accommodation of new knowledge by learners. Social Identity Theory discusses the implications of social identity for human interactions with and assumptions about other people. Social Identification Theory suggests people: (1) categorize—people find it helpful (humans might be perceived as having a need) to place people and objects into categories, (2) identify—people align themselves with groups and gain identity and self-esteem from them, and (3) compare—people compare themselves to others. Social reductionism argues that all behavior and experiences can be explained simply by the effect of groups on the individual. Symbolic interaction theory focuses attention on the way that people interact through symbols: words, gestures, rules, and roles. Meaning evolves from humans' interactions with their environment and with other people. Vygotsky's sociocultural theory of human learning describes learning as a social process and locates the origination of human intelligence in society or culture. The major theme of Vygotsky's theoretical framework is that social interaction plays a fundamental role in the development of cognition. This presentation will discuss how these theoretical perspectives are incorporated into a course on International Psychology, a course on the Politics of Hate, and a course on the Psychology of Prejudice, Discrimination and Hate to promote student thinking in a more 'global' manner.

Keywords: globalization, international psychology, society and culture, teaching interculturally

Procedia PDF Downloads 233
10396 On the Use of Reliability Factors to Reduce Conflict between Information Sources in Dempster-Shafer Theory

Authors: A. Alem, Y. Dahmani, A. Hadjali, A. Boualem

Abstract:

Managing the problem of conflict, whether by using Dempster-Shafer theory or by applying the fusion process, has pushed researchers in recent years to find ways to make better decisions, especially for information systems, vision, robotics and wireless sensor networks. In this paper, we are interested in taking the conflict into account in the combination step and in managing it in such a way that it does not influence the decision step, when the conflict comes from reliable sources. According to [1], conflict leads to erroneous decisions in cases where its degree between information sources is strong; if the conflict exceeds the maximum of the belief mass functions, K > max_{i=1..n}(m_i(A)), then the decision becomes impossible. We demonstrate in this paper that the multiplication of mass functions by reliability coefficients is a decreasing function; it leads to the reduction of conflict and to a good decision. We define the reliability coefficients accurately and multiply them by the mass functions of each information source to reduce the conflict and allow a decision to be made whatever the degree of conflict. The evaluation of this technique is done through a use case: a comparison of the combination of sources with maximum conflict, without and with reliability coefficients.
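
To make the mechanism concrete, here is a minimal Python sketch (illustrative only; the frame of discernment, mass values and reliability coefficient are assumptions) of classical reliability discounting followed by Dempster's rule, showing how scaling the masses of each source reduces the conflict K:

```python
from itertools import product

def discount(mass, alpha, frame):
    """Reliability discounting: scale each focal element by the reliability
    coefficient alpha and transfer the remaining mass to the whole frame."""
    discounted = {focal: alpha * m for focal, m in mass.items()}
    discounted[frame] = discounted.get(frame, 0.0) + (1.0 - alpha)
    return discounted

def dempster_combine(m1, m2):
    """Dempster's rule of combination over frozenset focal elements.
    Returns the normalized combined mass and the conflict K."""
    combined, conflict = {}, 0.0
    for (a, ma), (b, mb) in product(m1.items(), m2.items()):
        inter = a & b
        if inter:
            combined[inter] = combined.get(inter, 0.0) + ma * mb
        else:
            conflict += ma * mb          # mass assigned to the empty set
    return {f: m / (1.0 - conflict) for f, m in combined.items()}, conflict

# Toy example: two highly conflicting sources over the frame {A, B}
frame = frozenset({"A", "B"})
m1 = {frozenset({"A"}): 0.9, frame: 0.1}
m2 = {frozenset({"B"}): 0.9, frame: 0.1}

_, k_raw = dempster_combine(m1, m2)
_, k_disc = dempster_combine(discount(m1, 0.7, frame), discount(m2, 0.7, frame))
print(f"conflict without discounting: {k_raw:.2f}, with discounting: {k_disc:.2f}")
```

With the assumed values the conflict drops from 0.81 to about 0.40, which illustrates the abstract's claim that multiplying the masses by reliability coefficients reduces the conflict before the decision step.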

Keywords: Dempster-Shafer theory, fusion process, conflict managing, reliability factors, decision

Procedia PDF Downloads 410
10395 The Effects of Normal Aging on Reasoning Ability: A Dual-Process Approach

Authors: Jamie A. Prowse Turner, Jamie I. D. Campbell, Valerie A. Thompson

Abstract:

The objective of the current research was to use a dual-process theory framework to explain age-related differences in reasoning. Seventy-two older (M = 80.0 years) and 72 younger (M = 24.6 years) adults were given a variety of reasoning tests (i.e., a syllogistic task, a base rate task, the Cognitive Reflection Test, and a perspective manipulation), as well as independent tests of capacity (working memory, processing speed, and inhibition), thinking styles, and metacognitive ability, to account for these age-related differences. It was revealed that age-related differences were limited to problems that required Type 2 processing and were related to differences in cognitive capacity, individual difference factors, and strategy choice. Furthermore, older adults' performance can be improved by reasoning from another's perspective and cannot, at this time, be explained by metacognitive differences between young and older adults. All of these findings fit well within a dual-process theory of reasoning, which provides an integrative framework accounting for both previous findings and the findings presented in the current manuscript.

Keywords: aging, dual-process theory, performance, reasoning ability

Procedia PDF Downloads 176
10394 Moral Dilemmas, Difficulties in the Digital Games

Authors: YuPei Chang

Abstract:

In recent years, moral judgement tasks have served as an increasingly popular plot mechanism in digital gameplay. As a moral agent, the player, in making choices and judgments in digital games, shuttles between the real world and the game world. The purpose of the research is to explore the moral difficulties created by the interactive mechanisms of games and the moral choices of players. At the theoretical level, this research tries to combine moral disengagement, moral foundations theory, and gameplay as an aesthetic experience. At the methodological level, this research uses methods that combine text analysis, the diary method, and in-depth interviews. Three research problems will be addressed in three stages. In the first stage, this project will explore how moral dilemmas are represented in game mechanics. In the second stage, this project will analyze the appearance and conflicts of moral dilemmas in game mechanics based on the five aspects of moral foundations theory. In the third stage, this project will try to understand the players' choices when they face moral dilemmas, as well as their explanations and reflections after making the decisions.

Keywords: morality, moral disengagement, moral foundations theory, PC game, gameplay, moral dilemmas, player

Procedia PDF Downloads 64
10393 X-Ray Fluorescence Molecular Imaging with Improved Sensitivity for Biomedical Applications

Authors: Guohua Cao, Xu Dong

Abstract:

X-ray Fluorescence Molecular Imaging (XFMI) holds great promise as a low-cost molecular imaging modality for biomedical applications with high chemical sensitivity. However, for in vivo biomedical applications, a key technical bottleneck is the relatively low chemical sensitivity of XFMI, especially at a reasonably low radiation dose. In laboratory x-ray source based XFMI, one of the main factors that limits the chemical sensitivity is the scattered x-rays. We will present our latest findings on improving the chemical sensitivity of XFMI using excitation beam spectrum optimization. XFMI imaging experiments on two mouse-sized phantoms were conducted at three different excitation beam spectra. Our results show that the minimum detectable concentration (MDC) of iodine can readily be improved by a factor of five via excitation spectrum optimization. Findings from this investigation could find use for in vivo pre-clinical small-animal XFMI in the future.

Keywords: molecular imaging, X-ray fluorescence, chemical sensitivity, X-ray scattering

Procedia PDF Downloads 169
10392 Lean Commercialization: A New Dawn for Commercializing High Technologies

Authors: Saheed A. Gbadegeshin

Abstract:

Lean Commercialization (LC) is the transformation of new technologies and knowledge into products and services through the application of lean/agile principles. These principles focus on how resources can be minimized during the development, manufacturing, and marketing of new products/services that can be accepted by customers. To understand how LC has been employed by technology-based companies, a case study approach was used, interviewing the founders, observing their high technologies, and interviewing commercialization experts. Two serial entrepreneurs were interviewed in 2012, and their commercialized technologies were monitored from 2012 to 2016. Some results were collected, but to validate the commercialization strategies of these entrepreneurs, four commercialization experts were interviewed in 2017. Initial results, observation notes, and experts' opinions were analyzed qualitatively. The final findings showed that the entrepreneurs applied LC unknowingly, while the experts were aware of LC. Similarly, the entrepreneurs used LC due to financial constraints and their need for success. Additionally, their commercialization practices revealed that LC appeared to be one of their commercialization strategies. Thus, their practices were analyzed, and a framework was developed. Furthermore, the experts noted that LC is a new dawn, which technologists and scientists need to consider for their high technology commercialization. This article contributes to the theory and practice of commercialization. Theoretically, the framework adds value to the commercialization discussion. Practically, the framework can be used by technology entrepreneurs (technologists and scientists), technology-based enterprises, and technology entrepreneurship educators as a guide in their commercialization adventures.

Keywords: lean commercialization, high technologies, lean start-up, technology-based companies

Procedia PDF Downloads 146
10391 The Searching Artificial Intelligence: Neural Evidence on Consumers' Less Aversion to Algorithm-Recommended Search Product

Authors: Zhaohan Xie, Yining Yu, Mingliang Chen

Abstract:

As research has shown a convergent tendency toward aversion to AI recommendation, it is imperative to find a way to promote AI usage and better harness the technology. In the context of e-commerce, this study has found evidence that people show less avoidance of algorithms when they recommend search products compared to experience products. This is due to people's different attribution of mind to AI versus humans, as suggested by mind perception theory. While people hold the belief that an algorithm possesses sufficient capability to think and calculate, which makes it competent to evaluate search product attributes that can be assessed before actual use, they doubt its capability to sense and feel, which is essential for evaluating experience product attributes that must be assessed after personal experience. The results of the behavioral investigation (Study 1, N=112) confirmed that consumers show low purchase intention toward experience products recommended by AI. A further consumer neuroscience study (Study 2, N=26) using event-related potentials (ERP) showed that consumers have a higher level of cognitive conflict when faced with an AI-recommended experience product, as reflected by a larger N2 component, while the effect disappears for search products. This research has implications for the effective employment of AI recommenders, and it extends the literature on e-commerce and marketing communication.

Keywords: algorithm recommendation, consumer behavior, e-commerce, event-related potential, experience product, search product

Procedia PDF Downloads 117
10390 Iontophoretic Drug Transport: An Non-Invasive Transdermal Approach

Authors: Ashish Jain, Shivam Tayal

Abstract:

There has been great interest in the field of iontophoresis in recent years due to its applications in controlled transdermal drug delivery systems. It is a technique used to enhance the transdermal permeation of ionized, high-molecular-weight molecules across the skin membrane, especially peptides and proteins, by the application of a direct current of 1-4 mA for 20-40 minutes, with the drug placed on the electrode of the same charge. Iontophoresis enhances the delivery of drugs into the skin via pores such as hair follicles and sweat gland ducts, rather than through the stratum corneum. It has wide applications in experimental, therapeutic, diagnostic and dental fields. Medical science is using it to treat hyperhidrosis (excessive sweating) of the hands and feet and to treat other ailments such as hypertension, migraine, etc. Nowadays, commercial transdermal iontophoretic patches are available on the market to treat different ailments. Researchers are keen to work in this field due to its vast applications and advantages.

Keywords: iontophoresis, novel drug delivery, transdermal, permeation enhancer

Procedia PDF Downloads 237