Search results for: epistemological functions
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 2505

2055 Smartphone Addiction and Reaction Time in Geriatric Population

Authors: Anjali N. Shete, G. D. Mahajan, Nanda Somwanshi

Abstract:

Context: Smartphones are the latest generation of mobile phones and have emerged over the last few years. Technology has developed so much that it has become part of our lives, and mobile phones are one such technology. Smartphones are equipped with the capabilities to display photos, play games, watch videos, navigate, etc. These advances have had a huge impact on many walks of life. The adoption of new technology has been challenging for the elderly, but the older population is also moving towards digitally connected lives. As age advances, the motor and cognitive functions of the brain decline, and reaction time is therefore affected. The study was undertaken to assess the usefulness of smartphones in improving cognitive functions. Aims and Objectives: The aim of the study was to observe the effects of smartphone addiction on reaction time in the elderly population. Material and Methods: This is an experimental study. 100 elderly subjects from urban areas were enrolled randomly. All were using smartphones for several hours a day. They were divided into two groups according to their scores on the Mobile Phone Addiction Scale (MPAS). Simple reaction time was estimated by the ruler drop method and then calculated for each subject in both groups. The data were analyzed using the mean, standard deviation, and Pearson correlation test. Results: The mean reaction time was 0.27 ± 0.040 s in Group A and 0.20 ± 0.032 s in Group B, a statistically significant difference. Conclusion: Group A, with a high MPAS score, has a low reaction time compared to Group B, with a low MPAS score. Hence, it can be concluded that smartphone use in the elderly is beneficial, delaying neurological decline and sharpening the brain.
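The ruler drop method mentioned above converts the distance a falling ruler travels before being caught into a reaction time via the free-fall relation t = sqrt(2d/g). A minimal sketch of that conversion (the catch distances below are illustrative, not the study's data):

```python
import math
import statistics

def reaction_time(drop_distance_m, g=9.81):
    """Convert a ruler catch distance (metres) to a simple reaction time (seconds)
    using the free-fall relation d = (1/2) * g * t**2, i.e. t = sqrt(2 * d / g)."""
    return math.sqrt(2 * drop_distance_m / g)

# Illustrative catch distances (m) for two hypothetical groups -- not the study's data.
group_a_distances = [0.34, 0.36, 0.38]
group_b_distances = [0.18, 0.20, 0.22]

group_a_rt = [reaction_time(d) for d in group_a_distances]
group_b_rt = [reaction_time(d) for d in group_b_distances]

print(statistics.mean(group_a_rt), statistics.stdev(group_a_rt))
print(statistics.mean(group_b_rt), statistics.stdev(group_b_rt))
```

Note that a catch distance of about 0.20 m corresponds to roughly 0.20 s, the order of magnitude reported in the results.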

Keywords: smartphones, MPAS, reaction time, elderly population

Procedia PDF Downloads 155
2054 Intonation Salience as an Underframe to Text Intonation Models

Authors: Tatiana Stanchuliak

Abstract:

It is common knowledge that intonation is not laid over a ready-made text. On the contrary, intonation forms and accompanies the text at the moment of its birth in the speaker’s mind. As a result, intonation plays one of the fundamental roles in the process of transferring a thought into external speech. Intonation structure can highlight the semantic significance of textual elements and become a ranging mark in understanding the information structure of the text. Intonation operates by means of prosodic characteristics, one of which is intonation salience, whose function in texts is to make some textual elements more prominent than others. This function of intonation is therefore an organizing one: it helps to form the frame of key elements of the text. The study under consideration attempted to look into the inner nature of salience and to create a text intonation model. This general goal led to several more specific intermediate results. First, degrees of salience were established at the level of the smallest semantic element, the intonation group, and the prosodic means of creating salience were examined. Second, the most frequent combinations of prosodic means made it possible to distinguish patterns of salience, which then became constituent elements of a text intonation model. Third, the analysis of the predicate structure made it possible to divide the whole text into smaller parts, or units, which perform a specific function in the development of the general communicative intention. It appeared that such units can be found in any text and that they share common characteristics of intonation arrangement. These findings are certainly very important both for the theory of intonation and for its practical application.

Keywords: accentuation, inner speech, intention, intonation, intonation functions, models, patterns, predicate, salience, semantics, sentence stress, text

Procedia PDF Downloads 241
2053 Examining Risk Based Approach to Financial Crime in the Charity Sector: The Challenges and Solutions, Evidence from the Regulation of Charities in England and Wales

Authors: Paschal Ohalehi

Abstract:

Purpose - The purpose of this paper, which is part of a PhD thesis, is to examine the role of the risk-based approach in minimising financial crime in the charity sector, to offer recommendations for improving the quality of charity regulation whilst retaining the risk-based approach as a regulatory framework, and to make a case for a new regulatory model. The increase in financial crimes in the charity sector has put the role of regulation in minimising financial crime up for debate amongst researchers and practitioners. Although previous research has addressed the regulation of charities, research on the role of the risk-based approach in minimising financial crime in the charity sector is limited. Financial crime is a concern for all organisations, including charities. Design/methodology/approach - This research adopts a social constructionist epistemological position. It was carried out using semi-structured in-depth interviews with 24 randomly selected charity trustees divided into three classes: 10 small charities, 10 medium charities, and 4 large charities. The researcher also interviewed 4 stakeholders in the charity sector (the NFA, the Charity Commission, and two police forces differing in size and area of coverage). Findings - The results of this research show that reliance on the risk-based approach to financial crime in the sector is weak and fragmented, with clear evidence of a disconnect between the regulator and the regulated, leading to little or no regulation of trustees’ activities, limited monitoring of charities, and a lack of training and awareness of financial crime in the sector. Originality - This paper shows how the regulation of charities in general, and the risk-based approach in particular, can be improved in order to meet the expectations of the stakeholders, the public, the regulator, and the regulated.

Keywords: risk, risk based approach, financial crime, fraud, self-regulation

Procedia PDF Downloads 354
2052 Practical Challenges of Tunable Parameters in MATLAB/Simulink Code Generation

Authors: Ebrahim Shayesteh, Nikolaos Styliaras, Alin George Raducu, Ozan Sahin, Daniel Pombo VáZquez, Jonas Funkquist, Sotirios Thanopoulos

Abstract:

One of the important requirements in many code generation projects is defining some of the model parameters as tunable. This makes it possible to update the model parameters without performing the code generation again. This paper studies the concept of embedded code generation with the MATLAB/Simulink coder targeting the TwinCAT Simulink system. The generated runtime modules are then tested and deployed to the TwinCAT 3 engineering environment. However, defining parameters as tunable in MATLAB/Simulink code generation targeting TwinCAT is not very straightforward. This paper focuses on this subject and reviews some of the techniques tested here to make parameters tunable in the generated runtime modules. Three techniques are examined for this purpose: normal tunable parameters, callback functions, and mask subsystems. Moreover, some test Simulink models are developed and used to evaluate the results of the proposed approaches. A brief summary of the study results follows. First of all, a parameter defined as tunable and used to set the values of other Simulink elements (e.g., the gain value of a gain block) can be changed after the code generation, and this update will affect the values of all elements defined in terms of the tunable parameter. For instance, if a parameter K=1 is defined as tunable in the code generation process and is used as the gain of a gain block in Simulink, the gain value of that block is equal to 1 in the TwinCAT environment after the code generation. But the value of K can be changed to a new value (e.g., K=2) in TwinCAT (without any new code generation in MATLAB), and the gain value of the gain block will then change to 2. Secondly, adding a callback function in the form of a “pre-load function,” “post-load function,” or “start function” will not help to make the parameters tunable without performing a new code generation.
This is because such MATLAB files are run before the code generation is performed, and the parameters defined or calculated in them are used as fixed values in the generated code. Thus, adding these files as callback functions to the Simulink model will not make these parameters flexible, since the MATLAB files are not attached to the generated code. Therefore, to change the parameters defined or calculated in these files, the code generation should be done again. However, adding these files as callback functions does force MATLAB to run them before the code generation, so there is no need to define the parameters mentioned in these files separately. Finally, using a tunable parameter to define or calculate the values of other parameters through a mask is an efficient method of changing the values of the latter parameters after the code generation. For instance, if a tunable parameter K is used in calculating the values of two other parameters, K1 and K2, and the value of K is updated in the TwinCAT environment after the code generation, the values of K1 and K2 will also be updated (without any new code generation).
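The core distinction the abstract draws, between a parameter baked into the generated code at generation time and one that stays tunable in the deployed module, can be sketched in language-agnostic terms. The following is a hypothetical Python analogy, not Simulink Coder or TwinCAT API code:

```python
# Hypothetical sketch: "code generation" that bakes a constant in, vs. one that
# keeps the parameter tunable. Not actual Simulink Coder / TwinCAT behaviour.

def generate_fixed_gain_block(k):
    """Bakes the current value of k into the generated function (like a parameter
    resolved at code-generation time): later changes to k have no effect."""
    def gain_block(x):
        return k * x  # k was captured once, at "generation" time
    return gain_block

class TunableParams:
    """Stands in for a runtime parameter store, i.e. values editable in the
    deployed environment without regenerating code."""
    def __init__(self, **values):
        self.values = values

def generate_tunable_gain_block(params, name):
    """The generated function reads the parameter on every call, so updating the
    store after "code generation" changes the block's behaviour."""
    def gain_block(x):
        return params.values[name] * x
    return gain_block

fixed = generate_fixed_gain_block(1)
params = TunableParams(K=1)
tunable = generate_tunable_gain_block(params, "K")

params.values["K"] = 2  # analogous to editing K in TwinCAT after generation
print(fixed(10), tunable(10))  # the fixed block still uses gain 1; the tunable one uses 2
```

The callback-function case in the abstract corresponds to the first generator: the parameter value is consumed before generation and frozen into the output.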

Keywords: code generation, MATLAB, tunable parameters, TwinCAT

Procedia PDF Downloads 204
2051 The Relationship between Life Event Stress, Depressive Thoughts, and Working Memory Capacity

Authors: Eid Abo Hamza, Ahmed Helal

Abstract:

Purpose: The objective is to measure the capacity of working memory, i.e., the maximum number of elements that can be retrieved and processed, by measuring the basic functions of working memory (inhibition/transfer/update), and to investigate its relationship to life stress and depressive thoughts. Methods: The study sample consisted of 50 students from Egypt. A cognitive task was designed to measure working memory capacity based on the determinants found in previous research, which showed that cognitive tasks are the best measures of the functions and capacity of working memory. Results: There were statistically significant differences between levels of life stress events (high/low) on the working memory capacity task. There were no statistically significant differences between males and females, or between academic majors, on the task. Furthermore, there was no statistically significant interaction effect of level of life stress (high/low) and gender (male/female) on the task. Finally, there were significant differences between levels of depressive thoughts (high/low) on the task. Conclusions: The current research concludes that neither the interaction of stressful life events, gender, and academic major, nor the interaction of depressive thoughts, gender, and academic major, influences working memory capacity.

Keywords: working memory, depression, stress, life event

Procedia PDF Downloads 132
2050 Allium Cepa Extract Provides Neuroprotection Against Ischemia Reperfusion Induced Cognitive Dysfunction and Brain Damage in Mice

Authors: Jaspal Rana (Alkem Laboratories, Baddi, Himachal Pradesh, India; Chitkara University, Punjab, India)

Abstract:

Oxidative stress has been identified as an underlying cause of ischemia-reperfusion (IR) related cognitive dysfunction and brain damage. Therefore, antioxidant-based therapies to treat IR injury are being investigated. Allium cepa L. (onion) is used as culinary medicine and is documented to have marked antioxidant effects. Hence, the present study was designed to evaluate the effect of A. cepa outer scale extract (ACE) against IR-induced cognitive and biochemical deficits in mice. ACE was prepared by maceration with 70% methanol and fractionated into ethyl acetate and aqueous fractions. Bilateral common carotid artery occlusion for 10 min followed by 24 h of reperfusion was used to induce cerebral IR injury. Following IR injury, ACE (100 and 200 mg/kg) was administered orally to the animals once daily for 7 days. Behavioral outcomes (memory and sensorimotor functions) were evaluated using the Morris water maze and a neurological severity score. Cerebral infarct size, brain thiobarbituric acid reactive species, reduced glutathione, and superoxide dismutase activity were also determined. Treatment with ACE significantly ameliorated the IR-mediated deterioration of memory and sensorimotor functions and the rise in brain oxidative stress in the animals. The results of the present investigation revealed that ACE improved functional outcomes after cerebral IR injury, which may be attributed to its antioxidant properties.

Keywords: stroke, neuroprotection, ischemia reperfusion, herbal drugs

Procedia PDF Downloads 84
2049 Implementation of Conceptual Real-Time Embedded Functional Design via Drive-By-Wire ECU Development

Authors: Ananchai Ukaew, Choopong Chauypen

Abstract:

Design concepts for a real-time embedded system can be realized initially by introducing novel design approaches. In this work, a model-based design approach and in-the-loop testing were employed early, in the conceptual and preliminary phase, to formulate design requirements and perform quick real-time verification. The design and analysis methodology includes simulation analysis, model-based testing, and in-the-loop testing. The design of a conceptual drive-by-wire (DBW) algorithm for an electronic control unit (ECU) is presented to demonstrate the conceptual design process, analysis, and functionality evaluation. The DBW ECU function concepts can be implemented in the vehicle system to improve the drivability of an electric vehicle (EV) conversion. However, within a new development process, conceptual ECU functions and parameters need to be evaluated. As a result, a testing system was employed to support the evaluation of the conceptual DBW ECU functions. In the current setup, the system components consisted of actual DBW ECU hardware, electric vehicle models, and the controller area network (CAN) protocol. The vehicle models and the CAN bus interface were both implemented as real-time applications, where the ECU and CAN protocol functionality were verified against the design requirements. The proposed system could potentially be beneficial for rapid real-time analysis of design parameters in conceptual system or software algorithm development.
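To illustrate the kind of message passing a DBW setup over CAN involves, a throttle command can be packed into and unpacked from a compact binary frame. This is a hypothetical sketch of a CAN-style frame layout (identifier, data length, payload), not the authors' ECU code or a real CAN driver API:

```python
import struct

def encode_throttle_frame(can_id, throttle_pct):
    """Pack a throttle command (0-100 %) into a minimal CAN-style frame:
    2-byte identifier + 1-byte data length code + data bytes. Illustrative only."""
    data = struct.pack(">H", int(throttle_pct * 100))  # percent scaled by 100
    return struct.pack(">HB", can_id, len(data)) + data

def decode_throttle_frame(frame):
    """Recover the identifier and the throttle percentage from a frame."""
    can_id, dlc = struct.unpack(">HB", frame[:3])
    (raw,) = struct.unpack(">H", frame[3:3 + dlc])
    return can_id, raw / 100.0

frame = encode_throttle_frame(0x120, 42.5)
print(decode_throttle_frame(frame))
```

In an in-the-loop test, the real ECU would produce such frames while a simulated vehicle model consumes them, which is the interaction the paper verifies against the design requirements.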

Keywords: drive-by-wire ECU, in-the-loop testing, model-based design, real-time embedded system

Procedia PDF Downloads 332
2048 Redefining “Minor”: An Empirical Research on Two Biennials in Contemporary China

Authors: Mengwei Li

Abstract:

Since the 1990s, biennials, large-scale transnational art exhibitions, have proliferated exponentially across the globe, particularly in Asia, Africa, and Latin America. This proliferation has spurred debates regarding the inclusion of "new art cultures" and the deconstruction of the mechanism of exclusion embedded in the Western monopoly on art. Hans Belting introduced the concept of "global art" in 2013 to denounce the West's privileged canons in art by emphasising the inclusion of art practices from allegedly non-Western regions. Arguably, the rise of new biennial networks developed by these locations has contributed to the asserted "inclusion of new art worlds." However, phrases such as "non-Western" and "beyond Euro-American" attached to these discussions raise the question of non- or beyond- in relation to whom. In this narrative, to become "integrated" and "equal" implies entry into the "core," a universal system in which preexisting authoritative voices define "newcomers" by what they are not. Possibly, if there is a global biennial system that symbolises a "universal language" of the contemporary art world, it is centred on the inherently dynamic yet asymmetrical interaction and negotiation between the "core" and the "periphery" of the rest of the world. Engaging with the theory of "minor literature" developed by Deleuze and Guattari, this research proposes an epistemological framework for comprehending the global biennial discourse since the 1990s. Using this framework, the research looks at two biennial models in China: the 13th Shanghai Biennale, organised in the country's metropolitan art centre, and the 2nd Yinchuan Biennale, inaugurated in a city that is geographically and economically marginalised compared to domestic centres.
By analysing how these two biennials from different locations in China positioned themselves and conveyed their local profiles through the universal language of the biennial, this research identifies a potential "minor" positionality within the global biennial discourse from China's perspective.

Keywords: biennials, China, contemporary, global art, minor literature

Procedia PDF Downloads 67
2047 Rethinking Riba in an Agency Theoretic Framework: Islamic Banking and Finance beyond Sophistry

Authors: Muhammad Arsalan

Abstract:

The efficiency of a financial intermediation system is assessed by its ability to achieve allocative efficiency, asset transformation, and subsequent economic development. Islamic Banking and Finance (IBF) was conceived to serve as an alternative financial intermediation system adherent to the injunctions of Islam. A critical appraisal of the state of contemporary IBF reveals that it neither fulfills the aspirations of Islamic rhetoric nor is efficient in terms of asset transformation and economic development. This paper is an intuitive pursuit to explore the economic rationale of the established principles of IBF and the reasons for the persistent divergence of an IBF accused of ruses and sophistry. Disentangling the varying viewpoints, the underdevelopment of IBF is attributed to the misinterpretation of Riba, which has been explicated through a narrow fiqhi and legally deterministic approach. The paper presents a critical account of how incorrect conceptualization of the key injunction on Riba steered the flawed institutionalization of an Islamic financial intermediation system. It also emphasizes the misinterpretation of the ontological and epistemological sources of Islamic law (primarily on Riba), which explains the perennial economic underdevelopment of the Muslim world. Deeming 'a collaborative and dynamic Ijtihad' the elixir, this paper insists on the exigency of redefining Riba, i.e., a definition that incorporates modern modes of economic cooperation and the contemporary financial intermediation ecosystem. Finally, Riba is articulated in an agency-theoretic framework to eschew the expropriation of wealth and assure the protection of property rights, aimed at realizing the twin goals of (a) Shari'ah adherence in true spirit, and (b) the financial and economic development of the Muslim world.

Keywords: agency theory, financial intermediation, Islamic banking and finance, ijtihad, economic development, Riba, information asymmetry

Procedia PDF Downloads 113
2046 A Rhetorical Approach to Julian the Emperor: A Consolation upon the Departure of the Excellent Sallust

Authors: Georgios Alexandropoulos

Abstract:

This study examines the rhetorical practice of "The Consolation to Himself upon the Departure of the Excellent Sallust," written by the emperor Flavius Claudius Julian. Its purpose is to describe the way Julian uses language so as to have favorable effects on the public through certain communicative and rhetorical functions.

Keywords: discourse analysis, Byzantine rhetoric

Procedia PDF Downloads 393
2045 Spatial Interpolation of Aerosol Optical Depth Pollution: Comparison of Methods for the Development of Aerosol Distribution

Authors: Sahabeh Safarpour, Khiruddin Abdullah, Hwee San Lim, Mohsen Dadras

Abstract:

Air pollution is a growing problem arising from domestic heating, the high density of vehicle traffic, electricity production, and expanding commercial and industrial activities, all increasing in parallel with the urban population. Monitoring and forecasting of air quality parameters are important because of their health impact. One widely available metric of aerosol abundance is the aerosol optical depth (AOD). The AOD is the integrated light extinction coefficient over a vertical atmospheric column of unit cross section; it represents the extent to which the aerosols in that vertical profile prevent the transmission of light by absorption or scattering. In the present study, seasonal AOD values at 550 nm derived from the Moderate Resolution Imaging Spectroradiometer (MODIS) sensor onboard NASA's Terra satellite, for the 10-year period 2000-2010, were used to test seven different spatial interpolation methods. The accuracy of the estimations was assessed through visual analysis as well as independent validation based on basic statistics, such as the root mean square error (RMSE) and the correlation coefficient. Based on the RMSE and R values of predictions made using measured values from 2000 to 2010, radial basis functions (RBFs) yielded the best results for spring, summer, and winter, and ordinary kriging yielded the best results for fall.
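The two validation statistics named above, RMSE and the correlation coefficient, can be sketched in a few lines. The values below are synthetic stand-ins, not the MODIS data:

```python
import math

def rmse(observed, predicted):
    """Root mean square error between observed and interpolated values."""
    return math.sqrt(sum((o - p) ** 2 for o, p in zip(observed, predicted)) / len(observed))

def pearson_r(xs, ys):
    """Pearson correlation coefficient, used here to validate interpolated values
    against held-out observations."""
    mx, my = sum(xs) / len(xs), sum(ys) / len(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Synthetic held-out AOD values vs. values predicted by an interpolator.
observed = [0.21, 0.35, 0.42, 0.30]
predicted = [0.23, 0.33, 0.40, 0.31]
print(rmse(observed, predicted), pearson_r(observed, predicted))
```

A lower RMSE and a higher R on held-out stations is the criterion by which the seven interpolators are ranked in the study.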

Keywords: aerosol optical depth, MODIS, spatial interpolation techniques, Radial Basis Functions

Procedia PDF Downloads 383
2044 Beyond Baudrillard: A Critical Intersection between Semiotics and Materialism

Authors: Francesco Piluso

Abstract:

Nowadays, restoring the deconstructive power of semiotics implies a critical analysis of neoliberal ideology and, even more critically, a confrontation with the materialist perspective. The theoretical path of Jean Baudrillard is crucial to understanding the ambivalence of this intersection. A semiotic critique of Baudrillard's work, through the tools of both structuralism and interpretative semiotics, aims to give materialism a new, consistent semiotic approach and vice versa. According to Baudrillard, the commodity form is characterized by the same abstract and systemic logic as the sign-form, in which the production of the signified (use-value) is a mere ideological means for the reproduction of the chain of signifiers (exchange-value). Nevertheless, this parallelism is broken by the author himself: if the use-value is deconstructed in its relative logic, the signified and the referent, both as discrete and positive elements, are collapsed onto the same plane in the shadow of the signified forms. These divergent considerations lead Baudrillard to the same crucial point: the dismissal of the material world, replaced by hyperreality as the reproduction of a semiotic (genetic) Code. The stress on the concept of form, as an epistemological and semiotic tool for analysing the construction of values in consumer society, has led to the Code as its ontological drift. In other words, Baudrillard seems to enclose consumer society (and reality) in this immanent and self-fetishized world of signs, an ideological perspective that mystifies the gravity of the material relationships between the Northern-Western world and the Third World. The notion of the Encyclopaedia in Umberto Eco is the key to overturning the relationship of immanence/transcendence between the Code and the political economy of the sign, by understanding the former as an ideological plane within the Encyclopaedia itself.
Therefore, rather than building semiotic (hyper)realities, semiotics has to deal with materialism in terms of material relationships of power which are mystified and reproduced through such ideological ontologies of signs.

Keywords: Baudrillard, Code, Eco, Encyclopaedia, epistemology vs. ontology, semiotics vs. materialism

Procedia PDF Downloads 136
2043 Development of a Roadmap for Assessing the Sustainability of Buildings in Saudi Arabia Using Building Information Modeling

Authors: Ibrahim A. Al-Sulaihi, Khalid S. Al-Gahtani, Abdullah M. Al-Sugair, Aref A. Abadel

Abstract:

Achieving environmental sustainability is one of the important issues considered in many countries' visions. "Green" or "sustainable" building is a widely used term for describing environmentally friendly construction. Applying sustainable practices is of significant importance in various fields, including the construction field, which consumes an enormous amount of resources and generates a considerable amount of waste. The need for sustainability is greater in regions that suffer from limited natural resources and extreme weather conditions, such as Saudi Arabia. Since building designs are becoming more sophisticated, the need for tools that support decision-making on sustainability issues is increasing, especially in the design and preconstruction stages. In this context, Building Information Modeling (BIM) can aid in performing complex building performance analyses to ensure an optimized sustainable building design. Accordingly, this paper introduces a roadmap towards developing a systematic approach to assessing the sustainability of buildings using BIM. The approach includes a set of main processes: identifying the sustainability parameters that can be used for sustainability assessment in Saudi Arabia, developing a sustainability assessment method that fits the special circumstances of the Kingdom, identifying the sustainability requirements and the BIM functions that can be used to satisfy these requirements, and integrating these requirements with the identified functions. As a result, a sustainability-BIM approach can be developed that helps designers assess sustainability and explore different design alternatives at an early stage of the construction project.

Keywords: green buildings, sustainability, BIM, rating systems, environment, Saudi Arabia

Procedia PDF Downloads 358
2042 An Attempt at the Multi-Criterion Classification of Small Towns

Authors: Jerzy Banski

Abstract:

The basic aim of this study is to discuss and assess different classifications of, and research approaches to, small towns that take their social and economic functions into account, as well as their relations with surrounding areas. The subject literature typically includes three types of approach to the classification of small towns: 1) the structural, 2) the location-related, and 3) the mixed. The structural approach allows towns to be grouped from the point of view of the social, cultural, and economic functions they discharge. The location-related approach draws on the idea of a continuum between the centre and the periphery. A mixed classification, making simultaneous use of the different research approaches, brings the most information to bear on the categories of urban locality. Bearing these approaches in mind, it is possible to propose a synthetic method for classifying small towns that takes account of economic structure, location, and the relationship between the towns and their surroundings. In the case of economic structure, the small centres may be divided into two basic groups: those featuring a multi-branch structure and those that are economically specialized. A second element of the classification reflects the location of the urban centre, with two basic types: the small town within the range of impact of a large agglomeration, or the town outside such areas, which is to say located peripherally. The third component of the classification arises out of small towns' relations with their surroundings. In consequence, it is possible to identify 8 types of small town: from local centres enjoying good accessibility and a multi-branch economic structure to peripheral supra-local centres characterised by a specialized economic structure.
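The eight types follow from crossing the three binary criteria (2 × 2 × 2 = 8), which can be enumerated directly; the labels below paraphrase the criteria in the abstract:

```python
from itertools import product

# The three binary classification criteria described in the abstract.
structures = ["multi-branch", "specialized"]
locations = ["within agglomeration range", "peripheral"]
relations = ["local centre", "supra-local centre"]

# Crossing the criteria yields the 2 x 2 x 2 = 8 small-town types.
types = list(product(structures, locations, relations))
for t in types:
    print(" / ".join(t))
print(len(types))
```

The first and last combinations correspond to the two extremes named at the end of the abstract: the accessible multi-branch local centre and the peripheral specialized supra-local centre.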

Keywords: small towns, classification, functional structure, localization

Procedia PDF Downloads 164
2041 Parameter Identification Analysis in the Design of Rock Fill Dams

Authors: G. Shahzadi, A. Soulaimani

Abstract:

This research work aims to identify the physical parameters of the constitutive soil model in the design of a rockfill dam by inverse analysis. The best parameters of the constitutive soil model are those that minimize the objective function, defined as the difference between the measured and numerical results. The finite element code Plaxis was utilized for the numerical simulation. Polynomial and neural network-based response surfaces were generated to analyze the relationship between soil parameters and displacements. The performance of the surrogate models was analyzed and compared by evaluating the root mean square error. A comparative study was done based on objective functions and optimization techniques. Objective functions were categorized by considering measured data with and without instrument uncertainty, defined by the least squares method, which estimates the norm between the predicted displacements and the measured values. Hydro-Québec provided the data sets of measured values for the Romaine-2 dam. Stochastic optimization, an approach that can overcome local minima and solve non-convex and non-differentiable problems with ease, was used to obtain an optimum value. The Genetic Algorithm (GA), Particle Swarm Optimization (PSO), and Differential Evolution (DE) were compared on the minimization problem; all these techniques take time to converge to an optimum value, but PSO provided the best convergence and the best soil parameters. Overall, parameter identification analysis can be used effectively for the rockfill dam application and has the potential to become a valuable tool for geotechnical engineers in assessing dam performance and dam safety.
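The inverse-analysis loop described above, a least-squares misfit minimized by a stochastic optimizer, can be sketched with a toy forward model and a minimal particle swarm. The displacement model, parameter names, and data below are hypothetical illustrations, not Plaxis output or the Romaine-2 measurements:

```python
import random

random.seed(0)

# Toy forward model standing in for the finite element simulation:
# displacement at each load level as a function of two soil parameters.
loads = [1.0, 2.0, 3.0]
def model(stiffness, offset):
    return [load / stiffness + offset for load in loads]

true_params = (2.0, 0.5)
measured = model(*true_params)  # synthetic "measured" displacements

def objective(params):
    """Least-squares misfit between measured and simulated displacements."""
    sim = model(*params)
    return sum((m - s) ** 2 for m, s in zip(measured, sim))

def pso(obj, bounds, n_particles=30, iters=200, w=0.7, c1=1.5, c2=1.5):
    """Minimal particle swarm optimization over box bounds."""
    dim = len(bounds)
    pos = [[random.uniform(lo, hi) for lo, hi in bounds] for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]
    pbest_val = [obj(p) for p in pos]
    g = min(range(n_particles), key=lambda i: pbest_val[i])
    gbest, gbest_val = pbest[g][:], pbest_val[g]
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                r1, r2 = random.random(), random.random()
                vel[i][d] = (w * vel[i][d]
                             + c1 * r1 * (pbest[i][d] - pos[i][d])
                             + c2 * r2 * (gbest[d] - pos[i][d]))
                pos[i][d] = min(max(pos[i][d] + vel[i][d], bounds[d][0]), bounds[d][1])
            val = obj(pos[i])
            if val < pbest_val[i]:
                pbest[i], pbest_val[i] = pos[i][:], val
                if val < gbest_val:
                    gbest, gbest_val = pos[i][:], val
    return gbest, gbest_val

best, best_val = pso(objective, bounds=[(0.1, 10.0), (0.0, 1.0)])
print(best, best_val)
```

In the study, the expensive Plaxis run would sit where `model` does, or be replaced by the polynomial or neural-network response surface to keep each objective evaluation cheap.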

Keywords: rockfill dam, parameter identification, stochastic analysis, regression, PLAXIS

Procedia PDF Downloads 117
2040 Approximation by Generalized Lupaş-Durrmeyer Operators with Two Parameter α and β

Authors: Preeti Sharma

Abstract:

This paper deals with a Stancu-type generalization of the Lupaş-Durrmeyer operators. We establish some direct results in the polynomial weighted space of continuous functions defined on the interval [0, 1]. A Voronovskaja-type theorem is also studied.

Keywords: Lupaş-Durrmeyer operators, Pólya distribution, weighted approximation, rate of convergence, modulus of continuity

Procedia PDF Downloads 321
2039 Household Wealth and Portfolio Choice When Tail Events Are Salient

Authors: Carlson Murray, Ali Lazrak

Abstract:

Robust experimental evidence of systematic violations of expected utility (EU) establishes that individuals facing risk overweight utility from low-probability gains and losses when making choices. These findings motivated the development of models of preferences with probability weighting functions, such as rank-dependent utility (RDU). We solve for the optimal investing strategy of an RDU investor in a dynamic binomial setting, from which we derive implications for investing behavior. We show that, relative to EU investors with constant relative risk aversion, commonly measured probability weighting functions produce optimal RDU terminal wealth with significant downside protection and upside exposure. We additionally find that, in contrast to EU investors, RDU investors optimally choose a portfolio containing fair bets whose payoffs can be interpreted as lottery outcomes or as exposure to idiosyncratic returns. In a calibrated version of the model, we calculate that RDU investors would be willing to pay 5% of their initial wealth for the freedom to trade away from an optimal EU wealth allocation. The dynamic trading strategy that supports the optimal wealth allocation implies portfolio weights that are independent of initial wealth but requires a higher risky share after good stock return histories. Optimal trading also implies the possibility of non-participation when historical returns are poor. Our model fills a gap in the literature by providing new quantitative and qualitative predictions that can be tested experimentally or using data on household wealth and portfolio choice.
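The abstract does not specify which probability weighting function is calibrated; as an illustration of the overweighting of small probabilities that drives such results, the one-parameter form proposed by Tversky and Kahneman (1992) can be sketched as follows:

```python
def tk_weight(p, gamma=0.65):
    """Tversky-Kahneman (1992) probability weighting function.
    For gamma < 1 it overweights small probabilities and underweights large ones.
    gamma = 0.65 is an illustrative value, not the paper's calibration."""
    return p ** gamma / (p ** gamma + (1 - p) ** gamma) ** (1 / gamma)

for p in (0.01, 0.10, 0.50, 0.90, 0.99):
    print(p, round(tk_weight(p), 3))
```

Because w(p) > p for small p, lottery-like payoffs and downside protection both become attractive to an RDU investor, which is the mechanism behind the portfolio tilts described above.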

Keywords: behavioral finance, probability weighting, portfolio choice

Procedia PDF Downloads 401
2038 Stochastic Optimization of a Vendor-Managed Inventory Problem in a Two-Echelon Supply Chain

Authors: Bita Payami-Shabestari, Dariush Eslami

Abstract:

The purpose of this paper is to develop a multi-product economic production quantity model under a vendor-managed inventory policy with restrictions including limited warehouse space, budget, number of orders, average shortage time, and maximum permissible shortage. Since the costs cannot be predicted with certainty, the data are assumed to behave under an uncertain environment. The problem is first formulated as a bi-objective multi-product economic production quantity model. It is then solved with three multi-objective decision-making (MODM) methods, which are compared on the optimal values of the two objective functions and on central processing unit (CPU) time, using statistical analysis and multi-attribute decision-making (MADM). The results demonstrate that the augmented ε-constraint method performs better than global criteria and goal programming in terms of both the optimal values of the two objective functions and CPU time. Sensitivity analysis is done to illustrate the effect of parameter variations on the optimal solution. The contribution of this research is the use of random cost data in developing a multi-product economic production quantity model under a vendor-managed inventory policy with several constraints.
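The ε-constraint idea behind the best-performing method can be sketched on a toy one-variable lot-sizing trade-off; the cost and shortage functions below are invented for illustration and are not the paper's model. The augmented variant additionally rewards slack in the constrained objective, a refinement omitted here.

```python
import numpy as np

# Hypothetical EPQ-style trade-off: larger lot sizes raise holding cost
# (objective 1) but reduce expected shortage (objective 2).
def f_cost(q):      # setup + holding cost, illustrative form
    return 500.0 / q + 0.4 * q

def f_shortage(q):  # expected shortage, decreasing in lot size q
    return 80.0 / q

def eps_constraint(epsilons, q_grid):
    """Basic epsilon-constraint: minimise f_cost subject to
    f_shortage <= eps, sweeping eps over a discrete grid of q."""
    front = []
    for eps in epsilons:
        feasible = [q for q in q_grid if f_shortage(q) <= eps]
        if feasible:
            q_best = min(feasible, key=f_cost)
            front.append((eps, q_best, f_cost(q_best)))
    return front

# Relaxing the shortage bound traces out the Pareto front.
front = eps_constraint([0.5, 1.0, 2.0, 4.0], np.arange(10, 201, 1.0))
```

Each triple in `front` records the shortage bound, the minimising lot size, and its cost; as the bound relaxes, the attainable cost falls, which is exactly the trade-off a MODM comparison evaluates.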

Keywords: economic production quantity, random cost, supply chain management, vendor-managed inventory

Procedia PDF Downloads 104
2037 Chronic Cognitive Impacts of Mild Traumatic Brain Injury during Aging

Authors: Camille Charlebois-Plante, Marie-Ève Bourassa, Gaelle Dumel, Meriem Sabir, Louis De Beaumont

Abstract:

To the extent of our knowledge, there has been little interest in the chronic effects of mild traumatic brain injury (mTBI) on cognition during normal aging. This is rather surprising considering the impacts on daily and social functioning. In addition, sustaining an mTBI during late adulthood may compound the effects of normal biological aging in individuals who consider themselves normal and healthy. The objective of this study was to characterize the persistent neuropsychological repercussions of mTBI sustained during late adulthood, on average 12 months prior to testing. To this end, 35 mTBI patients and 42 controls between the ages of 50 and 69 completed an exhaustive neuropsychological assessment lasting three hours. All mTBI patients were asymptomatic, and all participants had a score ≥ 27 on the MoCA. The evaluation consisted of 20 standardized neuropsychological tests measuring memory, attention, executive and language functions, as well as information processing speed. Performance on tests of visual memory (Brief Visuospatial Memory Test Revised), verbal memory (Rey Auditory Verbal Learning Test and WMS-IV Logical Memory subtest), lexical access (Boston Naming Test), and response inhibition (Stroop) was significantly lower in the mTBI group. These findings suggest that an mTBI sustained during late adulthood induces lasting effects on cognitive function. Episodic memory and executive functions seem to be particularly vulnerable to enduring mTBI effects.

Keywords: cognitive function, late adulthood, mild traumatic brain injury, neuropsychology

Procedia PDF Downloads 149
2036 Enhancement of Mass Transport and Separations of Species in an Electroosmotic Flow by Distinct Oscillatory Signals

Authors: Carlos Teodoro, Oscar Bautista

Abstract:

In this work, we analyze theoretically the mass transport in a time-periodic electroosmotic flow through a parallel flat plate microchannel under different periodic functions of the applied external electric field. The microchannel connects two reservoirs having different constant concentrations of an electro-neutral solute, and the zeta potential of the microchannel walls is assumed to be uniform. The governing equations that determine the mass transport in the microchannel are the Poisson-Boltzmann equation, the modified Navier-Stokes equations, where the Debye-Hückel approximation is considered (the zeta potential is less than 25 mV), and the species conservation equation. These equations are nondimensionalized, and four dimensionless parameters appear that control the mass transport phenomenon: an angular Reynolds number, the Schmidt and Péclet numbers, and an electrokinetic parameter representing the ratio of the half-height of the microchannel to the Debye length. To solve the mathematical model, the electric potential is first determined from the Poisson-Boltzmann equation, which allows determining the electric force for various periodic functions of the external electric field expressed as Fourier series. In particular, three different excitation waveforms of the external electric field are assumed: a) sawtooth, b) step, and c) a periodic irregular function. The periodic electric forces are substituted into the modified Navier-Stokes equations, and the hydrodynamic field is derived for each case of the electric force. From the obtained velocity fields, the species conservation equation is solved and the concentration fields are found. Numerical calculations were done by considering several binary systems where two dilute species are transported in the presence of a carrier.
It is observed that there are different angular frequencies of the imposed external electric signal where the total mass transport of each species is the same, independently of the molecular diffusion coefficient. These frequencies are called crossover frequencies and are obtained graphically at the intersection when the total mass transport is plotted against the imposed frequency. The crossover frequencies are different depending on the Schmidt number, the electrokinetic parameter, the angular Reynolds number, and on the type of signal of the external electric field. It is demonstrated that the mass transport through the microchannel is strongly dependent on the modulation frequency of the applied particular alternating electric field. Possible extensions of the analysis to more complicated pulsation profiles are also outlined.
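The Fourier-series representation of the periodic forcing can be illustrated with the sawtooth case; the unit amplitude and term count below are assumptions of this sketch, not values from the paper.

```python
import numpy as np

def sawtooth_fourier(t, omega, n_terms=50):
    """Partial Fourier sum of a unit-amplitude sawtooth wave:
    sum_{n>=1} (2/(n*pi)) * (-1)^(n+1) * sin(n*omega*t),
    which converges to t*omega/pi on one period."""
    n = np.arange(1, n_terms + 1)[:, None]
    coeffs = 2.0 / (n * np.pi) * (-1.0) ** (n + 1)
    return np.sum(coeffs * np.sin(n * omega * t[None, :]), axis=0)

# Reconstruct the sawtooth over one period; each retained harmonic
# would drive its own oscillatory electroosmotic velocity mode.
t = np.linspace(-np.pi, np.pi, 1001)
approx = sawtooth_fourier(t, omega=1.0, n_terms=200)
```

Because the governing equations are linear under the Debye-Hückel approximation, the hydrodynamic response to such a signal is the superposition of the responses to each harmonic, which is why the waveform shape changes the crossover frequencies.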

Keywords: electroosmotic flow, mass transport, oscillatory flow, species separation

Procedia PDF Downloads 197
2035 Identification Algorithm of Critical Interface, Modelling Perils on Critical Infrastructure Subjects

Authors: Jiří J. Urbánek, Hana Malachová, Josef Krahulec, Jitka Johanidisová

Abstract:

The paper deals with the investigation and modelling of crisis situations within critical infrastructure organizations, especially in the energy sector. Every crisis situation originates in the occurrence of an emergency event. Such events may be expected, in which case crisis scenarios can be pre-prepared by the relevant organizational crisis management authorities, or unexpected (the Black Swan effect), with no pre-prepared scenario, requiring operational handling of the crisis situation. The forms, characteristics, behaviour, and use of crisis scenarios vary in quality, depending on the prevention and training processes of the particular critical infrastructure organization; the aim is always better organizational security and continuity. The objective of this paper is to identify and investigate critical/crisis zones and functions in models of critical situations in critical infrastructure organizations. The DYVELOP (Dynamic Vector Logistics of Processes) method can identify problematic critical zones and functions, displaying critical interfaces among the actors of crisis situations on DYVELOP maps called Blazons. To realize this capability, it is first necessary to derive and create an identification algorithm for critical interfaces. The locations of critical interfaces flag crisis situations in a real critical infrastructure organization. Finally, the model of the critical interface is demonstrated on a real Czech energy critical infrastructure subject in a blackout peril environment. The Blazons are best comprehended through a live PowerPoint presentation.

Keywords: algorithm, crisis, DYVELOP, infrastructure

Procedia PDF Downloads 387
2034 Forecasting Market Share of Electric Vehicles in Taiwan Using Conjoint Models and Monte Carlo Simulation

Authors: Li-hsing Shih, Wei-Jen Hsu

Abstract:

Recently, sales of electric vehicles (EVs) have increased dramatically due to maturing technology and decreasing cost. Governments of many countries have made regulations and policies in favor of EVs as part of their long-term commitment to net-zero carbon emissions. However, due to uncertain factors such as the future price of EVs, forecasting the future market share of EVs is a challenging subject for both the auto industry and local government. This study forecasts the market share of EVs using conjoint models and Monte Carlo simulation. The research is conducted in three phases. (1) A conjoint model is established to represent the customer preference structure on purchasing vehicles, with five product attributes selected for both EVs and internal combustion engine vehicles (ICEVs). A questionnaire survey is conducted to collect responses from Taiwanese consumers and estimate the part-worth utility functions of all respondents. The resulting part-worth utility functions can be used to estimate the market share, assuming each respondent will purchase the product with the highest total utility. For example, given the attribute values of an ICEV and a competing EV, the two total utilities for a respondent are calculated, determining his or her choice. Once the choices of all respondents are known, an estimate of market share can be obtained. (2) Among the attributes, future price is the key attribute that dominates consumers' choice. This study adopts a learning-curve assumption to predict the future price of EVs. Based on the learning curve method and past price data of EVs, a regression model is established and the probability distribution function of the price of EVs in 2030 is obtained. (3) Since the future price is a random variable from the results of phase 2, a Monte Carlo simulation is then conducted to simulate the choices of all respondents by using their part-worth utility functions.
For instance, using one thousand generated future prices of an EV together with other forecasted attribute values of the EV and an ICEV, one thousand market shares can be obtained with a Monte Carlo simulation. The resulting probability distribution of the market share of EVs provides more information than a fixed number forecast, reflecting the uncertain nature of the future development of EVs. The research results can help the auto industry and local government make more appropriate decisions and future action plans.
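The choice-simulation step can be sketched as follows; the part-worth distributions, the lognormal price forecast, and all numbers are hypothetical stand-ins for the survey estimates and learning-curve regression described above.

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical part-worth utilities per respondent for two attributes:
# a price coefficient (negative) and an intrinsic EV preference.
n_resp = 1000
beta_price = rng.uniform(-2.0, -0.5, n_resp)   # utility per price unit
beta_ev = rng.normal(0.5, 1.0, n_resp)         # EV-specific constant

icev_price = 1.0                                # normalised ICEV price

def market_share(ev_price):
    """Share of respondents whose total utility is higher for the EV."""
    u_ev = beta_ev + beta_price * ev_price
    u_icev = beta_price * icev_price
    return np.mean(u_ev > u_icev)

# Monte Carlo over uncertain 2030 EV prices (stylised here as a
# lognormal distribution around price parity with ICEVs).
ev_prices = rng.lognormal(mean=0.0, sigma=0.15, size=1000)
shares = np.array([market_share(p) for p in ev_prices])
```

The array `shares` is a distribution of market-share outcomes rather than a single point forecast, which is the informational gain the abstract attributes to the Monte Carlo step.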

Keywords: conjoint model, electrical vehicle, learning curve, Monte Carlo simulation

Procedia PDF Downloads 48
2033 Nonparametric Truncated Spline Regression Model on the Data of Human Development Index in Indonesia

Authors: Kornelius Ronald Demu, Dewi Retno Sari Saputro, Purnami Widyaningsih

Abstract:

The Human Development Index (HDI) is a standard measurement of a country's human development. Several factors may influence it, such as life expectancy, gross domestic product (GDP) based on the province's annual expenditure, the number of poor people, and the percentage of illiterate people. The scatter plot between HDI and these factors shows that the data do not follow a specific pattern or form. Therefore, the HDI data in Indonesia can be modeled with a nonparametric regression model. The estimation of the regression curve in the nonparametric regression model is flexible because it follows the shape of the data pattern. One nonparametric regression method is the truncated spline. Truncated spline regression is a nonparametric approach based on a modification of segmented polynomial functions. The estimator of a truncated spline regression model depends on the selection of optimal knot points, the join points of the truncated spline functions. The optimal knot points were determined by the minimum value of generalized cross validation (GCV). In this article, a truncated spline nonparametric regression model was applied to the Human Development Index data. The results of this research yielded the best truncated spline regression model for the HDI data in Indonesia, with the optimal knot-point combination 5-5-5-4. Life expectancy and the percentage of illiterate people were the factors significantly related to the HDI in Indonesia. The coefficient of determination is 94.54%, which means the regression model fits the HDI data in Indonesia well.
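A minimal sketch of a truncated spline fit scored by GCV, assuming a linear truncated power basis and a toy dataset; the paper's HDI data and its 5-5-5-4 knot combination are not reproduced here.

```python
import numpy as np

def truncated_basis(x, knots, degree=1):
    """Truncated power basis: polynomial terms plus (x - k)_+^degree."""
    cols = [x**d for d in range(degree + 1)]
    cols += [np.maximum(x - k, 0.0) ** degree for k in knots]
    return np.column_stack(cols)

def fit_gcv(x, y, knots, degree=1):
    """Least-squares truncated spline fit and its GCV score."""
    X = truncated_basis(x, knots, degree)
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    yhat = X @ beta
    n, p = X.shape
    rss = np.sum((y - yhat) ** 2)
    gcv = n * rss / (n - p) ** 2       # trace of the hat matrix is p here
    return beta, gcv

# Toy data with a kink at x = 0.5: a knot placed there should give a
# lower GCV than a knot placed away from the kink.
x = np.linspace(0, 1, 60)
y = np.where(x < 0.5, x, 1.0 - x) + 0.01 * np.sin(37 * x)
_, gcv_good = fit_gcv(x, y, knots=[0.5])
_, gcv_bad = fit_gcv(x, y, knots=[0.9])
```

Sweeping candidate knot locations and keeping the combination with minimum GCV is the selection rule the abstract describes.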

Keywords: generalized cross validation (GCV), Human Development Index (HDI), knots point, nonparametric regression, truncated spline

Procedia PDF Downloads 310
2032 A Study of Algebraic Structure Involving Banach Space through Q-Analogue

Authors: Abdul Hakim Khan

Abstract:

The aim of the present paper is to study the Banach space and the combinatorial algebraic structure of R. It further aims to study the algebraic structure of the set of all q-extensions of classical formulas and functions for 0 < q < 1.

Keywords: integral functions, q-extensions, q-numbers of metric space, algebraic structure of R, Banach space

Procedia PDF Downloads 553
2031 Integration of STEM Education in Quebec, Canada – Challenges and Opportunities

Authors: B. El Fadil, R. Najar

Abstract:

STEM education is promoted by many scholars and curricula around the world, but it is not yet well established in the province of Quebec in Canada. In addition, effective instructional STEM activities and design methods are required to ensure that students and teachers' needs are being met. One potential method is the Engineering Design Process (EDP), a methodology that emphasizes the importance of creativity and collaboration in problem-solving strategies. This article reports on a case study that focused on using the EDP to develop instructional materials by means of making a technological artifact to teach mathematical variables and functions at the secondary level. The five iterative stages of the EDP (design, make, test, infer, and iterate) were integrated into the development of the course materials. Data was collected from different sources: pre- and post-questionnaires, as well as a working document dealing with pupils' understanding based on designing, making, testing, and simulating. Twenty-four grade seven (13 years old) students in Northern Quebec participated in the study. The findings of this study indicate that STEM activities have a positive impact not only on students' engagement in classroom activities but also on learning new mathematical concepts. Furthermore, STEM-focused activities have a significant effect on problem-solving skills development in an interdisciplinary approach. Based on the study's results, we can conclude, inter alia, that teachers should integrate STEM activities into their teaching practices to increase learning outcomes and attach more importance to STEM-focused activities to develop students' reflective thinking and hands-on skills.

Keywords: engineering design process, motivation, STEM, integration, variables, functions

Procedia PDF Downloads 70
2030 [Keynote Talk]: Applying p-Balanced Energy Technique to Solve Liouville-Type Problems in Calculus

Authors: Lina Wu, Ye Li, Jia Liu

Abstract:

We are interested in solving Liouville-type problems to explore constancy properties for maps or differential forms on Riemannian manifolds. Geometric structures on manifolds, the existence of constancy properties for maps or differential forms, and energy growth for maps or differential forms are intertwined. In this article, we concentrate on the discovery of solutions to Liouville-type problems where the manifolds are Euclidean spaces (i.e., flat Riemannian manifolds) and the maps become real-valued functions. Liouville-type results on vanishing properties for functions are obtained. The original contribution of our research is to extend the q-energy for a function from finite, in the Lq space, to infinite, in the non-Lq space, by applying the p-balanced technique where q = p = 2. Calculation tools such as Hölder's inequality and convergence tests for series have been used to evaluate limits and integrals for function energy. The calculation ideas and computational techniques for solving Liouville-type problems shown in this article, which are applied in Euclidean spaces, can be generalized into an algorithm that works for both maps and differential forms on Riemannian manifolds. This algorithm has a far-reaching impact on the research of solving Liouville-type problems in general settings involving infinite energy. The p-balanced technique in this algorithm provides a clue to success on the road of q-energy extension from finite to infinite.
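For orientation, the q-energy referred to above is conventionally defined as follows; the paper's precise setting and growth conditions may differ from this flat-space sketch.

```latex
% q-energy of a smooth function u on \mathbb{R}^n (the flat case
% treated in the paper); u has finite q-energy iff E_q(u) < \infty.
E_q(u) \;=\; \int_{\mathbb{R}^n} \lvert \nabla u \rvert^{\,q}\, dv

% Prototype Liouville-type statement in the finite-energy case
% (q = p = 2): a harmonic function u on \mathbb{R}^n with
% E_2(u) < \infty is constant.
```

The extension the authors pursue relaxes the finiteness of E_q(u), replacing it with a controlled (p-balanced) growth rate under which the vanishing conclusion still holds.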

Keywords: differential forms, Hölder inequality, Liouville-type problems, p-balanced growth, p-harmonic maps, q-energy growth, tests for series

Procedia PDF Downloads 213
2029 The Soundscape of Contemporary Buddhist Music in Taiwan: Tzu Chi Vesak Ceremony

Authors: Sylvia Huang

Abstract:

Contemporary Buddhist music has emerged in the new forms of large-scale public Buddhist ritual ceremonies that may involve up to 10,000 participants at a time. Since 2007, the Buddha's Birthday ceremony (Sanskrit: Vesak) of the Buddhist Tzu Chi Foundation has been held in major cities in Taiwan and at many affiliated Tzu Chi offices around the world. Analysis of this modern and technologically-dependent ceremony sheds new light on the significance of music in contemporary Buddhist ritual, and also on recently enhanced and increasingly intimate connections between music and Buddhism. Drawing on ten years of ethnographic research (2007-2017), this study explores how the form of contemporary Buddhist music relates to the role of music in participants' experience of the ritual and the way in which they construct meaning. The theoretical approach draws on both ethnomusicology and the Buddhist teachings, the Dharma. As a soundscape is defined as the entire sonic energy produced by a landscape, the concept of the soundscape is utilised to examine the contemporary ritual music of the Tzu Chi Vesak ceremony. The analysis opens new territory in exploring how the analysis of Buddhist music can benefit from incorporating Buddhist philosophy within the methodological approach. The main findings are: 1) music becomes a method for Buddhist understanding, through a focus in particular on how the ceremonial program is carried by music, and 2) participants engage with each other and entrain with the music in the Vesak ceremony. As Buddhist sounding, such as scripture reading, liturgical chanting, and ceremonial music singing, is a sonic epistemological knowing of the conditions in which Buddhism is practiced, experienced, and transmitted, the research concludes by showing that studies of Buddhist music have the potential to reveal distinctively Buddhist concepts, meanings, and values. Certain principles of Buddhist philosophy are adopted within the ethnomusicological analysis to further enhance understandings of the crucial function of music within such a ritual context. Finally, the contemporary Buddhist music performance in the ceremony serves as a means of direct access to spiritual experience in Buddhism.

Keywords: Buddhist music, Taiwan, soundscape, Vesak ceremony

Procedia PDF Downloads 114
2028 R Statistical Software Applied in Reliability Analysis: Case Study of Diesel Generator Fans

Authors: Jelena Vucicevic

Abstract:

Reliability analysis represents a very important task in different areas of work. In any industry, it is crucial for maintenance, efficiency, safety, and monetary costs. There are established ways to calculate reliability, unreliability, failure density, and failure rate. This paper introduces another way of calculating reliability by using the R statistical software. R is a free software environment for statistical computing and graphics. It compiles and runs on a wide variety of UNIX platforms, Windows, and macOS. The R programming environment is a widely used open-source system for statistical analysis and statistical programming. It includes thousands of functions for the implementation of both standard and new statistical methods, and it does not limit the user to these built-in functions. The program has many benefits over similar programs: it is free and, as an open-source project, constantly updated; it has a built-in help system; and the R language is easy to extend with user-written functions. The significance of the work is the calculation of time to failure, or reliability, in a new way, using statistics. Another advantage of this approach is that no deep technical details are needed; it can be applied to any part for which we need to know the time to failure, in order to schedule appropriate maintenance, maximize usage, and minimize costs. In this case, calculations have been made on diesel generator fans, but the same principle can be applied to any other part. The data for this paper came from a field engineering study of the time to failure of diesel generator fans. The ultimate goal was to decide whether or not to replace the working fans with a higher-quality fan to prevent future failures. Seventy generators were studied. For each one, the number of hours of running time from its first being put into service until fan failure, or until the end of the study (whichever came first), was recorded. The dataset consists of two variables: hours and status. Hours records the running time of each fan, and status records the event: 1 for failed, 0 for censored. Censored data represent units that could not be followed to the end, so they may have failed or survived afterwards. Obtaining the result using R was easy and quick, and the program takes censored data into consideration and includes them in the results, which is not so easy in a hand calculation. For the purpose of the paper, results from the R program have been compared to hand calculations in two cases: censored data treated as failures, and censored data treated as successes. The three results differ significantly. For further calculations on censored data, R gives more precise results than hand calculation.
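The effect of censoring can be illustrated with a small sketch; the hours below are invented, not the study's seventy-fan dataset, and an exponential lifetime model is assumed for simplicity.

```python
# Hypothetical running-hours data in the spirit of the fan study:
# status 1 = fan failed, 0 = right-censored (still running at study end).
hours  = [450, 1150, 3000, 4600, 6100, 7800, 8750, 9400, 11500, 12000]
status = [1,   1,    0,    1,    0,    1,    0,    0,    1,     0]

failures = sum(status)
total_time = sum(hours)

# Exponential MLE with right censoring: every unit contributes its time
# on test to the denominator, but only failures count in the numerator.
mttf_censored = total_time / failures

# A naive "hand" estimate that treats every censored unit as a failure
# understates the mean time to failure.
mttf_naive = total_time / len(hours)
```

Survival-analysis routines (such as those in R packages) apply exactly this kind of censoring-aware accounting automatically, which is why the R results differ from both hand-calculation variants.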

Keywords: censored data, R statistical software, reliability analysis, time to failure

Procedia PDF Downloads 379
2027 Hydrological Modelling of Geological Behaviours in Environmental Planning for Urban Areas

Authors: Sheetal Sharma

Abstract:

Runoff, decreasing water levels, and recharge in urban areas have become a complex issue nowadays, pointing to defective urban design and increasing demography as causes. Very little has been discussed or analysed regarding water-sensitive urban master plans or local area plans. Land use planning deals with land transformation from natural areas into developed ones, which leads to changes in the natural environment. Detailed knowledge of the relationship between existing patterns of land use-land cover and recharge, with respect to the prevailing soil below, lags behind the speed of development. The parameters of incompatibility between urban functions and the functions of the natural environment are multiplying. Changes in land patterns due to built-up areas, pavements, roads, and similar land cover seriously affect surface water flow. They also change the permeability and absorption characteristics of the soil. Urban planners need to know the natural processes along with modern means and the best technologies available, as there is a huge gap between basic knowledge of natural processes and its application in balanced development planning with minimum impact on water recharge. The present paper analyzes the variations in land use-land cover and their impacts on surface flows and sub-surface recharge in the study area. The methodology adopted was to analyse the changes in land use and land cover using GIS and Civil 3D AutoCAD. The variations were then used in computer modeling with the Storm Water Management Model to find the runoff for various soil groups and the resulting recharge, observing water levels in POW data for the last 40 years in the study area. The results were analyzed again to find the best correlations for sustainable recharge in urban areas.
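The runoff/land-cover link discussed above can be expressed compactly with the SCS curve-number relation, one of the infiltration options in SWMM-style analyses; the curve numbers below are illustrative textbook values, not the study's calibrated parameters.

```python
def scs_runoff_depth(p_mm, cn):
    """SCS curve-number direct runoff depth (mm, SI form) for a storm of
    depth p_mm on a surface with curve number cn (higher cn means less
    pervious cover and therefore more runoff)."""
    s = 25400.0 / cn - 254.0          # potential maximum retention, mm
    ia = 0.2 * s                      # initial abstraction
    if p_mm <= ia:
        return 0.0
    return (p_mm - ia) ** 2 / (p_mm - ia + s)

# Urbanisation effect: the same 50 mm storm on pasture (cn ~ 61,
# hydrologic soil group B) versus paved/built-up cover (cn ~ 92).
runoff_pasture = scs_runoff_depth(50.0, 61)
runoff_paved = scs_runoff_depth(50.0, 92)
```

The order-of-magnitude jump in runoff depth when pervious cover is paved over is precisely the mechanism behind falling recharge in densifying urban areas.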

Keywords: geology, runoff, urban planning, land use-land cover

Procedia PDF Downloads 290
2026 Life Prediction Method of Lithium-Ion Battery Based on Grey Support Vector Machines

Authors: Xiaogang Li, Jieqiong Miao

Abstract:

To address the low prediction accuracy of the grey forecasting model, an improved grey prediction model is put forward. First, a trigonometric function transform is applied to the original data sequence to improve its smoothness; this model is called SGM (smoothed grey prediction model). The improved grey model is then combined with a support vector machine, giving the grey support vector machine model (SGM-SVM). Before establishing the model, the data are preprocessed with trigonometric functions and the accumulated generating operation to enhance smoothness and weaken randomness; a support vector machine (SVM) prediction model is then built on the preprocessed data, with model parameters selected by a genetic algorithm to obtain the global optimum. Finally, the forecast data are recovered through the inverse ("regressive generate") operation. To show that the SGM-SVM model is superior to the alternatives, battery life data from CALCE were selected. The presented model is used to predict battery life, and its prediction is compared with those of the grey model and the support vector machine. For a more intuitive comparison of the three models, the paper reports the root mean square error of each. The results show that the grey support vector machine (SGM-SVM) predicts life best, with a root mean square error of only 3.18%.
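For context, the baseline GM(1,1) grey model that the SGM-SVM pipeline builds on can be sketched as follows; this is the plain model, without the trigonometric smoothing or SVM stages described above.

```python
import numpy as np

def gm11_forecast(x0, n_ahead=1):
    """Basic GM(1,1) grey forecast: accumulate the series (AGO), fit the
    whitened equation by least squares, predict, then difference back
    (inverse AGO, the 'regressive generate' operation)."""
    x0 = np.asarray(x0, dtype=float)
    x1 = np.cumsum(x0)                          # accumulated series
    z = 0.5 * (x1[1:] + x1[:-1])                # background values
    B = np.column_stack([-z, np.ones_like(z)])
    a, b = np.linalg.lstsq(B, x0[1:], rcond=None)[0]
    k = np.arange(len(x0) + n_ahead)
    x1_hat = (x0[0] - b / a) * np.exp(-a * k) + b / a
    x0_hat = np.concatenate([[x0[0]], np.diff(x1_hat)])
    return x0_hat
```

GM(1,1) tracks near-exponential decay well, which suits capacity-fade data; the paper's improvement targets the cases where the raw sequence is too rough for this baseline.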

Keywords: grey prediction model, trigonometric functions, support vector machines, genetic algorithms, root mean square error

Procedia PDF Downloads 436