Search results for: Jeffrey’s measure
3318 Effect of Homogeneous and Heterogeneous Chemical Reactions on Peristaltic Flow of a Jeffrey Fluid in an Asymmetric Channel
Authors: G. Ravi Kiran, G. Radhakrishnamacharya
Abstract:
In this paper, the dispersion of a solute in the peristaltic flow of a Jeffrey fluid in the presence of both homogeneous and heterogeneous chemical reactions has been discussed. The average effective dispersion coefficient has been found using Taylor's limiting condition under the long wavelength approximation. It is observed that the average dispersion coefficient increases with amplitude ratio, which implies that dispersion is greater in the presence of peristalsis. The average effective dispersion coefficient increases with the Jeffrey parameter in the cases of both homogeneous and combined homogeneous and heterogeneous chemical reactions. Further, dispersion decreases with phase difference, the homogeneous reaction rate parameter, and the heterogeneous reaction rate parameter.
Keywords: peristalsis, dispersion, chemical reaction, Jeffrey fluid, asymmetric channel
Procedia PDF Downloads 587
3317 The Non-Linear Analysis of Brain Response to Visual Stimuli
Authors: H. Namazi, H. T. N. Kuan
Abstract:
Brain activity can be measured by acquiring and analyzing EEG signals from an individual. In fact, the human brain's response to external and internal stimuli is mapped in its EEG signals. Over the years, methods such as the Fourier transform, wavelet transform, and empirical mode decomposition have been used to analyze EEG signals in order to find the effect of stimuli, especially external stimuli. But each of these methods has weak points in the analysis of EEG signals. For instance, the Fourier transform and wavelet transform are linear signal analysis methods, which makes them poorly suited to EEG signals, which are nonlinear. In this research, we analyze the brain response to visual stimuli by extracting information in the form of various measures from EEG signals, using software developed by our research group. The measures used are Jeffrey's measure, fractal dimension, and the Hurst exponent. The results of these analyses are useful not only for a fundamental understanding of the brain's response to visual stimuli but also provide very good recommendations for clinical purposes.
Keywords: visual stimuli, brain response, EEG signal, fractal dimension, Hurst exponent, Jeffrey's measure
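The abstract names the Hurst exponent as one of the extracted measures. A minimal rescaled-range (R/S) sketch of how such an exponent can be estimated from a signal is given below; this is an illustrative reconstruction, not the authors' implementation, and the window sizes are assumptions:

```python
import math
import random
from statistics import mean, pstdev

def hurst_rs(signal, window_sizes=(8, 16, 32, 64, 128)):
    """Estimate the Hurst exponent of a 1-D signal via rescaled-range (R/S) analysis."""
    log_n, log_rs = [], []
    for n in window_sizes:
        rs_values = []
        # split the signal into non-overlapping windows of length n
        for start in range(0, len(signal) - n + 1, n):
            w = signal[start:start + n]
            m = mean(w)
            z, acc = [], 0.0
            for x in w:                      # cumulative deviation from the window mean
                acc += x - m
                z.append(acc)
            s = pstdev(w)
            if s > 0:
                rs_values.append((max(z) - min(z)) / s)
        if rs_values:
            log_n.append(math.log(n))
            log_rs.append(math.log(mean(rs_values)))
    # Hurst exponent = least-squares slope of log(R/S) against log(n)
    mx, my = mean(log_n), mean(log_rs)
    return sum((x - mx) * (y - my) for x, y in zip(log_n, log_rs)) / \
           sum((x - mx) ** 2 for x in log_n)

random.seed(0)
noise = [random.gauss(0.0, 1.0) for _ in range(1024)]
h = hurst_rs(noise)  # close to 0.5 for uncorrelated noise
```

An exponent near 0.5 indicates uncorrelated noise, above 0.5 persistent (trending) behavior, and below 0.5 anti-persistent behavior, which is what makes it a useful summary of EEG dynamics.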
Procedia PDF Downloads 562
3316 The Analysis of Brain Response to Auditory Stimuli through EEG Signals’ Non-Linear Analysis
Authors: H. Namazi, H. T. N. Kuan
Abstract:
Brain activity can be measured by acquiring and analyzing EEG signals from an individual. In fact, the human brain's response to external and internal stimuli is mapped in its EEG signals. Over the years, methods such as the Fourier transform, wavelet transform, and empirical mode decomposition have been used to analyze EEG signals in order to find the effect of stimuli, especially external stimuli. But each of these methods has weak points in the analysis of EEG signals. For instance, the Fourier transform and wavelet transform are linear signal analysis methods, which makes them poorly suited to EEG signals, which are nonlinear. In this research, we analyze the brain response to auditory stimuli by extracting information in the form of various measures from EEG signals, using software developed by our research group. The measures used are Jeffrey's measure, fractal dimension, and the Hurst exponent. The results of these analyses are useful not only for a fundamental understanding of the brain's response to auditory stimuli but also provide very good recommendations for clinical purposes.
Keywords: auditory stimuli, brain response, EEG signal, fractal dimension, Hurst exponent, Jeffrey's measure
Procedia PDF Downloads 534
3315 FRATSAN: A New Software for Fractal Analysis of Signals
Authors: Hamidreza Namazi
Abstract:
Fractal analysis assesses the fractal characteristics of data. It consists of several methods for assigning fractal characteristics to a dataset, which may be a theoretical dataset or a pattern or signal extracted from phenomena including natural geometric objects, sound, market fluctuations, heart rates, digital images, molecular motion, networks, etc. Fractal analysis is now widely used in all areas of science. An important limitation of fractal analysis is that arriving at an empirically determined fractal dimension does not necessarily prove that a pattern is fractal; rather, other essential characteristics have to be considered. For this purpose, a Visual C++ based software package called FRATSAN (FRActal Time Series ANalyser) was developed, which extracts information from signals through three measures: fractal dimension, Jeffrey's measure, and the Hurst exponent. After computing these measures, the software plots a graph for each. Besides computing the three measures, the software can classify whether or not a signal is fractal. In fact, the software uses a dynamic method of analysis for all the measures: a sliding window is selected with a length equal to 10% of the total number of data entries and is moved one data entry at a time to obtain all the measures. This makes the computation very sensitive to slight changes in the data, thereby giving the user an acute analysis of the data. To test the performance of this software, a set of EEG signals was given as input and the results were computed and plotted. The software is useful not only for fundamental fractal analysis of signals but also for other purposes. For instance, by analyzing the Hurst exponent plot of an EEG signal from a patient with epilepsy, the onset of a seizure can be predicted by noticing sudden changes in the plot.
Keywords: EEG signals, fractal analysis, fractal dimension, Hurst exponent, Jeffrey's measure
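The dynamic analysis the abstract describes — a window of 10% of the data length, slid one entry at a time, with a measure computed per window — can be sketched as follows (an illustrative sketch, not FRATSAN's code; the variance stand-in and names are assumptions):

```python
from statistics import pvariance

def sliding_measures(signal, measure, window_fraction=0.10):
    """Slide a window of 10% of the data length, one entry at a time,
    computing `measure` on each window position."""
    n = max(2, int(len(signal) * window_fraction))
    return [measure(signal[i:i + n]) for i in range(len(signal) - n + 1)]

# Toy example: population variance as a stand-in for the fractal measures
signal = [0, 1, 0, 1, 5, 6, 5, 6, 0, 1]
profile = sliding_measures(signal, pvariance)
# one value per window position: len(signal) - window_size + 1 values
```

Because the step is a single entry, consecutive windows overlap almost entirely, so any abrupt change in the signal shows up immediately in the resulting profile — the sensitivity the abstract claims.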
Procedia PDF Downloads 469
3314 Peristaltic Transport of a Jeffrey Fluid with Double-Diffusive Convection in Nanofluids in the Presence of Inclined Magnetic Field
Authors: Safia Akram
Abstract:
In this article, the effects of peristaltic transport with double-diffusive convection in nanofluids through an asymmetric channel with different waveforms are presented. Mathematical modelling for two-dimensional and two-directional flows of a Jeffrey fluid model along with double-diffusive convection in nanofluids is given. Exact solutions are obtained for the nanoparticle fraction field, concentration field, temperature field, stream functions, pressure gradient, and pressure rise in terms of axial and transverse coordinates under the restrictions of long wavelength and low Reynolds number. With the help of computational and graphical results, the effects of Brownian motion, thermophoresis, Dufour, Soret, and Grashof numbers (thermal, concentration, nanoparticle) on peristaltic flow patterns with double-diffusive convection are discussed.
Keywords: nanofluid particles, peristaltic flow, Jeffrey fluid, magnetic field, asymmetric channel, different waveforms
Procedia PDF Downloads 384
3313 Entropy-Based Multichannel Stationary Measure for Characterization of Non-Stationary Patterns
Authors: J. D. Martínez-Vargas, C. Castro-Hoyos, G. Castellanos-Dominguez
Abstract:
In this work, we propose a novel approach for measuring the stationarity level of a multichannel time series. The measure is based on a definition of stationarity over the time-varying spectrum, and it aims to quantify the relation between local stationarity (single-channel) and global dynamic behavior (multichannel dynamics). To assess the validity of the proposed approach, we use a well-known EEG-BCI database constructed to separate motor/imagery tasks. Thus, based on the statement that imagination of movements implies an increase in EEG dynamics, we use as discriminant features the proposed measure computed over an estimation of the non-stationary components of the input time series. As a measure of separability we use Student's t-test, and the obtained results show that the measure is able to accurately detect the brain areas, projected on the scalp, where motor tasks are performed.
Keywords: stationary measure, entropy, sub-space projection, multichannel dynamics
Procedia PDF Downloads 414
3312 A Similarity Measure for Classification and Clustering in Image Based Medical and Text Based Banking Applications
Authors: K. P. Sandesh, M. H. Suman
Abstract:
Text processing plays an important role in information retrieval, data mining, and web search. Measuring the similarity between documents is an important operation in the text processing field. In this project, a new similarity measure is proposed. To compute the similarity between two documents with respect to a feature, the proposed measure takes the following three cases into account: (1) the feature appears in both documents; (2) the feature appears in only one document; and (3) the feature appears in neither document. The proposed measure is extended to gauge the similarity between two sets of documents. The effectiveness of our measure is evaluated on several real-world data sets for text classification and clustering problems, especially in the banking and health sectors. The results show that the performance obtained by the proposed measure is better than that achieved by the other measures.
Keywords: document classification, document clustering, entropy, accuracy, classifiers, clustering algorithms
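The three-case structure the abstract describes can be sketched per feature and averaged over a vocabulary. The scoring values below are illustrative assumptions, not the paper's actual formula:

```python
def feature_similarity(a, b):
    """Illustrative per-feature score following the abstract's three cases
    (the particular scoring values are assumptions, not the paper's formula)."""
    if a > 0 and b > 0:        # case 1: feature appears in both documents
        return 1.0 - abs(a - b) / max(a, b)
    if a > 0 or b > 0:         # case 2: feature appears in only one document
        return 0.0
    return 0.5                 # case 3: feature appears in neither document

def doc_similarity(d1, d2, vocab):
    """Average the per-feature scores over a shared vocabulary."""
    scores = [feature_similarity(d1.get(t, 0), d2.get(t, 0)) for t in vocab]
    return sum(scores) / len(scores)

d1 = {"bank": 2, "loan": 1}
d2 = {"bank": 2}
sim = doc_similarity(d1, d2, ["bank", "loan", "rate"])  # (1.0 + 0.0 + 0.5) / 3
```

The key design point is case 3: unlike cosine similarity, joint absence of a feature contributes positive (though weaker) evidence of similarity rather than being ignored.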
Procedia PDF Downloads 518
3311 Entropy Measures on Neutrosophic Soft Sets and Its Application in Multi Attribute Decision Making
Authors: I. Arockiarani
Abstract:
The focus of the paper is to furnish an entropy measure for a neutrosophic set and a neutrosophic soft set, which is a measure of the uncertainty that permeates discourse and systems. Various characterizations of entropy measures are derived. Further, we exemplify this concept by applying entropy to various real-time decision-making problems.
Keywords: entropy measure, Hausdorff distance, neutrosophic set, soft set
Procedia PDF Downloads 257
3310 Free Will and Compatibilism in Decision Theory: A Solution to Newcomb’s Paradox
Authors: Sally Heyeon Hwang
Abstract:
Within decision theory, there are normative principles that dictate how one should act, in addition to empirical theories of actual behavior. As a normative guide to one's actual behavior, evidential or causal decision-theoretic equations allow one to identify outcomes with maximal utility values. The choice that each person makes, however, will, of course, differ according to varying assignments of weight and probability values. Regarding these different choices, it remains a subject of considerable philosophical controversy whether individual subjects have the capacity to exercise free will with respect to the assignment of probabilities, or whether instead the assignment is in some way constrained. A version of this question is given a precise form in Richard Jeffrey's assumption that free will is necessary for Newcomb's paradox to count as a decision problem. This paper will argue, against Jeffrey, that decision theory does not require the assumption of libertarian freedom. One of the hallmarks of decision-making is its application across a wide variety of contexts; the implications of a background assumption of free will are similarly varied. One constant across the contexts of decision is that there are always at least two levels of choice for a given agent, depending on the degree of prior constraint. Within the context of Newcomb's problem, when the predictor attempts to guess the choice the agent will make, he or she analyzes the determined aspects of the agent, such as past characteristics, experiences, and knowledge. On the other hand, as David Lewis' backtracking argument concerning the relationship between past and present events brings to light, there are similarly varied ways in which the past can actually be dependent on the present. One implication of this argument is that even in a deterministic setting, an agent can have more free will than it may seem.
This paper will thus argue against the view that a stable background assumption of free will or determinism in decision theory is necessary, arguing instead for a compatibilist decision theory yielding a novel treatment of Newcomb's problem.
Keywords: decision theory, compatibilism, free will, Newcomb's problem
Procedia PDF Downloads 322
3309 Using a Quantitative Reasoning Framework to Help Students Understand Arc Measure Relationships
Authors: David Glassmeyer
Abstract:
Quantitative reasoning is necessary to robustly understand mathematical concepts ranging from the elementary to the university level. Quantitative reasoning involves identifying and representing quantities and the relationships between those quantities. Without reasoning quantitatively, students often resort to memorizing formulas and procedures, which has negative consequences when they encounter mathematical topics in the future. This study investigated how high school students' quantitative reasoning could be fostered within a unit on arc measure and angle relationships. Arc measure, the measure of a central angle that cuts off a portion of a circle's circumference, is often confused with arc length. In this study, the researcher redesigned an activity to clearly distinguish arc measure from arc length by using a quantitative reasoning framework. Data were collected from high school students to determine if this approach impacted their understanding of these concepts. Initial data indicate the approach was successful in supporting students' quantitative reasoning about these topics. An implication of the work is that teachers themselves may also benefit from considering mathematical definitions from a quantitative reasoning framework and can use this activity in their own classrooms.
Keywords: arc length, arc measure, quantitative reasoning, student content knowledge
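The distinction the abstract draws can be made concrete with the standard relationship: arc measure is an angle (independent of circle size), while arc length also depends on the radius. A short numeric illustration:

```python
import math

def arc_length(radius, arc_measure_deg):
    """Arc length depends on both the arc measure (central angle) and the radius;
    the arc measure alone does not determine it."""
    return (arc_measure_deg / 360.0) * 2.0 * math.pi * radius

# Two arcs with the same 90-degree arc measure but different arc lengths:
small = arc_length(1.0, 90.0)  # quarter of a unit circle: pi/2
large = arc_length(4.0, 90.0)  # same angle, four times the radius: 2*pi
```

Seeing that the two quantities covary only through the radius is exactly the kind of quantity-relationship reasoning the framework targets.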
Procedia PDF Downloads 258
3308 Dissimilarity Measure for General Histogram Data and Its Application to Hierarchical Clustering
Authors: K. Umbleja, M. Ichino
Abstract:
Symbolic data mining has been developed to analyze data in very large datasets. It is also useful in cases when entry-specific details should remain hidden. Symbolic data mining is quickly gaining popularity as the datasets in need of analysis become ever larger. One type of symbolic data is the histogram, which makes it possible to store huge amounts of information in a single variable with a high level of granularity. Other types of symbolic data can also be described as histograms, making the histogram a very important and general symbolic data type: a method developed for histograms can also be applied to other types of symbolic data. Due to its complex structure, analyzing histograms is complicated. This paper proposes a method for comparing two histogram-valued variables and therefore finding the dissimilarity between two histograms. The proposed method uses the Ichino-Yaguchi dissimilarity measure for mixed feature-type data analysis as a base and develops a dissimilarity measure specifically for histogram data, which allows comparison of histograms with different numbers of bins and different bin widths (so-called general histograms). The proposed dissimilarity measure is then used as a measure for clustering. Furthermore, a linkage method based on weighted averages is proposed, with the concept of cluster compactness used to measure the quality of the clustering. The method is then validated on real datasets. As a result, the proposed dissimilarity measure is found to produce adequate and comparable results with general histograms, without loss of detail or the need to transform the data.
Keywords: dissimilarity measure, hierarchical clustering, histograms, symbolic data analysis
Procedia PDF Downloads 162
3307 A Tool to Measure the Usability Guidelines for Arab E-Government Websites
Authors: Omyma Alosaimi, Asma Alsumait
Abstract:
Website developers and designers should follow usability guidelines to provide a user-friendly interface. Using tools to measure usability, an evaluator can automatically evaluate hundreds of links within a few minutes, with the advantage of detecting violations that only machines can detect. Using a usability evaluation tool is therefore important for finding as many violations as possible. There are many website usability testing tools, but none has been developed to measure the usability of e-government websites, let alone Arabic e-government websites. To measure the usability of Arabic e-government websites, a tool is developed and tested in this paper. A comparison between a tool specifically developed for e-government websites and a general usability testing tool is presented.
Keywords: e-government, human computer interaction, usability evaluation, usability guidelines
Procedia PDF Downloads 423
3306 Upon Further Reflection: More on the History, Tripartite Role, and Challenges of the Professoriate
Authors: Jeffrey R. Mueller
Abstract:
This paper expands on the role of the professor by detailing the origins of the profession and adding some of the unique contributions of North American universities, as well as some best-practice recommendations, to the unique tripartite role of the professor. It describes current challenges to the profession, including the ever-controversial student rating of professors. It continues with the significance of empowerment to the role of the professor and concludes with a predictive prescription for the future of the professoriate and the role of the university-level educational administrator toward that end.
Keywords: professoriate history, tripartite role, challenges, empowerment, shared governance, administratization
Procedia PDF Downloads 401
3305 A Look into Surgical Site Infections: Impact of Collective Interventions
Authors: Lisa Bennett, Cynthia Walters, Cynthia Argani, Andy Satin, Geeta Sood, Kerri Huber, Lisa Grubb, Woodrow Noble, Melissa Eichelberger, Darlene Zinalabedini, Eric Ausby, Jeffrey Snyder, Kevin Kirchoff
Abstract:
Background: Surgical site infections (SSIs) within the obstetric population pose a variety of complications, creating clinical and personal challenges for the new mother and her neonate during the postpartum period. Our journey to achieve compliance with the SSI core measure for cesarean sections revealed many opportunities to improve these outcomes. Objective: Achieve and sustain core measure compliance, keeping surgical site infection rates below the national benchmark pooled mean of 1.8% in post-operative patients who delivered via cesarean section at the Johns Hopkins Bayview Medical Center. Methods: A root cause analysis was performed and revealed several environmental, pharmacologic, and clinical practice opportunities for improvement. A multidisciplinary approach led by the OB Safety Nurse, OB Medical Director, and Infectious Disease Department resulted in the implementation of fourteen interventions over a twenty-month period. Interventions included: post-operative dressing changes, standardizing operating room attire, broadening pre-operative antibiotics, initiating vaginal preps, improving operating room terminal cleaning, testing air quality, and re-educating scrub technicians on technique. Results: Prior to the implementation of our interventions, the quarterly SSI rate in Obstetrics peaked at 6.10%. Although no single intervention resulted in dramatic improvement, after implementation of all fourteen interventions, the quarterly SSI rate has subsequently ranged from 0.0% to 2.70%. Significance: Taking an introspective look at current practices can reveal opportunities for improvement which previously were not considered. Collectively, the benefit of these interventions has been a significant decrease in surgical site infection rates.
The impact of this quality improvement project highlights the synergy created when members of the multidisciplinary team work in collaboration to improve patient safety and achieve a high quality of care.
Keywords: cesarean section, surgical site infection, collaboration and teamwork, patient safety, quality improvement
Procedia PDF Downloads 482
3304 Documents Emotions Classification Model Based on TF-IDF Weighting Measure
Authors: Amr Mansour Mohsen, Hesham Ahmed Hassan, Amira M. Idrees
Abstract:
Emotion classification of text documents is applied to reveal whether a document expresses a particular emotion of its writer. While different supervised methods have previously been used for emotion classification of documents, in this research we present a novel model that supports the classification algorithms for more accurate results with the support of the TF-IDF measure. Different experiments have been applied to demonstrate the applicability of the proposed model; the model succeeds in raising the accuracy percentage according to the determined metrics (precision, recall, and F-measure) based on applying refinement of the lexicon, integration of lexicons using different perspectives, and applying the TF-IDF weighting measure over the classifying features. The proposed model has also been compared with other research to prove its competence in raising the results' accuracy.
Keywords: emotion detection, TF-IDF, WEKA tool, classification algorithms
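TF-IDF itself has several variants; one common form — term frequency scaled by the log of inverse document frequency — can be sketched as follows (an illustrative sketch, not the paper's exact weighting; the toy corpus is an assumption):

```python
import math
from collections import Counter

def tf_idf(docs):
    """One common TF-IDF variant: (term count / doc length) * log(N / document frequency)."""
    n = len(docs)
    df = Counter()
    for doc in docs:
        df.update(set(doc))            # count documents containing each term
    weights = []
    for doc in docs:
        tf = Counter(doc)
        weights.append({t: (tf[t] / len(doc)) * math.log(n / df[t]) for t in tf})
    return weights

docs = [["happy", "joy", "joy"], ["sad", "joy"], ["sad", "angry"]]
w = tf_idf(docs)
# "happy" (rare across the corpus) outweighs "joy" (common) in the first document
```

The effect relevant to emotion classification is that corpus-wide common words are down-weighted, so distinctive emotion-bearing terms dominate the feature vector.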
Procedia PDF Downloads 484
3303 Literary Words of Foreign Origin as Social Markers in Jeffrey Archer’s Novels Speech Portrayals
Authors: Tatiana Ivushkina
Abstract:
The paper is aimed at studying the use of literary words of foreign origin in modern fiction from a sociolinguistic point of view, which presupposes establishing a correlation between this category of words in a speech portrayal or narrative and the social status of the speaker, verifying that it bears social implications and serves as a social marker or index of a socially privileged identity in British literature of the 21st century. To this end, literary words of foreign origin were selected in context (60 contexts) and subjected to careful examination. The study is carried out on two novels by Jeffrey Archer – Not a Penny More, Not a Penny Less and A Prisoner of Birth – who, being a graduate of Oxford, represents the socially privileged classes himself and gives a wide depiction of characters with different social backgrounds and statuses. The analysis of the novels enabled us to categorize the selected words into four relevant groups. The first, represented by terms (commodity, debenture, recuperation, syringe, luminescence, umpire, etc.), serves to unambiguously indicate the education, occupation, or field of knowledge in which a character is involved, or the situation of communication. The second group is formed of words used in conjunction with their Germanic counterparts (perspiration – sweat, padre – priest, convivial – friendly) to contrast the social positions of the characters: literary words serve as social indices of upper-class speakers, whereas their synonyms of Germanic origin characterize middle- or lower-class speech portrayals. The third class comprises socially marked words (verbs, nouns, and adjectives), or U-words (the term first coined by Alan Ross and Nancy Mitford), a status acquired in the course of social history (elegant, excellent, sophistication, authoritative, preposterous, etc.).
The fourth includes words used in a humorous or ironic meaning to convey the narrator's attitude to the characters or the situation itself (ministrations, histrionic, etc.). Words of this group are perceived as 'alien' and stylistically distant, as they create incongruity between style and subject matter. The social implication of the selected words is enhanced by the French words and phrases that often accompany them.
Keywords: British literature of the XXI century, literary words of foreign origin, social context, social meaning
Procedia PDF Downloads 134
3302 A Review of Physiological Measures for Cognitive Workload Assessment of Aircrew
Authors: Naveed Tahir, Adnan Maqsood
Abstract:
Cognitive workload is a significant factor affecting user performance, and it has been broadly investigated for its application in ergonomics as well as in designing and optimizing effective human-machine interactions. Maneuvering an aircraft is mentally challenging: pilots must control the aircraft while responding adequately to verbal-auditory stimuli. Several physiological measures have long been researched and used to demonstrate cognitive workload. In the current study, we summarize recent findings on the effectiveness, accuracy, and applicability of commonly used physiological measures in evaluating cognitive workload, and we highlight advancements in physiological measures. The strengths and limitations of physiological measures for assessing the cognitive workload of people, especially aircrews, in laboratory settings and real-time situations are also discussed. We present the research findings on physiological measures in order to base suggestions on the proper applications of the measures and the settings demanding the use of a single measure or combinations of measures.
Keywords: aircrew, cognitive workload, subjective measure, physiological measure, performance measure
Procedia PDF Downloads 162
3301 A Study on the Acquisition of Chinese Classifiers by Vietnamese Learners
Authors: Quoc Hung Le Pham
Abstract:
In the field of language study, the classifier is an interesting research feature. Among the world's languages, some have a classifier system and some do not. Mandarin Chinese and Vietnamese both have rich classifier systems; however, because of differences in language system, cognition, and culture, the syntactic structures of their classifiers are also dissimilar. Mandarin Chinese classifiers must collocate with nouns or verbs; as a lexical category they are unlike nouns or verbs, which belong to the open class. Some scholars, however, believe that Mandarin Chinese measure words are similar to those of English and other Indo-European languages: attached through structure and word formation (as suffixes), they form a closed class. Asian languages such as Chinese, Vietnamese, and Thai belong to the second type of classifier language: in this type, the classifier must in most cases be present, appearing together with a deictic, anaphoric, or quantity expression, not separated from the noun it modifies; such languages are also known as numeral classifier languages. The main syntactic structures of Chinese classifiers are as follows: 'quantity + measure + noun', 'pronoun + measure + noun', 'pronoun + quantity + measure + noun', 'prefix + quantity + measure + noun', 'quantity + adjective + measure + noun', 'quantity (whole number above 10) + duo (多) + measure + noun', 'quantity (around 10) + measure + duo (多) + noun'. The main syntactic structures of Vietnamese classifiers are: 'quantity + measure + noun', 'measure + noun + pronoun', 'quantity + measure + noun + pronoun', 'measure + noun + prefix + quantity', 'quantity + measure + noun + adjective', 'duo (多) + quantity + measure + noun', 'quantity + measure + adjective + pronoun (the quantity cannot be 1)', 'measure + adjective + pronoun', 'measure + pronoun'. In daily life, classifiers are commonly used; if Chinese learners fail to use this category correctly, their verbal communication may suffer.
The richness of the Chinese classifier system contributes to the complexity of its study by foreign learners, especially in the interlanguage of Vietnamese learners. As mentioned above, the Vietnamese language also has a rich system of classifiers; the basic structural orders of the two languages are similar, but differences remain. These similarities and dissimilarities between the Chinese and Vietnamese classifier systems contribute significantly to the common errors made by Vietnamese students while they acquire Chinese, which are distinct from the errors made by students from other language backgrounds. From a comparative linguistic perspective, this article examines the classifiers commonly used in Chinese and Vietnamese in two aspects: semantics and structural form. This comparative study aims to identify the negative transfer from the mother tongue that Vietnamese students may face while learning Chinese classifiers and, through analysis of a classifier questionnaire, to find the causes and patterns of the errors they make. Preliminary analysis shows that Vietnamese students learning Chinese classifiers made errors such as: overuse of the classifier 'ge' (个); misuse of other classifiers, e.g. '*yi zhang ri ji' (for 'yi pian ri ji'), '*yi zuo fang zi' (for 'yi jian fang zi'), '*si zhang jin pai' (for 'si mei jin pai'); and confusion of near-synonymous measure words 'dui, shuang, fu, tao' (对、双、副、套) and 'ke, li' (颗、粒).
Keywords: acquisition, classifiers, negative transfer, Vietnamese learners
Procedia PDF Downloads 454
3300 A Modified Shannon Entropy Measure for Improved Image Segmentation
Authors: Mohammad A. U. Khan, Omar A. Kittaneh, M. Akbar, Tariq M. Khan, Husam A. Bayoud
Abstract:
The Shannon entropy measure has been widely used for measuring uncertainty. In practical settings, however, a histogram is used to estimate the underlying distribution, and the histogram depends on the number of bins used. In this paper, a modification is proposed that makes histogram-based Shannon entropy consistent. To demonstrate the benefits, two applications are picked from medical image processing. Simulations are carried out to show the superiority of the modified measure for the image segmentation problem. The improvement may be attributed to the robustness it shows to uneven backgrounds in images.
Keywords: Shannon entropy, medical image processing, image segmentation, modification
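The bin-dependence problem the abstract targets is easy to exhibit: the same data yield different entropy estimates under different bin counts. A minimal sketch of the plain (unmodified) histogram estimator, with an illustrative dataset of our own choosing:

```python
import math
from collections import Counter

def histogram_entropy(data, bins):
    """Shannon entropy estimated from a histogram; the estimate varies with
    the number of bins, which is the inconsistency the paper addresses."""
    lo, hi = min(data), max(data)
    width = (hi - lo) / bins or 1.0          # guard against zero-range data
    counts = Counter(min(int((x - lo) / width), bins - 1) for x in data)
    n = len(data)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

data = [0.1, 0.2, 0.35, 0.5, 0.6, 0.8, 0.9, 1.0]
h4 = histogram_entropy(data, 4)
h8 = histogram_entropy(data, 8)
# same data, different bin counts, different entropy estimates
```

Since the estimate drifts upward as bins multiply, any thresholding rule built on raw histogram entropy inherits the binning choice — which is the motivation for a consistent modification.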
Procedia PDF Downloads 497
3299 On Generalized Cumulative Past Inaccuracy Measure for Marginal and Conditional Lifetimes
Authors: Amit Ghosh, Chanchal Kundu
Abstract:
Recently, the notion of the cumulative past inaccuracy (CPI) measure has been proposed in the literature as a generalization of cumulative past entropy (CPE) in both the univariate and bivariate setups. In this paper, we introduce the notion of CPI of order α (alpha) and study the proposed measure for conditionally specified models of two components that failed at different time instants, called generalized conditional CPI (GCCPI). We provide some bounds using the usual stochastic order and investigate several properties of GCCPI. The effect of monotone transformations on the proposed measure has also been examined. Furthermore, we characterize some bivariate distributions under the assumption of the conditional proportional reversed hazard rate model. Moreover, the role of GCCPI in reliability modeling has also been investigated for a real-life problem.
Keywords: cumulative past inaccuracy, marginal and conditional past lifetimes, conditional proportional reversed hazard rate model, usual stochastic order
Procedia PDF Downloads 254
3298 A Cohort and Empirical Based Multivariate Mortality Model
Authors: Jeffrey Tzu-Hao Tsai, Yi-Shan Wong
Abstract:
This article proposes a cohort-age-period (CAP) model to characterize multi-population mortality processes using cohort, age, and period variables. Distinct from factor-based Lee-Carter-type decomposition mortality models, this approach is empirically based and includes the age, period, and cohort variables in the equation system. The model not only provides fruitful intuition for explaining multivariate mortality change rates but also performs better in forecasting future patterns. Using US and UK mortality data and performing ten-year out-of-sample tests, our approach shows smaller mean square errors in both countries compared to the models in the literature.
Keywords: longevity risk, stochastic mortality model, multivariate mortality rate, risk management
Procedia PDF Downloads 56
3297 Maximizing Coverage with Mobile Crime Cameras in a Stochastic Spatiotemporal Bipartite Network
Authors: (Ted) Edward Holmberg, Mahdi Abdelguerfi, Elias Ioup
Abstract:
This research details a coverage measure for evaluating the effectiveness of observer node placements in a spatial bipartite network. This coverage measure can be used to optimize the configuration of stationary or mobile spatially oriented observer nodes, or a hybrid of the two, over time in order to fully utilize their capabilities. To demonstrate the practical application of this approach, we construct a SpatioTemporal Bipartite Network (STBN) using real-time crime center (RTCC) camera nodes and NOPD calls-for-service (CFS) event nodes from New Orleans, LA (NOLA). We use the coverage measure to identify optimal placements for moving mobile RTCC camera vans to improve coverage of vulnerable areas based on temporal patterns.
Keywords: coverage measure, mobile node dynamics, Monte Carlo simulation, observer nodes, observable nodes, spatiotemporal bipartite knowledge graph, temporal spatial analysis
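The abstract does not give the coverage measure's exact definition, but its simplest form is the fraction of event nodes lying within the observation radius of at least one observer node. The sketch below illustrates that idea; the node coordinates, the radius threshold, and all names are illustrative assumptions, not the paper's actual formulation.

```python
from math import hypot

def coverage(observers, events, radius):
    """Fraction of event nodes within `radius` of at least one observer node.

    A simplified stand-in for the coverage measure described in the abstract;
    real STBN coverage would also weight nodes by time and event type.
    """
    if not events:
        return 0.0
    covered = sum(
        1 for ev in events
        if any(hypot(ev[0] - ob[0], ev[1] - ob[1]) <= radius for ob in observers)
    )
    return covered / len(events)

# Hypothetical camera-van positions and call-for-service locations
cameras = [(0.0, 0.0), (5.0, 5.0)]
calls = [(1.0, 1.0), (4.0, 6.0), (10.0, 10.0)]
print(coverage(cameras, calls, radius=2.0))  # 2 of 3 events covered
```

Optimizing van placement then amounts to searching candidate observer configurations for the one maximizing this fraction, e.g. via the Monte Carlo simulation the keywords mention.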
Procedia PDF Downloads 116
3296 Study on Concentration and Temperature Measurement with 760 nm Diode Laser in Combustion System Using Tunable Diode Laser Absorption Spectroscopy
Authors: Miyeon Yoo, Sewon Kim, Changyeop Lee
Abstract:
It is important to measure the internal temperature or temperature distribution precisely in a combustion system in order to increase energy efficiency and reduce pollutants. This is especially difficult in large combustion systems such as power plant boilers and the reheating furnaces of steel-making processes. Tunable diode laser absorption spectroscopy (TDLAS) measurement and analysis is an attractive method for overcoming this difficulty. In this paper, TDLAS methods are used to measure the oxygen concentration and temperature distribution under various experimental conditions.
Keywords: tunable diode laser absorption spectroscopy, temperature distribution, gas concentration
Procedia PDF Downloads 387
3295 Measure-Valued Solutions to a Class of Nonlinear Parabolic Equations with Degenerate Coercivity and Singular Initial Data
Authors: Flavia Smarrazzo
Abstract:
Initial-boundary value problems for nonlinear parabolic equations having a Radon measure as initial data have been widely investigated, looking for solutions which for positive times take values in some function space. On the other hand, if the diffusivity degenerates too fast at infinity, it is well known that function-valued solutions may not exist, singularities may persist, and it is very natural to consider solutions which, roughly speaking, for positive times describe an orbit in the space of finite Radon measures. In this general framework, our purpose is to introduce a concept of measure-valued solution which is consistent with respect to regularizing and smoothing approximations, in order to develop an existence theory which depends neither on the level of degeneracy of the diffusivity at infinity nor on the choice of the initial measures. In more detail, we prove existence of suitably defined measure-valued solutions to the homogeneous Dirichlet initial-boundary value problem for a class of nonlinear parabolic equations without strong coerciveness. Moreover, we also discuss some qualitative properties of the constructed solutions concerning the evolution of their singular part, including conditions (depending both on the initial data and on the strength of degeneracy) under which the constructed solutions are in fact function-valued or not.
Keywords: degenerate parabolic equations, measure-valued solutions, Radon measures, Young measures
Procedia PDF Downloads 284
3294 Decision-Making, Student Empathy, and Cold War Historical Events: A Case Study of Abstract Thinking through Content-Centered Learning
Authors: Jeffrey M. Byford
Abstract:
Students' decision-making about historical events rarely conforms to uniform beliefs. When presented with the opportunity, many students have differing opinions and rationales regarding historical events and outcomes. The intent of this paper was to provide students with the economic, social, and political dilemmas associated with the autonomy of East Berlin. Students ranked seven possible actions from most to least acceptable. In addition, students were required to provide both positive and negative factors for each decision and its relative ranking. Results from this activity suggested that while most students chose a financial action towards West Berlin, some students had trouble justifying their actions.
Keywords: content-centered learning, Cold War, Berlin, decision-making
Procedia PDF Downloads 456
3293 The Use of AI to Measure Gross National Happiness
Authors: Riona Dighe
Abstract:
This research attempts to identify an alternative approach to the measurement of Gross National Happiness (GNH). It uses artificial intelligence (AI), incorporating natural language processing (NLP) and sentiment analysis, to measure GNH. We use off-the-shelf NLP models for sentence-level sentiment analysis as building blocks for this research. We constructed an algorithm using these NLP models to derive a sentiment analysis score for sentences, and tested it against a sample of 20 respondents; the scores generated resembled human responses. Using an MLP classifier, a decision tree, a linear model, and k-nearest neighbors, we obtained test accuracies of 89.97%, 54.63%, 52.13%, and 47.9%, respectively. This gave us the confidence to use the NLP models on sentences from websites to measure the GNH of a country.
Keywords: artificial intelligence, NLP, sentiment analysis, gross national happiness
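The aggregation step is the simplest part of the pipeline: per-sentence sentiment scores are averaged and rescaled into an index. The sketch below shows only that step; a tiny hand-written lexicon stands in for the off-the-shelf NLP model described in the abstract, and the word lists, scaling to [0, 100], and all names are illustrative assumptions.

```python
# Stand-in lexicon; a real pipeline would use a trained sentiment model.
POSITIVE = {"happy", "good", "great", "hopeful", "satisfied"}
NEGATIVE = {"sad", "bad", "worried", "angry", "tired"}

def sentence_sentiment(sentence):
    """Score one sentence in [-1, 1] from lexicon hits (illustrative only)."""
    words = sentence.lower().split()
    pos = sum(w in POSITIVE for w in words)
    neg = sum(w in NEGATIVE for w in words)
    return 0.0 if pos == neg else (pos - neg) / (pos + neg)

def gnh_index(sentences):
    """Aggregate per-sentence sentiment into a 0-100 GNH-style index."""
    if not sentences:
        return 50.0  # neutral when there is nothing to score
    mean = sum(sentence_sentiment(s) for s in sentences) / len(sentences)
    return 50.0 * (1.0 + mean)  # map [-1, 1] onto [0, 100]

print(gnh_index(["I feel happy and hopeful", "I am worried"]))  # 50.0
```

Swapping the lexicon scorer for a model-backed one leaves `gnh_index` unchanged, which is presumably why off-the-shelf sentiment models can serve as interchangeable building blocks.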
Procedia PDF Downloads 124
3292 Optimization of the Measure of Compromise as a Version of Sorites Paradox
Authors: Aleksandar Hatzivelkos
Abstract:
The term "compromise" is mostly used casually within social choice theory. It usually denotes a mere result of the social choice function, which omits its deeper meaning and ramifications. This paper is based on a mathematical model that describes compromise as a version of the Sorites paradox. It introduces a formal definition of the d-measure of divergence from a compromise and models a notion of compromise that is often used only colloquially. Such a model of the vagueness phenomenon, which lies at the core of the notion of compromise, enables the introduction of new mathematical structures. Different methods can be used to maximize compromise. In this paper, we explore properties of the social welfare function TdM (from Total d-Measure), defined as the function which minimizes the total sum of d-measures of divergence over all possible linear orderings. We prove that TdM satisfies the strict Pareto principle and behaves well asymptotically. Furthermore, we show that for certain domain restrictions, TdM satisfies positive responsiveness and IIIA (intense independence of irrelevant alternatives), thus being equivalent to the Borda count on such domain restrictions. This result opens new opportunities in social choice, especially when there is an emphasis on compromise in the decision-making process.
Keywords: Borda count, compromise, measure of divergence, minimization
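The minimization that defines TdM can be sketched concretely. Since the abstract does not specify the d-measure, Kendall's tau distance is used below as an illustrative stand-in, which turns the sketch into brute-force Kemeny-style aggregation (exponential in the number of candidates); all names and the example profile are assumptions.

```python
from itertools import permutations

def kendall_tau(order_a, order_b):
    """Number of candidate pairs the two linear orders rank oppositely."""
    pos_a = {c: i for i, c in enumerate(order_a)}
    pos_b = {c: i for i, c in enumerate(order_b)}
    cands = list(order_a)
    return sum(
        1
        for i in range(len(cands))
        for j in range(i + 1, len(cands))
        if (pos_a[cands[i]] - pos_a[cands[j]]) * (pos_b[cands[i]] - pos_b[cands[j]]) < 0
    )

def tdm_order(profile):
    """Linear order minimizing the total distance to all voters' rankings.

    Kendall's tau stands in for the paper's d-measure of divergence;
    the search enumerates all orderings, so it only suits small examples.
    """
    candidates = profile[0]
    return min(
        permutations(candidates),
        key=lambda order: sum(kendall_tau(order, ballot) for ballot in profile),
    )

profile = [("a", "b", "c"), ("a", "c", "b"), ("b", "a", "c")]
print(tdm_order(profile))  # ('a', 'b', 'c')
```

The equivalence to Borda count claimed for restricted domains would show up here as both rules returning the same ordering on those profiles.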
Procedia PDF Downloads 135
3291 A Similarity/Dissimilarity Measure to Biological Sequence Alignment
Authors: Muhammad A. Khan, Waseem Shahzad
Abstract:
Analysis of protein sequences is carried out to discover their structural and ancestral relationships. Sequence similarity indicates similar protein structure and function and supports homology detection. Biological sequences composed of amino acid residues or nucleotides provide significant information through sequence alignment. In this paper, we present a new similarity/dissimilarity measure for sequence alignment based on the primary structure of a protein. The approach finds the distance between two given sequences using a novel sequence alignment algorithm and a mathematical model. The algorithm runs in O(n²) time. A distance matrix is generated to construct a phylogenetic tree of different species. The new similarity/dissimilarity measure outperforms other existing methods.
Keywords: alignment, distance, homology, mathematical model, phylogenetic tree
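The abstract does not give the paper's scoring model, but the standard O(n²) dynamic-programming alignment distance shows the shape of such an algorithm. The sketch below uses unit gap and mismatch penalties (plain edit distance) as a stand-in; the penalties and function names are illustrative assumptions.

```python
def alignment_distance(seq_a, seq_b, gap=1, mismatch=1):
    """Global-alignment distance between two sequences via dynamic
    programming, O(len(a) * len(b)) time -- O(n^2) for equal lengths.

    Unit gap/mismatch penalties make this plain edit distance, a stand-in
    for the paper's unspecified similarity/dissimilarity measure.
    """
    n, m = len(seq_a), len(seq_b)
    prev = [j * gap for j in range(m + 1)]  # distance to empty prefix
    for i in range(1, n + 1):
        curr = [i * gap] + [0] * m
        for j in range(1, m + 1):
            sub = prev[j - 1] + (0 if seq_a[i - 1] == seq_b[j - 1] else mismatch)
            curr[j] = min(sub, prev[j] + gap, curr[j - 1] + gap)
        prev = curr
    return prev[m]

print(alignment_distance("ACGT", "AGT"))  # one deletion -> 1
```

Applying this pairwise over a set of species' sequences yields the distance matrix from which a phylogenetic tree can then be built (e.g., by neighbor joining).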
Procedia PDF Downloads 178
3290 New Approaches for the Handwritten Digit Image Features Extraction for Recognition
Authors: U. Ravi Babu, Mohd Mastan
Abstract:
The present paper proposes a novel approach for a handwritten digit recognition system. It extracts digit image features based on a distance measure and derives an algorithm to classify the digit images. The distance measure is computed on the thinned image; thinning is one of the preprocessing techniques in image processing. The paper mainly concentrates on the extraction of features from the digit image for effective recognition of the numeral. The effectiveness of the proposed method was tested on the MNIST, CENPARMI, and CEDAR databases and on newly collected data. The proposed method was applied to more than one lakh (100,000) digit images and achieved good comparative recognition results, with a recognition rate of about 97.32%.
Keywords: handwritten digit recognition, distance measure, MNIST database, image features
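The abstract does not describe the feature set, so the sketch below shows one generic distance-based scheme: foreground pixels of a thinned binary image are binned by their distance from the image centroid, giving a small feature vector. The binning scheme, bin count, and all names are illustrative assumptions, not the paper's method.

```python
from math import hypot

def distance_features(grid, n_bins=4):
    """Distance-based features from a (thinned) binary digit image.

    Foreground pixels are binned by distance from the centroid and the
    histogram normalized -- an illustrative stand-in for the paper's
    unspecified distance-measure features.
    """
    pixels = [(r, c) for r, row in enumerate(grid) for c, v in enumerate(row) if v]
    if not pixels:
        return [0.0] * n_bins
    cr = sum(r for r, _ in pixels) / len(pixels)   # centroid row
    cc = sum(c for _, c in pixels) / len(pixels)   # centroid column
    dists = [hypot(r - cr, c - cc) for r, c in pixels]
    d_max = max(dists) or 1.0                      # guard: all pixels at centroid
    bins = [0] * n_bins
    for d in dists:
        bins[min(int(n_bins * d / d_max), n_bins - 1)] += 1
    return [b / len(pixels) for b in bins]         # normalized histogram

digit = [
    [0, 1, 0],
    [1, 0, 1],
    [0, 1, 0],
]
print(distance_features(digit))  # all pixels equidistant -> mass in last bin
```

A classifier then compares these feature vectors, e.g. by nearest-neighbor distance to labeled training digits.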
Procedia PDF Downloads 462
3289 Market Illiquidity and Pricing Errors in the Term Structure of CDS
Authors: Lidia Sanchis-Marco, Antonio Rubia, Pedro Serrano
Abstract:
This paper studies the informational content of pricing errors in the term structure of sovereign CDS spreads. The residuals from a no-arbitrage model are employed to construct a price discrepancy estimate, or noise measure. The noise estimate is understood as an indicator of market distress and reflects frictions such as illiquidity. Empirically, the noise measure is computed for an extensive panel of CDS spreads. Our results reveal that an important fraction of systematic risk is not priced in default swap contracts. When projecting the noise measure onto a set of financial variables, the panel-data estimates show that greater price discrepancies are systematically related to a higher level of offsetting transactions in CDS contracts. This evidence suggests that arbitrage capital flows exit the marketplace during times of distress, which is consistent with market segmentation among investors and arbitrageurs in which professional arbitrageurs are particularly ineffective at bringing prices to their fundamental values during turbulent periods. Our empirical findings are robust across the most common CDS pricing models employed in the industry.
Keywords: credit default swaps, noise measure, illiquidity, capital arbitrage
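A common way to collapse model residuals across the term structure into a single noise number is a root-mean-square deviation between observed and model-implied spreads. The sketch below shows that aggregation step only; the tenors, spread values, and function names are illustrative assumptions, not the paper's data or exact estimator.

```python
from math import sqrt

def noise_measure(market_spreads, model_spreads):
    """Root-mean-square deviation (in the spreads' own units) between
    observed CDS spreads and model-implied spreads across the term
    structure -- a simplified version of the price-discrepancy measure
    described in the abstract.
    """
    errors = [mkt - mdl for mkt, mdl in zip(market_spreads, model_spreads)]
    return sqrt(sum(e * e for e in errors) / len(errors))

# Spreads in basis points at the 1y, 3y, 5y, 7y, 10y tenors (illustrative)
market = [110.0, 130.0, 150.0, 158.0, 165.0]
model = [108.0, 131.0, 146.0, 160.0, 163.0]
print(noise_measure(market, model))
```

Computed daily per sovereign, such a series can then be projected onto financial variables in a panel regression, as the abstract describes.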
Procedia PDF Downloads 569