Search results for: complexity measure
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 4700

4520 An Approach for Multilayered Ecological Networks

Authors: N. F. F. Ebecken, G. C. Pereira

Abstract:

Although networks provide a powerful approach to the study of a wide variety of ecological systems, their formulation usually does not include various types of interactions, interactions that vary in space and time, and interconnected systems of networks. The emerging field of 'multilayer networks' provides a natural framework for extending ecological systems analysis to include these multiple layers of complexity, as it specifically allows for differentiation and modeling of intralayer and interlayer connectivity. The framework provides a set of concepts and tools that can be adapted and applied to ecology, facilitating research on high-dimensional, heterogeneous systems in nature. Here, ecological multilayer networks are formally defined based on a review of prior and related approaches; their application and potential are illustrated with existing data analyses, and limitations, challenges, and future applications are discussed. The integration of multilayer network theory into ecology offers a largely untapped potential to further address ecological complexity and to provide new theoretical and empirical insights into the architecture and dynamics of ecological systems.
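
As an illustration of the kind of data structure the framework implies, the sketch below encodes a small two-layer ecological network with both intralayer and interlayer edges; the species names, layers, and weights are hypothetical and only show how the two kinds of connectivity can be kept distinct.

```python
# Minimal sketch of a multilayer ecological network; species, layers, and
# interaction weights are hypothetical, not taken from the study.
import networkx as nx

G = nx.Graph()

# Nodes are (species, layer) pairs, so the same species can appear in several layers.
layers = {"herbivory": ["plant_A", "insect_B"], "pollination": ["plant_A", "bee_C"]}
for layer, species in layers.items():
    for sp in species:
        G.add_node((sp, layer), layer=layer)

# Intralayer edges: interactions within a single layer.
G.add_edge(("plant_A", "herbivory"), ("insect_B", "herbivory"), kind="intralayer", weight=0.7)
G.add_edge(("plant_A", "pollination"), ("bee_C", "pollination"), kind="intralayer", weight=0.4)

# Interlayer edge: couples the two state nodes of the same physical species.
G.add_edge(("plant_A", "herbivory"), ("plant_A", "pollination"), kind="interlayer", weight=1.0)

# Layer-aware degree, a simple example of a measure that distinguishes edge types.
for node in G.nodes:
    intra = sum(1 for _, _, d in G.edges(node, data=True) if d["kind"] == "intralayer")
    inter = sum(1 for _, _, d in G.edges(node, data=True) if d["kind"] == "interlayer")
    print(node, "intralayer degree:", intra, "interlayer degree:", inter)
```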

Keywords: ecological networks, multilayered networks, sea ecology, Brazilian Coastal Area

Procedia PDF Downloads 117
4519 Efficient Frontier: Comparing Different Volatility Estimators

Authors: Tea Poklepović, Zdravka Aljinović, Mario Matković

Abstract:

Modern Portfolio Theory (MPT), according to Markowitz, states that investors form mean-variance efficient portfolios which maximize their utility. Markowitz proposed the standard deviation as a simple measure of portfolio risk and the lower semi-variance as the only risk measure of interest to rational investors. This paper uses a third volatility estimator based on intraday data and compares three efficient frontiers on the Croatian stock market. The results show that the range-based volatility estimator outperforms both the mean-variance and the lower semi-variance models.
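
The abstract does not specify which range-based estimator is used; a common choice built from intraday highs and lows is the Parkinson estimator, sketched below with hypothetical prices purely for illustration.

```python
# Sketch of a range-based (Parkinson) volatility estimator; the estimator actually
# used in the paper is not specified here, so this is only one common choice.
import numpy as np

def parkinson_variance(high, low):
    """Daily variance estimate from intraday high/low prices (Parkinson, 1980)."""
    high, low = np.asarray(high, float), np.asarray(low, float)
    return np.mean(np.log(high / low) ** 2) / (4.0 * np.log(2.0))

def close_to_close_variance(close):
    """Classical variance of log returns, for comparison with the range-based estimate."""
    r = np.diff(np.log(np.asarray(close, float)))
    return np.var(r, ddof=1)

# Hypothetical data for a single stock.
high  = [101.2, 102.5, 100.8, 103.1]
low   = [ 99.1, 100.4,  98.9, 101.0]
close = [100.0, 101.7,  99.5, 102.2, 101.1]

print("Parkinson variance:     ", parkinson_variance(high, low))
print("Close-to-close variance:", close_to_close_variance(close))
```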

Keywords: variance, lower semi-variance, range-based volatility, MPT

Procedia PDF Downloads 483
4518 Hardware Implementation and Real-time Experimental Validation of a Direction of Arrival Estimation Algorithm

Authors: Nizar Tayem, AbuMuhammad Moinuddeen, Ahmed A. Hussain, Redha M. Radaydeh

Abstract:

This research paper introduces an approach for estimating the direction of arrival (DOA) of multiple RF noncoherent sources in a uniform linear array (ULA). The proposed method utilizes a Capon-like estimation algorithm and incorporates LU decomposition to enhance the accuracy of DOA estimation while significantly reducing computational complexity compared to existing methods such as the Capon method. Notably, the proposed method does not require prior knowledge of the number of sources. Its effectiveness is validated through both software simulations and practical experimentation on a prototype testbed constructed using a software-defined radio (SDR) platform and GNU Radio software. The results obtained from MATLAB simulations and real-time experiments provide compelling evidence of the proposed method's efficacy.
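
For orientation, the snippet below sketches the standard Capon spatial spectrum for a ULA, the baseline against which the proposed LU-decomposition-based variant is compared; the array geometry, source angles, and noise level are made-up values, and the paper's own reduced-complexity formulation is not reproduced here.

```python
# Sketch of the classical Capon DOA spectrum for a uniform linear array (ULA).
# Array size, source angles, and noise level are illustrative assumptions only.
import numpy as np

M, N, wavelength, d = 8, 200, 1.0, 0.5           # sensors, snapshots, wavelength, spacing
true_angles = np.deg2rad([-20.0, 35.0])          # two noncoherent sources (hypothetical)

def steering(theta):
    return np.exp(-2j * np.pi * d / wavelength * np.arange(M) * np.sin(theta))

# Simulate noncoherent source signals plus noise and form the sample covariance.
rng = np.random.default_rng(0)
A = np.column_stack([steering(t) for t in true_angles])
S = (rng.standard_normal((2, N)) + 1j * rng.standard_normal((2, N))) / np.sqrt(2)
noise = 0.1 * (rng.standard_normal((M, N)) + 1j * rng.standard_normal((M, N))) / np.sqrt(2)
X = A @ S + noise
R = X @ X.conj().T / N

# Capon spectrum: P(theta) = 1 / (a^H R^-1 a), scanned over candidate angles.
R_inv = np.linalg.inv(R)
scan = np.deg2rad(np.arange(-90, 90.5, 0.5))
spectrum = np.array([1.0 / np.real(steering(t).conj() @ R_inv @ steering(t)) for t in scan])

# Simple local-maxima search over the scanning grid; peaks indicate the DOAs.
peaks = [i for i in range(1, len(scan) - 1)
         if spectrum[i] > spectrum[i - 1] and spectrum[i] > spectrum[i + 1]]
peaks = sorted(peaks, key=lambda i: spectrum[i], reverse=True)[:2]
print("Estimated DOAs (deg):", sorted(float(np.rad2deg(scan[i]).round(1)) for i in peaks))
```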

Keywords: DOA estimation, real-time validation, software defined radio, computational complexity, Capon's method, GNU radio

Procedia PDF Downloads 41
4517 Developing Fault Tolerance Metrics of Web and Mobile Applications

Authors: Ahmad Mohsin, Irfan Raza Naqvi, Syda Fatima Usamn

Abstract:

Applications with a higher fault tolerance index are considered more reliable and trustworthy and drive quality. In recent years, application development has shifted from traditional desktop and web software to native and hybrid applications for the web and mobile platforms. With the emergence of the Internet of Things (IoT), cloud, and big data trends, the need for measuring fault tolerance of these inherently complex applications has increased in order to evaluate their performance. There is a significant gap between fault tolerance metrics development and measurement. Classic quality metric models focus on metrics for traditional systems, ignoring the software, hardware, and deployment characteristics of today's applications. In this paper, we propose simple metrics to measure fault tolerance considering general requirements for web and mobile applications. We align factors and sub-factors using GQM for metrics development, considering the nature of web and mobile apps. A systematic mathematical formulation is provided to measure the metrics quantitatively. Three web and mobile applications are selected to measure fault tolerance factors using the formulated metrics. The applications are then analysed on the basis of results from observations in a controlled environment on different mobile devices. Quantitative results are presented depicting fault tolerance in the respective applications.
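
The concrete GQM metrics are developed in the paper itself; as a purely illustrative stand-in, the sketch below computes a simple ratio-style fault tolerance score from counts of injected and gracefully handled faults observed in a test session (the fault categories and counts are hypothetical).

```python
# Hypothetical ratio-style fault tolerance score; this is NOT the paper's GQM
# formulation, only an illustration of how such a metric can be quantified.
def fault_tolerance_score(observations):
    """observations: list of dicts with 'injected' and 'handled' fault counts per sub-factor."""
    handled = sum(o["handled"] for o in observations)
    injected = sum(o["injected"] for o in observations)
    return handled / injected if injected else 1.0

session = [
    {"factor": "network loss",  "injected": 20, "handled": 17},
    {"factor": "invalid input", "injected": 15, "handled": 15},
    {"factor": "low memory",    "injected": 10, "handled": 6},
]
print(f"Fault tolerance index: {fault_tolerance_score(session):.2f}")
```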

Keywords: web and mobile applications, reliability, fault tolerance metric, quality metrics, GQM based metrics

Procedia PDF Downloads 301
4516 Identifying Factors for Evaluating Livability Potential within a Metropolis: A Case of Kolkata

Authors: Arpan Paul, Joy Sen

Abstract:

Livability is a holistic concept whose factors include many complex characteristics and levels of interrelationship among them. It has been considered in terms of people's need for public amenities and is recognized as a major element in creating social welfare. The concept and principles of livability are essential for recognizing the significance of community well-being. The attributes and dimensions of livability are also important aspects for measuring the overall quality of the environment. Livability potential is mainly considered as the capacity of an urban area to develop its overall well-being in the future. The intent of the present study is to identify the prime factors for evaluating livability potential within a metropolis. As a ground-level case study, the paper has selected the Kolkata Metropolitan Area (KMA), as it has wide physical, social, and economic variations within it. The initial part of the study deals with a detailed literature review on livability and the significance of evaluating its potential within a metropolis. The next segment is dedicated to identifying the primary factors which would evaluate livability potential within a metropolis. In pursuit of identifying primary factors which have a direct impact on urban livability, this study delineates the metropolitan area into various clusters, each having its distinct livability potential. As a final outcome of the study, variations in the livability potential of those selected clusters are highlighted to explain the complexity of metropolitan development.

Keywords: Kolkata Metropolitan Area (KMA), livability potential, metropolis, wellbeing

Procedia PDF Downloads 244
4515 Reexamining Contrarian Trades as a Proxy of Informed Trades: Evidence from China's Stock Market

Authors: Dongqi Sun, Juan Tao, Yingying Wu

Abstract:

This paper reexamines the appropriateness of contrarian trades as a proxy of informed trades, using high-frequency Chinese stock data. Employing this measure for 5-minute intervals, a U-shaped intraday pattern of the probability of informed trades (PIN) is found for the CSI300 stocks, which is consistent with previous findings for other markets. However, when dividing the trades into different sizes, a reversed U-shaped PIN from large-sized trades, as opposed to the U-shaped pattern for small- and medium-sized trades, is observed. Drawing from the mixed evidence with different trade sizes, the price impact of trades is further investigated. By examining the relationship between trade imbalances and unexpected returns, large-sized trades are found to have a significant price impact. This implies that in those intervals with large trades, it is non-contrarian trades that are more likely to be informed trades. Taking into account the price impact of large-sized trades, non-contrarian trades are used to proxy for informed trading in those intervals with large trades, and contrarian trades are still used to measure informed trading in other intervals. A stronger U-shaped PIN is demonstrated from this modification. Auto-correlation and information advantage tests for robustness also support the modified informed trading measure.
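
To make the proxy concrete, the sketch below classifies trades in an interval as contrarian or non-contrarian from the signed trade direction and the preceding price move, and computes a simple trade imbalance; the toy trades are invented for illustration and do not reproduce the paper's PIN estimation.

```python
# Sketch of classifying contrarian trades and computing trade imbalance within
# one interval; the data are illustrative assumptions only.
def classify_trades(trades):
    """trades: list of (price_change_before_trade, signed_volume); sign: +buy, -sell."""
    contrarian, noncontrarian, imbalance = 0, 0, 0
    for prior_move, signed_vol in trades:
        # A contrarian trade goes against the preceding price move
        # (buying after a fall, selling after a rise).
        if prior_move * signed_vol < 0:
            contrarian += abs(signed_vol)
        else:
            noncontrarian += abs(signed_vol)
        imbalance += signed_vol
    return contrarian, noncontrarian, imbalance

interval = [(-0.02, +300), (+0.01, +150), (+0.03, -500), (-0.01, -100)]
c, n, imb = classify_trades(interval)
print("contrarian volume:", c, "non-contrarian volume:", n, "trade imbalance:", imb)
```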

Keywords: contrarian trades, informed trading, price impact, trade imbalance

Procedia PDF Downloads 137
4514 Callous-Unemotional Traits in Preschoolers: Distinct Associations with Empathy Subcomponents

Authors: E. Stylianopoulou, A. K. Fanti

Abstract:

Objective: Children scoring high on callous-unemotional (CU) traits exhibit a lack of empathy. More specifically, children scoring high on CU traits appear to exhibit deficits in affective empathy or deficits in other constructs. However, little is known about cognitive empathy and its relation with CU traits in preschoolers. Despite the fact that empathy is measurable at a very young age, relatively little research has focused on empathy in preschoolers compared with older children with CU traits. The present study examines cognitive and affective empathy in preschoolers with CU traits. The aim was to examine the differences between cognitive and affective empathy in those individuals. Based on previous research in children with CU traits, it was hypothesized that preschoolers scoring high in CU traits would show deficits in both cognitive and affective empathy, with more deficits detected in affective empathy than in cognitive empathy. Method: The sample comprised 209 children, of which 109 were male and 100 were female, between the ages of 3 and 7 (M=4.73, SD=0.71). Of those participants, 175 completed all the items. The Inventory of Callous-Unemotional Traits was used to measure CU traits. Moreover, the Griffith Empathy Measure (GEM) Affective Scale and the Griffith Empathy Measure (GEM) Cognitive Scale were used to measure affective and cognitive empathy, respectively. Results: Linear regression was applied to examine the preceding hypotheses. The results showed that, overall, there was a significant moderate negative association between CU traits and empathy. More specifically, there was a significant, moderate, negative relation between CU traits and cognitive empathy. Surprisingly, results indicated that there was no significant relation between CU traits and affective empathy. Conclusion: The current findings support that preschoolers with CU traits show deficits in understanding others' emotions, indicating a significant association between CU traits and cognitive empathy. However, such a relation was not found between CU traits and affective empathy. The current results raise the importance of focusing more on cognitive empathy in preschoolers with CU traits, a component that seems to have been underestimated until now.

Keywords: affective empathy, callous-unemotional traits, cognitive empathy, preschoolers

Procedia PDF Downloads 122
4513 Smart Production Planning: The Case of Aluminium Foundry

Authors: Samira Alvandi

Abstract:

In the context of the circular economy, production planning aims to eliminate waste and emissions and maximize resource efficiency. Historically, production planning has been challenged by uncertainty and complexity arising from the interdependence and variability of products, processes, and systems. Manufacturers worldwide are facing new challenges in tackling various environmental issues such as climate change, resource depletion, and land degradation. To manage this inherent complexity and uncertainty while maintaining profitability, the manufacturing sector needs a holistic framework that supports energy efficiency and carbon emission reduction schemes. The proposed framework addresses the current challenges and integrates simulation modeling with optimization to find a machine-job allocation that maximizes throughput while minimizing total energy consumption and lead time. The aluminium refinery facility in western Sydney, Australia, is used as an exemplar to validate the proposed framework.
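
A minimal sketch of the optimization step is shown below, assuming a brute-force search over machine-job allocations with hypothetical processing times and energy rates; the paper's actual simulation-optimization framework and the foundry data are not reproduced.

```python
# Toy machine-job allocation: enumerate assignments and pick the one that
# minimizes makespan (a lead-time proxy) plus weighted total energy consumption.
# Processing times, energy rates, and the weight alpha are hypothetical.
from itertools import product

jobs = ["casting_1", "casting_2", "casting_3"]
machines = ["furnace_A", "furnace_B"]
proc_time = {("casting_1", "furnace_A"): 4, ("casting_1", "furnace_B"): 5,
             ("casting_2", "furnace_A"): 3, ("casting_2", "furnace_B"): 2,
             ("casting_3", "furnace_A"): 6, ("casting_3", "furnace_B"): 7}
energy_rate = {"furnace_A": 10.0, "furnace_B": 8.0}   # kWh per hour (assumed)
alpha = 0.05                                          # weight on energy vs. makespan

best = None
for assignment in product(machines, repeat=len(jobs)):
    load = {m: 0 for m in machines}
    energy = 0.0
    for job, m in zip(jobs, assignment):
        load[m] += proc_time[(job, m)]
        energy += proc_time[(job, m)] * energy_rate[m]
    objective = max(load.values()) + alpha * energy
    if best is None or objective < best[0]:
        best = (objective, dict(zip(jobs, assignment)), energy)

print("best allocation:", best[1], "| energy (kWh):", best[2])
```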

Keywords: smart production planning, simulation-optimisation, energy aware capacity planning, energy intensive industries

Procedia PDF Downloads 32
4512 Show Products or Show Endorsers: Immersive Visual Experience in Fashion Advertisements on Instagram

Authors: H. Haryati, A. Nor Azura

Abstract:

Since the turn of the century, the advertising landscape has evolved significantly, from print media to digital media. In line with the shift to advanced science and technology that is dramatically reshaping societies in the Fifth Industrial Revolution (IR5.0), technological endeavors have increased exponentially, making user interaction more engaging through online advertising that intentionally leads to buying behavior. Users are more accustomed to interactive content that responds to their actions. Thus, the immersive experience has become a new engagement experience for centennials. The purpose of this paper is to investigate pleasure and arousal as the fundamental elements of consumer emotions and affective responses to marketing stimuli. A quasi-experimental procedure will be adopted in the research, involving 40 undergraduate students in Nilai, Malaysia. The study employs a 2 (celebrity endorser vs. social media influencer) x 2 (high vs. low visual complexity) factorial between-subjects design. Participants will be exposed to a printed version depicting a fashion product endorsed by a celebrity or a social media influencer, presented at high and low levels of visual complexity. A questionnaire distributed during the lab test session will be used to capture honest, real feedback and responses to the latest Instagram design and engagement. The research therefore aims to define the immersive experience on Instagram and the interaction between pleasure and arousal. An advertisement that evokes pleasure and arousal is likely to get more attention from the target audience. This is one of the few studies comparing endorsers in Instagram advertising. Also, this research extends existing knowledge about immersive visual complexity in the context of social media advertising.

Keywords: immersive visual experience, instagram, pleasure, arousal

Procedia PDF Downloads 147
4511 Modified Clusterwise Regression for Pavement Management

Authors: Mukesh Khadka, Alexander Paz, Hanns de la Fuente-Mella

Abstract:

Typically, pavement performance models are developed in two steps: (i) pavement segments with similar characteristics are grouped together to form a cluster, and (ii) the corresponding performance models are developed using statistical techniques. A challenge is to select the characteristics that define clusters and the segments associated with them. If inappropriate characteristics are used, clusters may include homogeneous segments with different performance behavior or heterogeneous segments with similar performance behavior. The prediction accuracy of performance models can be improved by grouping the pavement segments into more uniform clusters by including both characteristics and a performance measure. This grouping is not always possible due to limited information. It is impractical to include all the potentially significant factors because some of them are unobserved or difficult to measure. The historical performance of pavement segments could be used as a proxy to incorporate the effect of the missing potentially significant factors in the clustering process. The current state of the art proposes Clusterwise Linear Regression (CLR) to determine the pavement clusters and the associated performance models simultaneously. CLR incorporates the effect of significant factors as well as a performance measure. In this study, a mathematical program was formulated for CLR models including multiple explanatory variables. Pavement data collected recently over the entire state of Nevada were used. The International Roughness Index (IRI) was used as the pavement performance measure because it serves as a unified standard that is widely accepted for evaluating pavement performance, especially in terms of riding quality. Results illustrate the advantage of using CLR. Previous studies have used CLR with experimental data; this study uses actual field data collected across a variety of environmental, traffic, design, and construction and maintenance conditions.
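
The paper formulates CLR as a mathematical program; the sketch below conveys the underlying idea with a simple alternating heuristic (assign each segment to the cluster whose regression fits it best, then refit each cluster), using synthetic data rather than the Nevada dataset.

```python
# Heuristic clusterwise linear regression: alternate between assigning pavement
# segments to the best-fitting cluster and refitting each cluster's regression.
# Data are synthetic; the paper solves this jointly as a mathematical program.
import numpy as np

rng = np.random.default_rng(1)
n, k = 200, 2
X = np.column_stack([np.ones(n), rng.uniform(0, 10, n)])     # intercept + an assumed feature (e.g., age)
true_coefs = np.array([[1.0, 0.30], [3.0, 0.05]])            # two latent performance behaviours
groups = rng.integers(0, k, n)
y = np.einsum("ij,ij->i", X, true_coefs[groups]) + 0.1 * rng.standard_normal(n)

coefs = rng.standard_normal((k, X.shape[1]))                 # random initial cluster models
for _ in range(50):
    residuals = y[:, None] - X @ coefs.T                     # n x k residual matrix
    labels = np.argmin(residuals ** 2, axis=1)               # assignment step
    for c in range(k):                                       # refitting step (least squares)
        if np.any(labels == c):
            coefs[c], *_ = np.linalg.lstsq(X[labels == c], y[labels == c], rcond=None)

print("recovered cluster coefficients:\n", np.round(coefs, 2))
```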

Keywords: clusterwise regression, pavement management system, performance model, optimization

Procedia PDF Downloads 226
4510 Clinical Use of Opioid Analgesics in China: An Adequacy of Consumption Measure

Authors: Mengjia Zhi, Xingmei Wei, Xiang Gao, Shiyang Liu, Zhiran Huang, Li Yang, Jing Sun

Abstract:

Background: The aim is to understand the consumption trend of opioid analgesics and the consumption adequacy of opioid analgesic treatment for moderate to severe pain in China, as well as China's pain control level from an international perspective. Importance: To the authors' best knowledge, this is the first study in China to measure the adequacy of opioid analgesic treatment for moderate to severe pain considering the disease pattern and the standardized pain treatment guideline. Methods: A retrospective analysis was carried out to show the consumption frequency (defined daily doses, DDDs) of opioid analgesics and its trend in China from 2006 to 2016. The adequacy of consumption measure (ACM) was used to estimate the number of needed morphine equivalents and the overall adequacy of opioid analgesic treatment of moderate to severe pain in China, and the results were compared with international data. Results: The consumption frequency of opioid analgesics (DDDs) in China increased from 13,200,000 DDDs in 2006 to 44,200,000 DDDs in 2016, showing an increasing trend. The growth rate was faster at first, especially in 2013, then slowed down and decreased slightly in 2015. The ACM of China increased from 0.0032 in 2006 to 0.0074 in 2016, with an overall trend of growth. Nonetheless, China's ACM remained at a very poor level throughout 2006-2016. Conclusion: The consumption of opioid analgesics for the treatment of moderate to severe pain in China has always been inadequate. There is a huge gap between China and the international level. There are many reasons behind this problem, lying in different aspects, including medical staff, patients and the public, health systems, and social and cultural factors. It is necessary to strengthen the training and education of medical staff and patients, to use mass media to disseminate scientific knowledge of pain management, to encourage communication between doctors and patients, to improve the regulatory system for controlled medicines and the overall health system, and to balance the regulatory goal of avoiding abuse with the social goal of meeting the people's increasing needs for a better life.
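
The ACM essentially relates actual opioid consumption, expressed in morphine equivalents, to the amount needed to treat moderate to severe pain given the country's disease pattern; the sketch below shows that ratio with entirely hypothetical numbers, since the study's epidemiological inputs are not reproduced here.

```python
# Illustrative Adequacy of Consumption Measure (ACM): actual consumption in
# morphine equivalents divided by the estimated need. All figures are hypothetical.
actual_consumption_mg_morphine_eq = 1.5e9     # assumed annual consumption
estimated_need_mg_morphine_eq     = 2.0e11    # assumed need derived from the disease pattern

acm = actual_consumption_mg_morphine_eq / estimated_need_mg_morphine_eq
print(f"ACM = {acm:.4f}")   # values far below 1 indicate inadequate pain treatment
```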

Keywords: opioid analgesics, adequate consumption measure, pain control, China

Procedia PDF Downloads 182
4509 Algorithms Minimizing Total Tardiness

Authors: Harun Aydilek, Asiye Aydilek, Ali Allahverdi

Abstract:

Total tardiness is a widely used performance measure in the scheduling literature. This performance measure is particularly important in situations where there is a cost to completing a job beyond its due date. The cost of scheduling increases as the gap between a job's due date and its completion time increases. Such costs may also be penalty costs in contracts or loss of goodwill. This performance measure is important because the fulfillment of customers' due dates has to be taken into account while making scheduling decisions. The problem has been addressed in the literature; however, setup times have been assumed to be zero. Even though this assumption may be valid for some environments, it is not valid for others. When setup times are treated as separate from processing times, it is possible to increase machine utilization and to reduce total tardiness. Therefore, non-zero setup times need to be considered separately. A dominance relation is developed and several algorithms are proposed. The developed dominance relation is utilized in the proposed algorithms. Extensive computational experiments are conducted for the evaluation of the algorithms. The experiments indicate that the developed algorithms perform much better than the existing algorithms in the literature. More specifically, one of the newly proposed algorithms reduces the error of the best existing algorithm in the literature by 40 percent.
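
As a reference for the performance measure itself, the short sketch below computes total tardiness for a given job sequence with setup times treated separately from processing times; the dominance relation and the proposed algorithms are not reproduced, and the jobs and times are illustrative.

```python
# Total tardiness of a job sequence when setup times are separate from
# processing times; jobs, times, and due dates are illustrative.
def total_tardiness(sequence, processing, setup, due):
    t, tardiness = 0, 0
    for job in sequence:
        t += setup[job] + processing[job]       # completion time of this job
        tardiness += max(0, t - due[job])       # tardiness contribution
    return tardiness

processing = {"J1": 5, "J2": 3, "J3": 6}
setup      = {"J1": 1, "J2": 2, "J3": 1}
due        = {"J1": 7, "J2": 9, "J3": 18}

print(total_tardiness(["J1", "J2", "J3"], processing, setup, due))  # 2
print(total_tardiness(["J2", "J1", "J3"], processing, setup, due))  # 4
```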

Keywords: algorithm, assembly flowshop, dominance relation, total tardiness

Procedia PDF Downloads 326
4508 Comparing Friction Force Between Track and Spline Using Graphite, MoS2, PTFE, and Silicon Dry Lubricants

Authors: M. De Maaijer, Wenxuan Shi, Dolores Pose, Ditmar, F. Barati

Abstract:

Friction has several detrimental effects on blind performance. Therefore, Ziptrak, as a leading company in the blind manufacturing sector, started investigating how to overcome this problem in the next generation of blinds. The problem is most pronounced in extremely severe conditions; although Ziptrak recommends not operating the blind in such conditions, improving the blind and its associated parts remained a priority for the company. The purpose of this article is to measure the effect of the lubrication process on reducing the friction force between spline and track, especially in windy conditions. Four different lubricants were applied to measure their efficiency in reducing friction force.

Keywords: lubricant, ziptrak, blind, spline

Procedia PDF Downloads 56
4507 Explicit Numerical Approximations for a Pricing Weather Derivatives Model

Authors: Clarinda V. Nhangumbe, Ercília Sousa

Abstract:

Weather derivatives are financial instruments used to cover non-catastrophic weather events and can be expressed in the form of standard or plain vanilla products, or structured or exotic products. The underlying asset, in this case, is a weather index, such as temperature, rainfall, humidity, wind, or snowfall. The complexity of the weather derivatives structure exposes the weakness of the Black-Scholes framework. Under the risk-neutral probability measure, the option price of a weather contract can therefore be given as the unique solution of a two-dimensional partial differential equation (parabolic in one direction and hyperbolic in the other), with an initial condition and subject to adequate boundary conditions. To calculate the price of the option, one can use numerical methods such as Monte Carlo simulations and implicit finite difference schemes combined with semi-Lagrangian methods. This paper proposes two explicit methods, namely, first-order upwind in the hyperbolic direction combined with Lax-Wendroff in the parabolic direction, and first-order upwind in the hyperbolic direction combined with second-order upwind in the parabolic direction. One of the advantages of these methods is the fact that they take into consideration the boundary conditions obtained from the financial interpretation and deal efficiently with the different choices of the convection coefficients.
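
To illustrate the kind of explicit update involved, the sketch below advances one time step of a generic two-dimensional equation V_t + a V_x = (σ²/2) V_yy, with first-order upwind differencing in the hyperbolic x-direction and central differencing in the parabolic y-direction; the paper's actual weather-index dynamics, coefficients, boundary conditions, and its Lax-Wendroff variant are not reproduced here.

```python
# One explicit time step for V_t + a*V_x = 0.5*sigma^2*V_yy on a uniform grid:
# first-order upwind in the hyperbolic x-direction, central differences in the
# parabolic y-direction. Coefficients and grid sizes are illustrative only.
import numpy as np

def explicit_step(V, a, sigma, dt, dx, dy):
    Vn = V.copy()
    # Upwind difference in x (interior points), direction chosen by the sign of a.
    if a >= 0:
        dVdx = (V[1:-1, 1:-1] - V[:-2, 1:-1]) / dx
    else:
        dVdx = (V[2:, 1:-1] - V[1:-1, 1:-1]) / dx
    # Central second difference in y (interior points).
    d2Vdy2 = (V[1:-1, 2:] - 2 * V[1:-1, 1:-1] + V[1:-1, :-2]) / dy ** 2
    Vn[1:-1, 1:-1] = V[1:-1, 1:-1] + dt * (-a * dVdx + 0.5 * sigma ** 2 * d2Vdy2)
    return Vn   # boundary values kept fixed here; the paper uses financially motivated BCs

# Tiny demonstration with an arbitrary initial condition.
x, y = np.linspace(0, 1, 21), np.linspace(0, 1, 21)
V = np.exp(-50 * ((x[:, None] - 0.5) ** 2 + (y[None, :] - 0.5) ** 2))
V = explicit_step(V, a=1.0, sigma=0.2, dt=1e-4, dx=x[1] - x[0], dy=y[1] - y[0])
print(V.shape, float(V.max()))
```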

Keywords: incomplete markets, numerical methods, partial differential equations, stochastic process, weather derivatives

Procedia PDF Downloads 64
4506 A System Dynamics Model for Assessment of Alternative Energy Policy Measures: A Case of Energy Management System as an Energy Efficiency Policy Tool

Authors: Andra Blumberga, Uldis Bariss, Anna Kubule, Dagnija Blumberga

Abstract:

The European Union Energy Efficiency Directive sets binding energy efficiency targets for the member states. Each member state can use either an energy efficiency obligation scheme, alternative policy measures, or a combination of both. The Latvian government has decided to divide the savings between the obligation scheme (65%) and alternative measures (35%). This decision might lead to a significant energy tariff increase and hence impact the national economy. To assess the impact of alternative policy measures, focusing on an energy management scheme based on ISO 50001 and its ability to decrease the share of the obligation scheme, System Dynamics modeling was used. Simulation results show that the energy efficiency goal can be met with an alternative policy measure targeting large energy consumers in the industrial, tertiary, and public sectors by applying an energy tax exemption for implementers of energy management systems. A delay in applying alternative policy measures plays a very important role in reaching the energy efficiency goal: delaying the implementation of this policy measure by one year, from 2016 to 2017, reduces cumulative energy savings by 2020 from 5200 GWh to 3000 GWh.
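
A minimal stock-and-flow sketch of the delay effect is given below, assuming a constant annual savings rate once the energy management measure starts; the actual System Dynamics model structure and the Latvian parameter values are not reproduced.

```python
# Toy stock-and-flow illustration: cumulative energy savings as a stock filled by
# an annual savings flow that starts only after the policy is implemented.
# The savings rate is a made-up constant, not the model's calibrated value.
def cumulative_savings(start_year, end_year=2020, annual_savings_gwh=1000.0):
    stock = 0.0
    for year in range(2016, end_year + 1):
        inflow = annual_savings_gwh if year >= start_year else 0.0
        stock += inflow                       # Euler integration with a 1-year step
    return stock

print("start 2016:", cumulative_savings(2016), "GWh")   # earlier start accumulates more savings
print("start 2017:", cumulative_savings(2017), "GWh")
```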

Keywords: system dynamics, energy efficiency, policy measure, energy management system, obligation scheme

Procedia PDF Downloads 252
4505 Variations in Spatial Learning and Memory across Natural Populations of Zebrafish, Danio rerio

Authors: Tamal Roy, Anuradha Bhat

Abstract:

Cognitive abilities aid fishes in foraging, avoiding predators, and locating mates. Factors like predation pressure and habitat complexity govern learning and memory in fishes. This study aims to compare spatial learning and memory across four natural populations of zebrafish. Zebrafish, a small cyprinid, inhabits a diverse range of freshwater habitats, and this makes it amenable to studies investigating the role of the native environment in spatial cognitive abilities. Four populations were collected across India from waterbodies with contrasting ecological conditions. Habitat complexity of the waterbodies was evaluated as a combination of channel substrate diversity and diversity of vegetation. Experiments were conducted on the populations under controlled laboratory conditions. A square-shaped spatial testing arena (maze) was constructed for testing the performance of adult zebrafish. The square tank consisted of an inner square-shaped layer with the edges connected to the diagonal ends of the tank walls, thereby forming four separate chambers. Each of the four chambers had a main door in the centre. Each chamber had three sections separated by two windows. A removable coloured window-pane (red, yellow, green or blue) identified each main door. A food reward associated with an artificial plant was always placed inside the left-hand section of the red-door chamber. The position of the food reward and plant within the red-door chamber was fixed. A test fish would have to explore the maze by taking turns and locate the food inside the right-side section of the red-door chamber. Fishes were sorted from each population stock and kept individually in separate containers for identification. At a time, a test fish was released into the arena and allowed 20 minutes to explore in order to find the food reward. In this way, individual fishes were trained through the maze to locate the food reward for eight consecutive days. The position of the red door, with the plant and the reward, was shuffled every day. Following training, an intermission of four days was given, during which the fishes were not subjected to trials. Post-intermission, the fishes were re-tested on the 13th day, following the same protocol, for their ability to remember the learnt task. Exploratory tendencies and latency of individuals to explore on the 1st day of training, performance time across trials, and the number of mistakes made each day were recorded. Additionally, the mechanism used by individuals to solve the maze each day was analyzed across populations. Fishes could be expected to use an algorithm (sequence of turns) or associative cues in locating the food reward. Individuals of the populations did not differ significantly in latencies and tendencies to explore. No relationship was found between exploration and learning across populations. High habitat-complexity populations had higher rates of learning and stronger memory, while low habitat-complexity populations had lower rates of learning and much reduced abilities to remember. High habitat-complexity populations used associative cues more than the algorithm for learning and remembering, while low habitat-complexity populations used both equally. The study, therefore, helped understand the role of natural ecology in explaining variations in spatial learning abilities across populations.

Keywords: algorithm, associative cue, habitat complexity, population, spatial learning

Procedia PDF Downloads 266
4504 Quick Similarity Measurement of Binary Images via Probabilistic Pixel Mapping

Authors: Adnan A. Y. Mustafa

Abstract:

In this paper we present a quick technique to measure the similarity between binary images. The technique is based on a probabilistic mapping approach and is fast because only a minute percentage of the image pixels need to be compared to measure the similarity, and not the whole image. We exploit the power of the Probabilistic Matching Model for Binary Images (PMMBI) to arrive at an estimate of the similarity. We show that the estimate is a good approximation of the actual value, and the quality of the estimate can be improved further with increased image mappings. Furthermore, the technique is image size invariant; the similarity between big images can be measured as fast as that for small images. Examples of trials conducted on real images are presented.
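
The core idea of comparing only a small random subset of pixels can be sketched as follows; this is a simplified illustration of the sampling principle, not the PMMBI model itself, and the sampling fraction is an arbitrary choice.

```python
# Estimate similarity between two binary images by comparing only a random
# sample of pixel positions (simplified illustration of probabilistic mapping).
import numpy as np

def sampled_similarity(img_a, img_b, fraction=0.01, seed=0):
    assert img_a.shape == img_b.shape
    rng = np.random.default_rng(seed)
    n = img_a.size
    idx = rng.choice(n, size=max(1, int(fraction * n)), replace=False)
    return float(np.mean(img_a.ravel()[idx] == img_b.ravel()[idx]))

rng = np.random.default_rng(1)
A = rng.integers(0, 2, size=(1000, 1000), dtype=np.uint8)
B = A.copy()
flip = rng.choice(A.size, size=A.size // 10, replace=False)   # corrupt 10% of pixels
B.ravel()[flip] ^= 1

print("estimated similarity:", sampled_similarity(A, B))       # close to 0.9
print("exact similarity:    ", float(np.mean(A == B)))
```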

Keywords: big images, binary images, image matching, image similarity

Procedia PDF Downloads 163
4503 Accountability of Artificial Intelligence: An Analysis Using Edgar Morin’s Complex Thought

Authors: Sylvie Michel, Sylvie Gerbaix, Marc Bidan

Abstract:

Can artificial intelligence (AI) be held accountable for its detrimental impacts? This question gains heightened relevance given AI's pervasive reach across various domains, magnifying its power and potential. The expanding influence of AI raises fundamental ethical inquiries, primarily centering on biases, responsibility, and transparency. This encompasses discriminatory biases arising from algorithmic criteria or data, accidents attributed to autonomous vehicles or other systems, and the imperative of transparent decision-making. This article aims to stimulate reflection on AI accountability, denoting the necessity to elucidate the effects AI generates. Accountability comprises two integral aspects: adherence to legal and ethical standards and the imperative to elucidate the underlying operational rationale. The objective is to initiate a reflection on the obstacles to this "accountability" in the face of the complexity of artificial intelligence systems and their effects. The article then proposes to mobilize Edgar Morin's complex thought to encompass and face the challenges of this complexity. The first contribution is to point out the challenges posed by the complexity of AI, with accountability fragmented among a myriad of human and non-human actors, such as software and equipment, which ultimately contribute to the decisions taken and are multiplied in the case of AI. Accountability faces three challenges resulting from the complexity of the ethical issues combined with the complexity of AI. The challenge of the non-neutrality of algorithmic systems, as ethically non-neutral actors, is put forward by a revealing ethics approach that calls for assigning responsibilities to these systems. The challenge of the dilution of responsibility arises from the multiplicity of, and the distance between, the actors: decision-making is split between developers, who feel they fulfill their duty by strictly respecting the requests they receive, and management, which does not consider itself responsible for technology-related flaws. Accountability is also confronted with the challenge of the transparency of complex and scalable algorithmic systems, non-human actors that self-learn from big data. A second contribution involves leveraging E. Morin's principles, providing a framework to grasp the multifaceted ethical dilemmas and subsequently paving the way for establishing accountability in AI. When addressing the ethical challenge of biases, the "hologrammatic" principle underscores the imperative of acknowledging the non-ethical neutrality of algorithmic systems, which are inherently imbued with the values and biases of their creators and of society. The "dialogic" principle advocates for the responsible consideration of ethical dilemmas, encouraging the integration of complementary and contradictory elements in solutions from the very inception of the design phase. The principle of organizational recursiveness, akin to the "transparency" of the system, promotes a systemic analysis to account for the induced effects and guides the reintroduction of modifications into the system to rectify its drifts. In conclusion, this contribution serves as a starting point for contemplating the accountability of artificial intelligence systems despite the evident ethical implications and potential deviations. Edgar Morin's principles, providing a lens to contemplate this complexity, offer valuable perspectives to address these challenges concerning accountability.

Keywords: accountability, artificial intelligence, complexity, ethics, explainability, transparency, Edgar Morin

Procedia PDF Downloads 33
4502 From Mobility to Complexity: French Language Use among Algerian Doctoral Postgraduates in Scotland

Authors: Hadjer Chellia

Abstract:

The study explores the phenomenon of second language use in a migratory setting, using the case of Algerian international students in Scotland, United Kingdom. The linguistic history of Algeria reveals that the French language has a high status in Algerians' verbal repertoires, and Algerian English students consider it a language of prestige. With the mobility of some of these students towards Scotland, in the guise of internationalization of higher education and mobility and exchange programs, the transition was deemed to bring more complexity to their pre-migratory linguistic repertoires and resulted in their French language being endangered and threatened by a potential shift to English. The study employed semi-structured interviews among six ethnically related Ph.D. students, and the main aim was to explore their current experiences with regard to French language use and to provide an account of the factors which assist in shifting to English as a second language instead. The six participants identified in interviews were further invited to focus group sessions, based on in-group interaction, to discuss different topics using heritage languages. The latter was adopted as part of the methodology as a means to observe their real linguistic practice and to investigate the link between behaviors and previous perceptions. The findings detect a variety of social, individual, and socio-psychological factors that contribute to refining the concept of language shift among newly established émigré communities with short stays, vis-à-vis the linguistic outcomes of immigrants with long stays across generations, which was, to some extent, the focus of previous research on language shift. The results further reveal a mismatch between students' perceptions and observed behaviors. The research is thus largely relevant to international students' sociolinguistic experience of study abroad.

Keywords: complexity, mobility, potential shift, sociolinguistic experience

Procedia PDF Downloads 133
4501 Performance and Limitations of Likelihood Based Information Criteria and Leave-One-Out Cross-Validation Approximation Methods

Authors: M. A. C. S. Sampath Fernando, James M. Curran, Renate Meyer

Abstract:

Model assessment, in the Bayesian context, involves evaluation of the goodness-of-fit and the comparison of several alternative candidate models for predictive accuracy and improvements. In posterior predictive checks, the data simulated under the fitted model are compared with the actual data. Predictive model accuracy is estimated using information criteria such as the Akaike information criterion (AIC), the Bayesian information criterion (BIC), the Deviance information criterion (DIC), and the Watanabe-Akaike information criterion (WAIC). The goal of an information criterion is to obtain an unbiased measure of out-of-sample prediction error. Since posterior checks use the data twice, once for model estimation and once for testing, a bias correction which penalises model complexity is incorporated in these criteria. Cross-validation (CV) is another method used for examining out-of-sample prediction accuracy. Leave-one-out cross-validation (LOO-CV) is the most computationally expensive variant among the CV methods, as it fits as many models as the number of observations. Importance sampling (IS), truncated importance sampling (TIS), and Pareto-smoothed importance sampling (PSIS) are generally used as approximations to the exact LOO-CV and utilise the existing MCMC results, avoiding expensive recomputation. The reciprocals of the predictive densities calculated over posterior draws for each observation are treated as the raw importance weights. These are in turn used to calculate the approximate LOO-CV of the observation as a weighted average of posterior densities. In IS-LOO, the raw weights are directly used. In contrast, the larger weights are replaced by their truncated counterparts in calculating TIS-LOO and PSIS-LOO. Although information criteria and LOO-CV are unable to reflect the goodness-of-fit in an absolute sense, the differences can be used to measure the relative performance of the models of interest. However, the use of these measures is only valid under specific circumstances. This study has developed 11 models using normal, log-normal, gamma, and Student's t distributions to improve PCR stutter prediction with forensic data. These comprise four models with profile-wide variances, four with locus-specific variances, and three two-component mixture models. The mean stutter ratio in each model is modeled as a locus-specific simple linear regression against a feature of the alleles under study known as the longest uninterrupted sequence (LUS). The use of AIC, BIC, DIC, and WAIC in model comparison has some practical limitations. Even though IS-LOO, TIS-LOO, and PSIS-LOO are considered to be approximations of the exact LOO-CV, the study observed some drastic deviations in the results. However, there are some interesting relationships among the logarithms of pointwise predictive densities (lppd) calculated under WAIC and the LOO approximation methods. The estimated overall lppd is a relative measure that reflects the overall goodness-of-fit of the model. Parallel log-likelihood profiles for the models, conditional on equal posterior variances in lppds, were observed. This study illustrates the limitations of the information criteria in practical model comparison problems. In addition, the relationships among the LOO-CV approximation methods and WAIC, with their limitations, are discussed. Finally, useful recommendations that may help in practical model comparisons with these methods are provided.
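
For concreteness, the sketch below computes the lppd, WAIC, and the IS-LOO and TIS-LOO approximations from a matrix of pointwise log-likelihoods evaluated over posterior draws; the stutter-ratio models themselves are not reproduced, the log-likelihood matrix is random, and the Pareto-smoothing step of PSIS is omitted.

```python
# WAIC and importance-sampling LOO approximations from an S x n matrix of
# pointwise log-likelihoods (S posterior draws, n observations). The example
# log-likelihood matrix is random and purely illustrative.
import numpy as np
from scipy.special import logsumexp

def waic(loglik):
    S = loglik.shape[0]
    lppd = np.sum(logsumexp(loglik, axis=0) - np.log(S))     # log pointwise predictive density
    p_waic = np.sum(np.var(loglik, axis=0, ddof=1))          # effective number of parameters
    return lppd - p_waic

def is_loo(loglik, truncate=False):
    S = loglik.shape[0]
    logw = -loglik                                            # raw weights: reciprocal densities
    if truncate:                                              # TIS-LOO: cap the largest weights
        cap = logsumexp(logw, axis=0) - np.log(S) + 0.5 * np.log(S)
        logw = np.minimum(logw, cap)
    # Weighted average of predictive densities for each held-out observation.
    return np.sum(logsumexp(loglik + logw, axis=0) - logsumexp(logw, axis=0))

loglik = np.random.default_rng(0).normal(-1.0, 0.3, size=(4000, 50))
print("WAIC elpd:   ", round(waic(loglik), 2))
print("IS-LOO elpd: ", round(is_loo(loglik), 2))
print("TIS-LOO elpd:", round(is_loo(loglik, truncate=True), 2))
```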

Keywords: cross-validation, importance sampling, information criteria, predictive accuracy

Procedia PDF Downloads 364
4500 Complex Decision Rules in the Form of Decision Trees

Authors: Avinash S. Jagtap, Sharad D. Gore, Rajendra G. Gurao

Abstract:

Decision rules become more and more complex as the number of conditions increases. As a consequence, the complexity of the decision rule also influences the time complexity of a computer implementation of such a rule. Consider, for example, a decision that depends on four conditions A, B, C and D. For simplicity, suppose each of these four conditions is binary. Even then the decision rule will consist of 16 lines, where each line will be of the form: if A and B and C and D, then action 1; if A and B and C but not D, then action 2; and so on. While executing this decision rule, each of the four conditions will be checked every time until all the four conditions in a line are satisfied. The minimum number of logical comparisons is 4, whereas the maximum number is 64. This paper proposes to present a complex decision rule in the form of a decision tree. A decision tree divides the cases into branches every time a condition is checked, and every branching eliminates the half of the cases that do not satisfy the related condition. As a result, every branch of the decision tree involves only four logical comparisons and hence is significantly simpler than the corresponding complex decision rule. The conclusion of this paper is that every complex decision rule can be represented as a decision tree, and the decision tree is mathematically equivalent but computationally much simpler than the original complex decision rule.
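
The contrast can be made concrete in code: a flat rule table may re-check all four conditions on every line, whereas the nested tree checks each condition exactly once along any path; the action labels are placeholders.

```python
# Flat rule table vs. nested decision tree for four binary conditions A, B, C, D.
# Actions are placeholder strings; the point is the number of comparisons made.

def flat_rule(A, B, C, D):
    # One line per combination: in the worst case all 16 lines x 4 conditions are checked.
    if A and B and C and D:          return "action 1"
    if A and B and C and not D:      return "action 2"
    if A and B and not C and D:      return "action 3"
    # ... 13 further lines for the remaining combinations ...
    return "action 16"

def tree_rule(A, B, C, D):
    # Exactly four comparisons on every path; 16 leaves in total.
    if A:
        if B:
            if C:  return "action 1"  if D else "action 2"
            else:  return "action 3"  if D else "action 4"
        else:
            if C:  return "action 5"  if D else "action 6"
            else:  return "action 7"  if D else "action 8"
    else:
        if B:
            if C:  return "action 9"  if D else "action 10"
            else:  return "action 11" if D else "action 12"
        else:
            if C:  return "action 13" if D else "action 14"
            else:  return "action 15" if D else "action 16"

print(flat_rule(True, True, True, False))   # "action 2" after two rule lines (8 comparisons)
print(tree_rule(True, True, True, False))   # "action 2" after exactly 4 comparisons
```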

Keywords: strategic, tactical, operational, adaptive, innovative

Procedia PDF Downloads 249
4499 Low Density Parity Check Codes

Authors: Kassoul Ilyes

Abstract:

The field of error-correcting codes has been revolutionized by the introduction of iteratively decoded codes. Among these, LDPC codes are now a preferred solution thanks to their remarkable performance and low complexity. The binary version of LDPC codes showed even better performance, although its decoding introduced greater complexity. This thesis studies the performance of binary LDPC codes using simplified weighted decisions. Information is transported between a transmitter and a receiver by digital transmission systems, either by propagating over a radio channel or by using a transmission medium such as a transmission line. The purpose of the transmission system is to carry the information from the transmitter to the receiver as reliably as possible. These codes did not initially generate enough interest within the coding theory community; this neglect lasted until the introduction of Turbo codes and the iterative principle. It was then proposed to adopt Pearl's belief propagation (BP) algorithm for decoding these codes. Subsequently, Luby introduced irregular LDPC codes, characterized by an irregular parity-check matrix. Finally, we study simplifications of binary LDPC codes. Thus, we propose a method to make the exact calculation of the a posteriori probability (APP) simpler. This method leads to a simpler implementation of the system.
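
As a minimal illustration of how a parity-check matrix characterizes the code, the sketch below checks the syndrome of a received word for a toy matrix H; the thesis's BP decoder and its simplified APP computation are not reproduced, and H here is far smaller and denser than a practical LDPC matrix.

```python
# Toy parity-check matrix H and syndrome check over GF(2); a word c is a valid
# codeword iff H @ c = 0 (mod 2).
import numpy as np

H = np.array([[1, 1, 0, 1, 0, 0],
              [0, 1, 1, 0, 1, 0],
              [1, 0, 1, 0, 0, 1]], dtype=np.uint8)

def syndrome(word):
    return H @ np.asarray(word, dtype=np.uint8) % 2

codeword = np.array([1, 0, 1, 1, 1, 0], dtype=np.uint8)   # satisfies all three checks
received = codeword.copy()
received[2] ^= 1                                           # flip one bit (channel error)

print("codeword syndrome:", syndrome(codeword))    # all zeros -> valid codeword
print("received syndrome:", syndrome(received))    # non-zero  -> error detected
```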

Keywords: LDPC, parity check matrix, 5G, BER, SNR

Procedia PDF Downloads 128
4498 Using the Technology Acceptance Model to Examine Seniors’ Attitudes toward Facebook

Authors: Chien-Jen Liu, Shu Ching Yang

Abstract:

Using the technology acceptance model (TAM), this study examined the external variable of technological complexity (TC) to acquire a better understanding of the factors that influence the acceptance of computer application courses by learners at Active Aging Universities. After the learners in this study had completed a 27-hour Facebook course, 44 learners responded to a modified TAM survey. Data were collected to examine the path relationships among the variables that influence the acceptance of Facebook-mediated community learning. The partial least squares (PLS) method was used to test the measurement and the structural model. The study results demonstrated that attitudes toward Facebook use directly influence behavioral intentions (BI) with respect to Facebook use, with a high prediction rate of 58.3%. In addition to the perceived usefulness (PU) and perceived ease of use (PEOU) measures proposed in the TAM, other external variables, such as TC, also indirectly influence BI. These four variables can explain 88% of the variance in BI and demonstrate a high level of predictive ability. Finally, limitations of this investigation and implications for further research are discussed.

Keywords: technology acceptance model (TAM), technological complexity, partial least squares (PLS), perceived usefulness

Procedia PDF Downloads 315
4497 Fuzzy Total Factor Productivity by Credibility Theory

Authors: Shivi Agarwal, Trilok Mathur

Abstract:

This paper proposes a method to measure total factor productivity (TFP) change by credibility theory for fuzzy input and output variables. Total factor productivity change has been widely studied with crisp input and output variables; however, in some cases, the input and output data of decision-making units (DMUs) can only be measured with uncertainty. These data can be represented as linguistic variables characterized by fuzzy numbers. The Malmquist productivity index (MPI) is widely used to estimate TFP change by calculating the total factor productivity of a DMU for different time periods using data envelopment analysis (DEA). The fuzzy DEA (FDEA) model is solved using credibility theory. The results of FDEA are used to measure the TFP change for fuzzy input and output variables. Finally, numerical examples are presented to illustrate the proposed method. The suggested methodology can be utilized for performance evaluation of DMUs and helps to assess their level of integration. The methodology can also be applied to rank the DMUs, to identify the DMUs that are lagging behind, and to make recommendations as to how they can improve their performance to bring them on par with other DMUs.
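
For reference, the standard (crisp) Malmquist productivity index between periods t and t+1 is shown below; in the fuzzy extension, the distance functions D are obtained from the credibility-based FDEA models.

```latex
% Standard Malmquist productivity index (crisp form); the fuzzy version replaces
% the distance functions with credibility-based FDEA scores.
\[
MPI_{t}^{t+1}
  = \left[
      \frac{D_{t}\!\left(x_{t+1},\, y_{t+1}\right)}{D_{t}\!\left(x_{t},\, y_{t}\right)}
      \cdot
      \frac{D_{t+1}\!\left(x_{t+1},\, y_{t+1}\right)}{D_{t+1}\!\left(x_{t},\, y_{t}\right)}
    \right]^{1/2},
\]
% where D_s(x, y) is the distance function of a DMU with inputs x and outputs y
% evaluated against the period-s frontier; MPI > 1 indicates TFP growth.
```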

Keywords: chance-constrained programming, credibility theory, data envelopment analysis, fuzzy data, Malmquist productivity index

Procedia PDF Downloads 326
4496 Optimizing the Capacity of a Convolutional Neural Network for Image Segmentation and Pattern Recognition

Authors: Yalong Jiang, Zheru Chi

Abstract:

In this paper, we study the factors which determine the capacity of a Convolutional Neural Network (CNN) model and propose ways to evaluate and adjust the capacity of a CNN model so that it best matches a specific pattern recognition task. Firstly, a scheme is proposed to adjust the number of independent functional units within a CNN model to make it better fitted to a task. Secondly, the number of independent functional units in the capsule network is adjusted to fit it to the training dataset. Thirdly, a method based on Bayesian GAN is proposed to enrich the variances in the current dataset to increase its complexity. Experimental results on the PASCAL VOC 2010 Person Part dataset and the MNIST dataset show that, in both conventional CNN models and capsule networks, the number of independent functional units is an important factor that determines the capacity of a network model. By adjusting the number of functional units, the capacity of a model can better match the complexity of a dataset.
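
A minimal sketch of one way to expose the number of functional units as a tunable quantity is given below, assuming it corresponds to the number of convolutional filters per layer scaled by a width multiplier; the paper's actual adjustment scheme, the capsule-network variant, and the Bayesian GAN augmentation are not reproduced.

```python
# Illustrative capacity adjustment: scale the number of filters per layer by a
# width multiplier and report the resulting parameter count. The base widths and
# the interpretation as "functional units" are assumptions for illustration.
def conv_params(c_in, c_out, k=3):
    return c_in * c_out * k * k + c_out          # weights + biases of one conv layer

def model_capacity(width_multiplier, base_widths=(32, 64, 128), in_channels=3):
    widths = [max(1, int(round(w * width_multiplier))) for w in base_widths]
    total, c_in = 0, in_channels
    for c_out in widths:
        total += conv_params(c_in, c_out)
        c_in = c_out
    return total

for m in (0.5, 1.0, 2.0):
    print(f"width multiplier {m}: {model_capacity(m):,} parameters")
```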

Keywords: CNN, convolutional neural network, capsule network, capacity optimization, character recognition, data augmentation, semantic segmentation

Procedia PDF Downloads 120
4495 Nature as a Human Health Asset: An Extensive Review

Authors: C. Sancho Salvatierra, J. M. Martinez Nieto, R. García Gonzalez-Gordon, M. I. Martinez Bellido

Abstract:

Introduction: Nature could act as an asset for human health, protecting against possible diseases and promoting both physical and mental health. Goals: This paper aims to determine which natural elements show evidence of a positive influence on human health, on which particular aspects, and how. It also aims to determine the best biomarkers to measure such influence. Method: A systematic literature review was carried out. First, a general free-text search was performed in databases such as Scopus, PubMed, and PsycINFO. Secondly, a specific search was performed combining keywords in order of increasing complexity. The snowballing technique was also used, and the databases of the CSIC (the Spanish National Research Council) were consulted. Of the 130 articles obtained and reviewed, 80 referred to natural elements that influenced health. These 80 articles were classified and tabulated according to the natural elements found, the health aspects studied, the health measurement parameters used, and the measurement techniques used. In this classification, the results of the studies were coded according to whether they were positive, negative, or neutral, both for the elements of nature and for the aspects of health studied. Finally, the results of the 80 selected studies were summarized and categorized according to the elements of nature that showed the greatest positive influence on health and the biomarkers that had shown greater reliability in measuring said influence. Results: Of the 80 articles studied, 24 (30.0%) were reviews and 56 (70.0%) were original research articles. Among the 24 reviews, 18 (75%) found positive effects of natural elements on health, and 6 (25%) found both positive and negative effects. Of the 56 original articles, 47 (83.9%) showed positive results, 3 (5.4%) both positive and negative, 4 (7.1%) negative effects, and 2 (3.6%) found no effects. The results reflect positive effects of different elements of nature on the following pathologies: diabetes, high blood pressure, stress, attention deficit hyperactivity disorder, and psychotic, anxiety, and affective disorders. They also show positive effects on the following areas: immune system, social interaction, recovery after illness, mood, decreased aggressiveness, concentrated attention, cognitive performance, restful sleep, vitality, and sense of well-being. Among the elements of nature studied, those that show the greatest positive influence on health are forest immersion, natural views, daylight, outdoor physical activity, active transport, vegetation biodiversity, natural sounds, and green residences. The biomarkers that show greater reliability in measuring the effects of natural elements are the levels of cortisol (both in blood and saliva), vitamin D, serotonin, and melatonin, blood pressure, heart rate, muscle tension, and skin conductance. Conclusions: Nature is an asset for health, well-being, and quality of life. Awareness, education, and health promotion programs are needed based on the elements that nature provides, which in turn generate proactive attitudes in the population towards the protection and conservation of nature. Studies related to this subject in Spain are very scarce. Acknowledgements: This study has been promoted and partially financed by the Environmental Foundation Jaime González-Gordon.

Keywords: health, green areas, nature, well-being

Procedia PDF Downloads 240
4494 Measurements of Service Quality vs Customer Satisfaction in Government Owned Retail Store at Kochi

Authors: N. S. Ajisha

Abstract:

In today's competitive world, the quality of the service you deliver is one of the important factors that determine customer satisfaction. Service quality is considered an important determinant for evaluating customer satisfaction, and the relationship between service quality and customer satisfaction is considered the foundation of research on customer satisfaction. This research performs a gap analysis between the perception and the expectation of the services delivered and examines the relation between service quality and customer satisfaction. Service quality is measured here using the SERVQUAL model, and the study identifies which dimension of service quality is most important for customer satisfaction. The dimensions measured using SERVQUAL include tangibles, reliability, responsiveness, assurance, and empathy. This study involves primary data collection through a market survey.
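
The gap analysis itself reduces to subtracting expectation scores from perception scores on each SERVQUAL dimension; the sketch below shows this with invented survey means.

```python
# SERVQUAL gap scores: perception minus expectation per dimension; negative gaps
# indicate service falling short of expectations. Scores here are hypothetical.
expectation = {"tangibles": 4.5, "reliability": 4.8, "responsiveness": 4.6,
               "assurance": 4.4, "empathy": 4.3}
perception  = {"tangibles": 4.1, "reliability": 4.0, "responsiveness": 3.8,
               "assurance": 4.2, "empathy": 4.0}

gaps = {dim: perception[dim] - expectation[dim] for dim in expectation}
for dim, gap in sorted(gaps.items(), key=lambda kv: kv[1]):
    print(f"{dim:>15}: gap = {gap:+.1f}")
print("overall service quality gap:", round(sum(gaps.values()) / len(gaps), 2))
```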

Keywords: customer satisfaction, service quality, retail service quality, Kochi

Procedia PDF Downloads 513
4493 Assessment of Ultra-High Cycle Fatigue Behavior of EN-GJL-250 Cast Iron Using Ultrasonic Fatigue Testing Machine

Authors: Saeedeh Bakhtiari, Johannes Depessemier, Stijn Hertelé, Wim De Waele

Abstract:

High cycle fatigue, comprising up to 10⁷ load cycles, has been the subject of many studies, and the behavior of many materials has been recorded adequately in this regime. However, many applications involve larger numbers of load cycles during the lifetime of machine components. In this ultra-high cycle regime, other failure mechanisms come into play, and the concept of a fatigue endurance limit (assumed for materials such as steel) is often an oversimplification of reality. When machine component design demands a high geometrical complexity, cast iron grades become interesting candidate materials. Grey cast iron is known for its low cost, high compressive strength, and good damping properties. However, the ultra-high cycle fatigue behavior of cast iron is poorly documented. The current work focuses on the ultra-high cycle fatigue behavior of EN-GJL-250 (GG25) grey cast iron by developing an ultrasonic (20 kHz) fatigue testing system. Moreover, the testing machine is instrumented to measure the temperature and the displacement of the specimen, and to control the temperature. The high resonance frequency allows the behavior of the cast iron of interest to be assessed within a matter of days for ultra-high numbers of cycles, and the tests can be repeated to quantify the natural scatter in fatigue resistance.

Keywords: GG25, cast iron, ultra-high cycle fatigue, ultrasonic test

Procedia PDF Downloads 133
4492 The Biosphere as a Supercomputer Directing and Controlling Evolutionary Processes

Authors: Igor A. Krichtafovitch

Abstract:

Evolutionary processes are not linear. Long periods of quiet and slow development turn into rather rapid emergences of new species and even phyla. During the Cambrian explosion, 22 new phyla were added to the 3 previously existing phyla. Contrary to common belief, natural selection, or survival of the fittest, cannot account for the dominant evolutionary vector, which is the steady and accelerating advent of more complex and more intelligent living organisms. Neither Darwinism nor alternative concepts, including panspermia and intelligent design, propose a satisfactory solution for these phenomena. The proposed hypothesis offers a logical and plausible explanation of evolutionary processes in general. It is based on two postulates: a) the Biosphere is a single living organism, all parts of which are interconnected, and b) the Biosphere acts as a giant biological supercomputer, storing and processing information in digital and analog forms. Such a supercomputer surpasses all human-made computers by many orders of magnitude. Living organisms are the product of the intelligent creative action of the biosphere supercomputer. Biological evolution is driven by the growing amount of information stored in living organisms and the increasing complexity of the biosphere as a single organism. The main evolutionary vector is not the survival of the fittest but an accelerated growth of the computational complexity of living organisms. The following postulates summarize the proposed hypothesis: biological evolution as a natural life origin and development is a reality. Evolution is a coordinated and controlled process. One of evolution's main development vectors is a growing computational complexity of living organisms and of the biosphere's intelligence. The intelligent matter which conducts and controls global evolution is a gigantic bio-computer combining all living organisms on Earth. The information acts like software stored in and controlled by the biosphere. Random mutations trigger this software, as is stipulated by Darwinian evolution theories, and it is further stimulated by the growing demand for the Biosphere's global memory storage and computational complexity. A greater memory volume requires a greater number of more intellectually advanced organisms for storing and handling it. More intricate organisms require greater computational complexity of the biosphere in order to keep control over the living world. This is an endless recursive endeavor with accelerated evolutionary dynamics. New species emerge when two conditions are met: a) crucial environmental changes occur and/or the global memory storage volume reaches its limit, and b) the biosphere's computational complexity reaches the critical mass capable of producing more advanced creatures. The hypothesis presented here is a naturalistic concept of life creation and evolution. The hypothesis logically resolves many puzzling problems of current evolution theory, such as speciation (as a result of GM purposeful design), the evolutionary development vector (as a need for growing global intelligence), punctuated equilibrium (happening when the two above conditions a) and b) are met), the Cambrian explosion, and mass extinctions (happening when more intelligent species replace outdated creatures).

Keywords: supercomputer, biological evolution, Darwinism, speciation

Procedia PDF Downloads 128
4491 Decision Making Approach through Generalized Fuzzy Entropy Measure

Authors: H. D. Arora, Anjali Dhiman

Abstract:

Uncertainty is found everywhere, and its understanding is central to decision making. Uncertainty emerges when one has less information than the total information required to describe a system and its environment. Uncertainty and information are so closely associated that the information provided by an experiment, for example, is equal to the amount of uncertainty removed. It may be pertinent to point out that uncertainty manifests itself in several forms, and various kinds of uncertainties may arise from random fluctuations, incomplete information, imprecise perception, vagueness, etc. For instance, one encounters uncertainty due to vagueness in communication through natural language. Uncertainty in this sense is represented by fuzziness resulting from the imprecision of the meaning of a concept expressed by linguistic terms. The fuzzy set concept provides an appropriate mathematical framework for dealing with such vagueness. Both information theory, proposed by Shannon (1948), and fuzzy set theory, given by Zadeh (1965), play an important role in human intelligence and various practical problems such as image segmentation and medical diagnosis. Numerous approaches and theories dealing with inaccuracy and uncertainty have been proposed by different researchers. In the present communication, we generalize the fuzzy entropy proposed by De Luca and Termini (1972), corresponding to Shannon entropy (1948). Further, some of the basic properties of the proposed measure are examined. We also apply the proposed measure to a real-life decision-making problem.
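
For reference, the De Luca and Termini (1972) fuzzy entropy that the paper generalizes is shown below; it reduces the uncertainty of a fuzzy set A with membership values μ_A(x_i) to a Shannon-like sum. The paper's generalized form is not reproduced here.

```latex
% De Luca and Termini (1972) fuzzy entropy of a fuzzy set A on {x_1, ..., x_n},
% the measure that the proposed generalization extends.
\[
H(A) = -\,k \sum_{i=1}^{n}
  \Bigl[ \mu_{A}(x_i)\,\ln \mu_{A}(x_i)
        + \bigl(1-\mu_{A}(x_i)\bigr)\,\ln\bigl(1-\mu_{A}(x_i)\bigr) \Bigr],
\qquad k > 0,
\]
% which vanishes for crisp sets (membership values 0 or 1) and is maximal when
% every membership value equals 1/2.
```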

Keywords: entropy, fuzzy sets, fuzzy entropy, generalized fuzzy entropy, decision making

Procedia PDF Downloads 413