Search results for: complexity analysis

27704 Five Pitfalls in Defining a Health System and Implications for Research and Management

Authors: Macdonald Kanyangale, Sandram Naluso

Abstract:

Globally, researchers have struggled over time to adequately define the notion of a health system to inform research. This study is significant because it proposes an integrative framework for a robust definition of the health system. The objective of this article is to examine major pitfalls in definitions of the health system used in prior literature and their implications for research and management. The study used the methodological steps of a scoping review proposed by Arksey and O'Malley to identify and examine 24 definitions of a health system in articles selected from six databases and web search engines. Thematic analysis was used to delineate and categorise definitional pitfalls into broader themes. There are five major pitfalls in the extant definitions of a health system which may easily trip up any unsuspecting researcher if not avoided or addressed in research. These definitional pitfalls are reductionist assumptions which ignore dynamic and complex connections, an overly wide boundary and lack of specification of levels in a health system, and a limited focus on process in a health system. In addition, there is the tendency to treat different components of the health system as equal and to simplify the ontological complexity of the health system. Future scholars are advised to avoid or address the identified five major pitfalls if they are to develop robust definitions of a health system. The use of an integrative framework for a robust definition of a health system is recommended, while implications of the pitfalls are discussed as a basis and catalyst for complexity-informed research and interactive management.

Keywords: complexity management, health system, pitfalls, reductionism, research

Procedia PDF Downloads 102
27703 Multiscale Entropy Analysis of Electroencephalogram (EEG) of Alcoholic and Control Subjects

Authors: Lal Hussain, Wajid Aziz, Imtiaz Ahmed Awan, Sharjeel Saeed

Abstract:

Multiscale entropy analysis (MSE) is a useful technique recently developed to quantify the dynamics of physiological signals at different time scales. This study is aimed at investigating electroencephalogram (EEG) signals to analyze the background activity of alcoholic and control subjects by inspecting the coarse-grained sequences formed at different time scales. EEG recordings of alcoholic and control subjects, acquired using 64 electrodes, were taken from the publicly available machine learning repository of the University of California, Irvine (UCI). The MSE analysis was performed on the EEG data acquired from all the electrodes of alcoholic and control subjects. The Mann-Whitney rank test was used to find significant differences between the groups, and results were considered statistically significant for p-values < 0.05. The area under the receiver operating characteristic curve was computed to find the degree of separation between the groups. The mean ranks of MSE values at all time scales for all electrodes were higher for control subjects than for alcoholic subjects. Higher mean ranks represent higher complexity and vice versa. The findings indicated that EEG signals acquired through electrodes C3, C4, F3, F7, F8, O1, O2, P3 and T7 showed significant differences between alcoholic and control subjects at time scales 1 to 5. Moreover, all electrodes exhibited significant differences at some time scales. Likewise, the highest accuracy and separation were obtained at the central region (C3 and C4) and at the front polar regions (P3, O1, F3, F7, F8 and T8), while other electrodes such as Fp1, Fp2, P4 and F4 showed no significant results.
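
A minimal Python sketch of the pipeline described above — coarse-graining, sample entropy, the Mann-Whitney test, and the area under the ROC curve — is given below. The signals are synthetic placeholders, not the UCI EEG recordings, and the parameters (m = 2, r = 0.2, five scales) are common defaults rather than the study's exact settings.

```python
import numpy as np
from scipy.stats import mannwhitneyu
from sklearn.metrics import roc_auc_score

def coarse_grain(x, scale):
    """Average non-overlapping windows of length `scale` (the MSE coarse-graining step)."""
    n = len(x) // scale
    return x[:n * scale].reshape(n, scale).mean(axis=1)

def sample_entropy(x, m=2, r=0.2):
    """Sample entropy SampEn(m, r) with tolerance r relative to the signal's standard deviation."""
    x = np.asarray(x, dtype=float)
    tol = r * x.std()
    def count(mm):
        templates = np.array([x[i:i + mm] for i in range(len(x) - mm)])
        d = np.max(np.abs(templates[:, None] - templates[None, :]), axis=2)
        return (np.sum(d <= tol) - len(templates)) / 2          # matched pairs, self-matches excluded
    b, a = count(m), count(m + 1)
    return -np.log(a / b) if a > 0 and b > 0 else np.inf

def mse(x, max_scale=5):
    return [sample_entropy(coarse_grain(x, s)) for s in range(1, max_scale + 1)]

# Hypothetical comparison of two groups at a single electrode (entropy at scale 3)
rng = np.random.default_rng(0)
alcoholic = [mse(rng.standard_normal(512))[2] for _ in range(10)]
control = [mse(rng.standard_normal(512))[2] for _ in range(10)]
u, p = mannwhitneyu(control, alcoholic)                          # group difference
auc = roc_auc_score([1] * 10 + [0] * 10, control + alcoholic)    # degree of separation
print(p, auc)
```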

Keywords: electroencephalogram (EEG), multiscale sample entropy (MSE), Mann-Whitney test (MMT), Receiver Operator Curve (ROC), complexity analysis

Procedia PDF Downloads 350
27702 Difficulty and Complexity in Dealing with Visual Pollution in Historical Cities: The Historical City of Ibb, Yemen as a Case Study

Authors: Abdulfattah A. Q. Alwah, Wen Li, Mohammed A. Q. Alwah, Duc Thien Tran, Bing Xi Liu

Abstract:

Historical cities in the third world suffer from many environmental problems, one of which is the spread of visual pollution manifestations. These phenomena increase with low levels of public awareness and low per capita income. The historical city of Ibb suffers from a variety of visual pollution of the urban environment, so it has been chosen as a case study. This study aims to identify the difficulty and complexity of dealing with visual pollution manifestations in the historical city of Ibb and to provide appropriate solutions which suit the complex and contradictory circumstances. The study relies on an inductive approach to achieve its aims through two methods: the first is a visual survey of the visual pollution phenomenon based on images and researcher notes; the second is an analysis of the opinions and impressions of the city's residents and visitors through interviews, in addition to interviews with officials in the competent authorities and some specialists in the field of the urban environment. Through the results of the field study and discussion of the interview results, this study presents an analysis of the phenomenon of visual distortion of the historical city of Ibb regarding its manifestations and causes. Furthermore, this study provides appropriate solutions suited to the complex and contradictory circumstances. These solutions take two paths: the first is to stop the spread of visual distortions, and the second is to address the existing visual pollution.

Keywords: visual pollution, visual image, urban environment, difficulty, complexity, historical cities, the historical city of Ibb

Procedia PDF Downloads 111
27701 Tumor Detection of Cerebral MRI by Multifractal Analysis

Authors: S. Oudjemia, F. Alim, S. Seddiki

Abstract:

This paper shows the application of multifractal analysis as an additional aid in cancer diagnosis. Medical image processing is an important discipline in which many existing methods seek solutions to real problems of medicine. In this work, we present results of multifractal analysis of brain MRI images. The purpose of this analysis was to separate healthy from cancerous tissue of the brain. A nonlinear method based on the multifractal detrending moving average (MFDMA), which is a generalization of detrended fluctuation analysis (DFA), is used for the detection of abnormalities in these images. The proposed method successfully separated the two types of brain tissue. It is very important to note that the choice of this nonlinear method is due to the complexity and irregularity of tumor tissue, which linear and classical nonlinear methods seem unable to characterize completely. In order to show the performance of this method, we compared its results with those of the conventional box-counting method.
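
For orientation, the following Python sketch implements plain detrended fluctuation analysis (DFA), the baseline that MFDMA generalizes (MFDMA replaces the local polynomial fit with a moving-average detrending and varies the order of the moments). The 1-D intensity profile is a synthetic placeholder, not a profile extracted from an MRI slice.

```python
import numpy as np

def dfa(signal, scales):
    """Detrended fluctuation analysis: fluctuation F(s) versus window size s; the slope of
    log F against log s is the scaling exponent."""
    y = np.cumsum(signal - np.mean(signal))          # integrated profile
    F = []
    for s in scales:
        n = len(y) // s
        windows = y[:n * s].reshape(n, s)
        t = np.arange(s)
        residuals = []
        for w in windows:
            coef = np.polyfit(t, w, 1)               # local linear trend
            residuals.append(np.mean((w - np.polyval(coef, t)) ** 2))
        F.append(np.sqrt(np.mean(residuals)))
    alpha = np.polyfit(np.log(scales), np.log(F), 1)[0]
    return np.array(F), alpha

rng = np.random.default_rng(0)
intensity_profile = rng.standard_normal(4096)        # placeholder 1-D profile
scales = np.array([16, 32, 64, 128, 256])
F, alpha = dfa(intensity_profile, scales)
print(alpha)                                          # ~0.5 for uncorrelated noise
```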

Keywords: irregularity, nonlinearity, MRI brain images, multifractal analysis, brain tumor

Procedia PDF Downloads 418
27700 Statistical Analysis of Failure Cases in Aerospace

Authors: J. H. Lv, W. Z. Wang, S. W. Liu

Abstract:

The major concern in the aviation industry is flight safety. Although great effort has been put into the development of material and system reliability, fatal accidents still occur. Due to the complexity of the aviation system and the interaction among the failed components, the failure analysis of the related equipment is difficult. This study surveys failure cases in aviation, extracted from failure analysis journals, including Engineering Failure Analysis and Case Studies in Engineering Failure Analysis, in order to identify the failure-sensitive factors and parts. The analytical results show that, among the failure cases, fatigue failure accounts for the largest number of occurrences. The most frequently failed components are the disk, blade, landing gear, bearing, and fastener. The most frequently failed materials are steel, aluminum alloy, superalloy, and titanium alloy. Therefore, in order to assure safety in aviation, more attention should be paid to fatigue failures.

Keywords: aerospace, disk, failure analysis, fatigue

Procedia PDF Downloads 296
27699 Explaining Irregularity in Music by Entropy and Information Content

Authors: Lorena Mihelac, Janez Povh

Abstract:

In 2017, we conducted a research study using data consisting of 160 musical excerpts from different musical styles to analyze the impact of the entropy of the harmony on the acceptability of music. In measuring the entropy of harmony, we were interested in unigrams (individual chords in the harmonic progression) and bigrams (the connection of two adjacent chords). In that study, 53 of the 160 musical excerpts were evaluated by participants as very complex, although the entropy of the harmonic progression (unigrams and bigrams) was calculated as low. We explained this by particularities of the chord progression, which impact the listener's feeling of complexity and acceptability. We evaluated the same data twice more, with new participants in 2018 and with the same participants for the third time in 2019. These three evaluations have shown that the same 53 musical excerpts, found to be difficult and complex in the study conducted in 2017, again elicit a strong feeling of complexity. It was proposed that the content of these musical excerpts, defined as "irregular," does not meet the listener's expectations or the basic perceptual principles, creating a stronger feeling of difficulty and complexity. As the "irregularities" in these 53 musical excerpts seem to be perceived by the participants without their being aware of it, affecting pleasantness and the feeling of complexity, they have been defined as "subliminal irregularities" and the 53 musical excerpts as "irregular." In our recent study (2019) of the same data (used in previous research works), we proposed a new measure of the complexity of harmony, "regularity," based on the irregularities in the harmonic progression and other plausible particularities in the musical structure found in previous studies. In that study, we also proposed a list of 10 different particularities which we assumed impact the participant's perception of complexity in harmony. These ten particularities are tested in this paper by extending the analysis of our 53 irregular musical excerpts from harmony to melody. In examining melody, we used the computational model "Information Dynamics of Music" (IDyOM) and two information-theoretic measures: entropy, the uncertainty of the prediction before the next event is heard, and information content, the unexpectedness of an event in a sequence. In order to describe the features of melody in these musical examples, we used four different viewpoints: pitch, interval, duration, and scale degree. The results show that the texture of the melody (e.g., multiple voices, homorhythmic structure) and the structure of the melody (e.g., huge interval leaps, syncopated rhythm, implied harmony in compound melodies) in these musical excerpts affect the participant's perception of complexity. High information content values were found in compound melodies in which implied harmonies seem to have suggested additional harmonies, affecting the participant's perception of the chord progression in harmony by creating a sense of an ambiguous musical structure.
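
As a minimal illustration of the unigram/bigram entropy measure described above, the following Python sketch computes the Shannon entropy of a chord-symbol sequence; the progression shown is a made-up example, not one of the 160 excerpts.

```python
import math
from collections import Counter

def shannon_entropy(counts):
    total = sum(counts.values())
    return -sum((c / total) * math.log2(c / total) for c in counts.values())

# Hypothetical harmonic progression encoded as Roman-numeral chord symbols
progression = ["I", "IV", "V", "I", "vi", "IV", "V", "I"]

unigrams = Counter(progression)                              # individual chords
bigrams = Counter(zip(progression, progression[1:]))         # adjacent chord pairs

print(shannon_entropy(unigrams), shannon_entropy(bigrams))   # low values = regular, predictable harmony
```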

Keywords: entropy and information content, harmony, subliminal (ir)regularity, IDyOM

Procedia PDF Downloads 101
27698 Predicting Stack Overflow Accepted Answers Using Features and Models with Varying Degrees of Complexity

Authors: Osayande Pascal Omondiagbe, Sherlock A. Licorish

Abstract:

Stack Overflow is a popular community question-and-answer portal used by practitioners to solve technology-related challenges during software development. Previous studies have shown that this forum is becoming a substitute for official software programming language documentation. While tools have sought to aid developers by presenting interfaces to explore Stack Overflow, developers often face challenges searching through many possible answers to their questions, and this extends development time. In response, researchers have provided ways of predicting acceptable Stack Overflow answers by using various modeling techniques. However, less attention has been dedicated to examining the performance and quality of typically used modeling methods, especially in relation to the complexity of the models and features used. Such insights could be of practical significance to the many practitioners that use Stack Overflow. This study examines the performance and quality of various modeling methods used for predicting acceptable answers on Stack Overflow, drawn from data for 2014, 2015 and 2016. Our findings reveal significant differences in models' performance and quality given the type of features and the complexity of the models used. Researchers examining classifiers' performance and quality and features' complexity may leverage these findings in selecting suitable techniques when developing prediction models.
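
A minimal sketch of one of the typically used modeling methods (a random forest classifier) is shown below; the features and labels are synthetic stand-ins, not the study's Stack Overflow data.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

# Hypothetical answer-level features: length, code-block count, answerer reputation, posting delay
rng = np.random.default_rng(0)
X = rng.random((1000, 4))
y = (X[:, 2] + 0.3 * X[:, 0] + 0.1 * rng.standard_normal(1000) > 0.6).astype(int)  # 1 = accepted

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
clf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_tr, y_tr)
print(accuracy_score(y_te, clf.predict(X_te)), clf.feature_importances_)
```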

Keywords: feature selection, modeling and prediction, neural network, random forest, stack overflow

Procedia PDF Downloads 107
27697 Improving Student Programming Skills in Introductory Computer and Data Science Courses Using Generative AI

Authors: Genady Grabarnik, Serge Yaskolko

Abstract:

Generative Artificial Intelligence (AI) has significantly expanded its applicability with the incorporation of Large Language Models (LLMs) and has become a technology that promises to automate areas that were very difficult to automate before. The paper describes the introduction of generative AI into Introductory Computer and Data Science courses and analyzes the effect of this introduction. Generative AI is incorporated into the educational process in two ways. For the instructors, we create templates of prompts for the generation of tasks and for the grading of students' work, including feedback on the submitted assignments. For the students, we introduce basic prompt engineering, which in turn is used to generate test cases based on problem descriptions, to generate code snippets for single-block programming tasks, and to partition average-complexity programming tasks into such blocks. The above-mentioned classes are run using Large Language Models, and feedback from instructors and students and course outcomes are collected. The analysis shows a statistically significant positive effect and a preference for the approach among both groups of stakeholders.

Keywords: introductory computer and data science education, generative AI, large language models, application of LLMs to computer and data science education

Procedia PDF Downloads 27
27696 Visual Analytics in K-12 Education: Emerging Dimensions of Complexity

Authors: Linnea Stenliden

Abstract:

The aim of this paper is to understand emerging learning conditions when visual analytics is implemented and used in K-12 education. To date, little attention has been paid to the role visual analytics (digital media and technology that highlight visual data communication in order to support analytical tasks) can play in education, and to the extent to which these tools can process actionable data for young students. This study was conducted in three public K-12 schools, in four social science classes with students aged 10 to 13 years, over a period of two to four weeks at each school. Empirical data were generated using video observations and analyzed with the help of Latour's metaphors. The learning conditions are found to be distinguished by a broad complexity characterized by four dimensions, which emerge from the actors' deeply intertwined relations in the activities. In relation to these dimensions, the paper argues that novel approaches to teaching and learning could benefit students' knowledge building as they work with visual analytics, analyzing visualized data.

Keywords: analytical reasoning, complexity, data use, problem space, visual analytics, visual storytelling, translation

Procedia PDF Downloads 330
27695 Designing Presentational Writing Assessments for the Advanced Placement World Language and Culture Exams

Authors: Mette Pedersen

Abstract:

This paper outlines the criteria that assessment specialists use when they design the 'Persuasive Essay' task for the four Advanced Placement World Language and Culture Exams (AP French, German, Italian, and Spanish). The 'Persuasive Essay' is a free-response, source-based, standardized measure of presentational writing. Each 'Persuasive Essay' item consists of three sources (an article, a chart, and an audio) and a prompt, which is a statement of the topic phrased as an interrogative sentence. Due to the richness of its source materials and the amount of time that test takers are given to prepare for and write their responses (a total of 55 minutes), the 'Persuasive Essay' is the free-response task on the AP World Language and Culture Exams that goes to the greatest lengths to unleash the test takers' proficiency potential. The author focuses on the work that goes into designing the 'Persuasive Essay' task, outlining best practices for the selection of topics and sources, the interplay that needs to be present among the sources, and the thinking behind the articulation of prompts for the 'Persuasive Essay' task. Using released 'Persuasive Essay' items from the AP World Language and Culture Exams and accompanying data on test taker performance, the author shows how different passages, and features of passages, have succeeded (and sometimes not succeeded) in eliciting writing proficiency among test takers over time. Data from approximately 215,000 test takers per year from 2014 to 2017 and approximately 35,000 test takers per year from 2012 to 2013 form the basis of this analysis. The conclusion of the study is that test taker performance improves significantly when the sources that test takers are presented with express directly opposing viewpoints. Test taker performance also improves when the interrogative prompt that the test takers respond to is phrased as a yes/no question. Finally, an analysis of the linguistic difficulty and complexity levels of the printed sources reveals that test taker performance does not decrease when the complexity level of the article in the 'Persuasive Essay' increases. This last text complexity analysis is performed with the help of the 'ETS TextEvaluator' tool and the 'Complexity Scale for Information Texts (Scale)', two tools which, in combination, provide a rubric and a fully automated technology for evaluating nonfiction and informational texts in English translation.

Keywords: advanced placement world language and culture exams, designing presentational writing assessments, large-scale standardized assessments of written language proficiency, source-based language testing

Procedia PDF Downloads 105
27694 A Phenomenological Study of Sports for the Analysis of Soccer Games: On the Embodiment of Goal-Type Ball Games in Team Sports

Authors: K. Kiniwa, S. Kitagawa, M. Kawamoto, H. Uchiyama

Abstract:

This study aims to identify phenomenologically the embodiment of soccer in order to analyze soccer games. In this paper, the authors focus on the embodiment of sports and the embodiment of goal-type ball games in team sports. The authors reveal that the embodiment of sports is represented by an inverse proportional body. This structure (body scheme) of the intercorporeality of sports can be compared to the symbolic figure of the Uroboros, a monster formed by connecting the tails of two snakes. The embodiment of goal-type ball games in team sports is characterized by dependency on situation and by complexity. In doing this, the analysis reveals that soccer is a sensitive and emotional sport.

Keywords: intercorporeality, structure, body scheme, Uroboros, inverse proportional body, dependency on situation, complexity

Procedia PDF Downloads 259
27693 Analysis of Direct Current Motor in LabVIEW

Authors: E. Ramprasath, P. Manojkumar, P. Veena

Abstract:

DC motors, proudly known as the workhorse of industrial systems, were widely used in past centuries until the invention of AC induction motors, which brought a huge revolution to industry. Since then, the use of DC machines has decreased due to factors such as reliability, robustness, and complexity, and they lost their fame mainly because of their losses. A methodology is proposed to construct a DC motor model through simulation in LabVIEW to get an idea about its real-time performance and to examine whether a change in parameters might yield a larger improvement in losses and reliability.
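
Since LabVIEW models are graphical, an equivalent textual sketch of a basic DC motor simulation is given below in Python; the machine parameters are assumed values for illustration, not those of the study.

```python
import numpy as np

# Simple separately excited DC motor model (assumed parameters), integrated with forward Euler:
#   L * di/dt = V - R*i - Ke*w        (armature circuit)
#   J * dw/dt = Kt*i - B*w - T_load   (mechanical equation)
R, L, Ke, Kt = 1.0, 0.5, 0.01, 0.01           # ohm, H, V·s/rad, N·m/A
J, B = 0.01, 0.001                            # kg·m², N·m·s/rad
V, T_load, dt, T_end = 24.0, 0.0, 1e-3, 30.0

i = w = 0.0
for _ in range(int(T_end / dt)):
    di = (V - R * i - Ke * w) / L
    dw = (Kt * i - B * w - T_load) / J
    i, w = i + dt * di, w + dt * dw

copper_loss = R * i ** 2                      # armature copper loss, one loss term of interest
print(round(w, 1), round(i, 2), round(copper_loss, 1))   # speed [rad/s], current [A], loss [W]
```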

Keywords: analysis, characteristics, direct current motor, LabVIEW software, simulation

Procedia PDF Downloads 517
27692 An Approach for Multilayered Ecological Networks

Authors: N. F. F. Ebecken, G. C. Pereira

Abstract:

Although networks provide a powerful approach to the study of a wide variety of ecological systems, their formulation usually does not include various types of interactions, interactions that vary in space and time, and interconnected systems such as networks of networks. The emerging field of 'multilayer networks' provides a natural framework for extending ecological systems analysis to include these multiple layers of complexity, as it specifically allows for differentiation and modeling of intralayer and interlayer connectivity. The framework provides a set of concepts and tools that can be adapted and applied to ecology, facilitating research on high-dimensional, heterogeneous systems in nature. Here, ecological multilayer networks are formally defined based on a review of prior and related approaches, their application and potential are illustrated with existing data analyses, and limitations, challenges, and future applications are discussed. The integration of multilayer network theory into ecology offers a largely untapped potential to further address ecological complexity and to provide new theoretical and empirical insights into the architecture and dynamics of ecological systems.

Keywords: ecological networks, multilayered networks, sea ecology, Brazilian Coastal Area

Procedia PDF Downloads 116
27691 Reliability of Self-Reported Language Proficiency Measures in L1 Attrition Research: A Closer Look at the Can-Do Scales

Authors: Anastasia Sorokina

Abstract:

Self-reported language proficiency measures have been widely used by researchers and have been proven to be an accurate tool to assess actual language proficiency. L1 attrition researchers also rely on self-reported measures. More specifically, can-do scales have gained popularity in the discipline of L1 attrition research. Can-do scales usually contain statements about language (e.g., "I can write e-mails"); participants are asked to rate each statement on a scale from 1 (I cannot do it at all) to 5 (I can do it without any difficulties). Despite their popularity, no studies have examined the reliability of can-do scales at measuring the actual level of L1 attrition. Do can-do scales positively correlate with lexical diversity, syntactic complexity, and fluency? The present study analyzed speech samples of 35 Russian-English attriters to examine whether their self-reported proficiency correlates with their actual L1 proficiency. Pearson correlation analysis demonstrated that can-do scales correlated with lexical diversity, syntactic complexity, and fluency. These findings provide a valuable contribution to L1 attrition research by demonstrating that can-do scales can be used as a reliable tool to measure L1 attrition.
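
A minimal sketch of the correlation analysis described above, with made-up per-speaker measures standing in for the actual speech-sample data:

```python
import numpy as np
from scipy.stats import pearsonr

# Hypothetical per-speaker measures: self-reported can-do score (1-5 average) versus
# objective proxies computed from speech samples.
rng = np.random.default_rng(0)
can_do = rng.uniform(1, 5, 35)
lexical_diversity = 0.1 * can_do + rng.normal(0, 0.05, 35)           # e.g., type-token ratio
syntactic_complexity = 1.0 + 0.3 * can_do + rng.normal(0, 0.2, 35)   # e.g., clauses per utterance
fluency = 2.0 + 0.5 * can_do + rng.normal(0, 0.3, 35)                # e.g., syllables per second

for name, measure in [("lexical diversity", lexical_diversity),
                      ("syntactic complexity", syntactic_complexity),
                      ("fluency", fluency)]:
    r, p = pearsonr(can_do, measure)
    print(f"{name}: r = {r:.2f}, p = {p:.3f}")
```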

Keywords: L1 attrition, can-do-scales, lexical diversity, syntactic complexity

Procedia PDF Downloads 196
27690 A Configurational Approach to Understand the Effect of Organizational Structure on Absorptive Capacity: Results from PLS and fsQCA

Authors: Murad Ali, Anderson Konan Seny Kan, Khalid A. Maimani

Abstract:

Based on the theory of organizational design and the theory of knowledge, this study uses complexity theory to explain and better understand the causal impacts of various patterns of organizational structural factors stimulating absorptive capacity (ACAP). Organizational structure can be thought of as a heterogeneous configuration in which various components are often intertwined. This study argues that the impact of the traditional variables which define a firm's organizational structure (centralization, formalization, complexity, and integration) on ACAP is better understood in terms of set-theoretic relations rather than correlations. The study uses a sample of 347 observations from multiple industrial sectors in South Korea. The results from PLS-SEM support all the hypothesized relationships among the variables. However, the fsQCA results suggest the possible configurations of centralization, formalization, complexity, integration, age, size, industry, and revenue factors that contribute to a high level of ACAP. The results from fsQCA demonstrate the usefulness of configurational approaches in helping understand equifinality in the field of knowledge management. A recent fsQCA procedure based on a modeling subsample and a holdout subsample is used in this study to assess the predictive validity of the model under investigation. The same type of predictive analysis is also performed through PLS-SEM. These analyses reveal the good relevance of the causal solutions leading to a high level of ACAP. Overall, the results obtained from combining PLS-SEM and fsQCA are very insightful. In particular, they could help managers to link internal organizational structure with ACAP. In other words, managers may comprehend in fine detail how different components of organizational structure can increase the level of ACAP. The configurational approach may trigger new insights that could help managers prioritize selection criteria and understand the interactions between organizational structure and ACAP. The paper also discusses theoretical and managerial implications arising from these findings.

Keywords: absorptive capacity, organizational structure, PLS-SEM, fsQCA, predictive analysis, modeling subsample, holdout subsample

Procedia PDF Downloads 300
27689 A Lower-Complexity Deep Learning Method for Drone Detection

Authors: Mohamad Kassab, Amal El Fallah Seghrouchni, Frederic Barbaresco, Raed Abu Zitar

Abstract:

Detecting objects such as drones is a challenging task, as their relative size and maneuvering capabilities deceive machine learning models and cause them to misclassify drones as birds or other objects. In this work, we investigate applying several deep learning techniques to benchmark real data sets of flying drones. A deep learning paradigm is proposed for the purpose of mitigating the complexity of those systems. The proposed paradigm consists of a hybrid between the AdderNet deep learning paradigm and the Single Shot Detector (SSD) paradigm. The goal was to minimize the number of multiplication operations in the filtering layers within the proposed system and, hence, reduce complexity. Some standard machine learning techniques, such as SVM, are also tested and compared to the deep learning systems. The data sets used for training and testing were either complete or filtered in order to remove the images with small objects. The data types were RGB or IR. Comparisons were made between all these types, and conclusions are presented.
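
The following sketch illustrates the adder-filter idea behind AdderNet in one dimension: the filter response is the negative L1 distance between each input patch and the filter, so the layer uses only additions and subtractions, in contrast to the dot products of an ordinary convolution. This is a toy illustration of the principle, not the authors' SSD-based detector.

```python
import numpy as np

def adder_layer(x, w):
    """AdderNet-style filtering for a 1-D signal: negative L1 distance between patch and filter."""
    k = len(w)
    out = np.empty(len(x) - k + 1)
    for i in range(len(out)):
        out[i] = -np.sum(np.abs(x[i:i + k] - w))      # additions/subtractions only
    return out

def conv_layer(x, w):
    """Ordinary cross-correlation for comparison (one multiply per tap)."""
    k = len(w)
    return np.array([np.dot(x[i:i + k], w) for i in range(len(x) - k + 1)])

rng = np.random.default_rng(0)
x, w = rng.standard_normal(32), rng.standard_normal(5)
print(adder_layer(x, w)[:3], conv_layer(x, w)[:3])
```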

Keywords: drones detection, deep learning, birds versus drones, precision of detection, AdderNet

Procedia PDF Downloads 143
27688 Assessment of the Validity of Sentiment Analysis as a Tool to Analyze the Emotional Content of Text

Authors: Trisha Malhotra

Abstract:

Sentiment analysis is a recent field of study that computationally assesses the emotional nature of a body of text. To assess its test validity, sentiment analysis was carried out on the emotional corpus of text from a personal 15-day mood diary. Self-reported mood scores corresponded more or less accurately with the daily mood evaluation scores given by the software. On further assessment, it was found that while sentiment analysis was good at assessing 'global' mood, it was not able to 'locally' identify and differentially score synonyms of various emotional words. It is further critiqued for treating the intensity of an emotion as universal across cultures. Finally, the software is shown not to account for emotional complexity in sentences, as it treats emotions as strictly positive or negative. Hence, it is posited that a better output could be two (positive and negative) affect scores for the same body of text.
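
A minimal sketch of such an analysis using NLTK's VADER analyzer, which already reports separate positive and negative affect scores alongside a single compound score (the two-score output argued for above); the diary entry is a made-up example, and VADER is not necessarily the software used in the study.

```python
import nltk
from nltk.sentiment.vader import SentimentIntensityAnalyzer

nltk.download("vader_lexicon", quiet=True)
sia = SentimentIntensityAnalyzer()

entry = "Work was exhausting and frustrating, but dinner with friends made the evening wonderful."
scores = sia.polarity_scores(entry)
print(scores["pos"], scores["neg"], scores["compound"])  # positive and negative affect reported separately
```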

Keywords: analysis, data, diary, emotions, mood, sentiment

Procedia PDF Downloads 236
27687 Towards a Simulation Model to Ensure the Availability of Machines in Maintenance Activities

Authors: Maryam Gallab, Hafida Bouloiz, Youness Chater, Mohamed Tkiouat

Abstract:

The aim of this paper is to present a model based on multi-agent systems in order to manage maintenance activities and to ensure the reliability and availability of machines with only the required resources (operators, tools). The interest of simulation is to cope with the complexity of the system and to obtain results without cost or wasted time. An implementation of the model is carried out on the AnyLogic platform to display the defined performance indicators.

Keywords: maintenance, complexity, simulation, multi-agent systems, AnyLogic platform

Procedia PDF Downloads 275
27686 A Time-Reducible Approach to Compute Determinant |I-X|

Authors: Wang Xingbo

Abstract:

Computation of determinants of the form |I-X| is primary and fundamental because it can help to compute many other determinants. This article puts forward a time-reducible approach to compute the determinant |I-X|. The approach is derived from Newton's identities, and its time complexity is no more than that of computing the eigenvalues of the square matrix X. Mathematical deductions and a numerical example are presented in detail for the approach. By comparison with classical approaches, the new approach is shown to be superior, and it can naturally reduce the computational time as the efficiency of computing the eigenvalues of a square matrix improves.
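
A small Python sketch of the underlying idea: Newton's identities recover the elementary symmetric polynomials of the eigenvalues from the traces of powers of X, and |I-X| is the characteristic polynomial of X evaluated at 1. The naive repeated matrix products used here are for illustration only and do not reflect the reduced complexity claimed by the authors.

```python
import numpy as np

def det_I_minus_X(X):
    """det(I - X) via Newton's identities, using only traces of powers of X."""
    n = X.shape[0]
    # power sums p_k = tr(X^k), k = 1..n
    p = np.empty(n + 1)
    Xk = np.eye(n)
    for k in range(1, n + 1):
        Xk = Xk @ X
        p[k] = np.trace(Xk)
    # elementary symmetric polynomials e_k from Newton's identities: k*e_k = sum (-1)^(i-1) e_{k-i} p_i
    e = np.zeros(n + 1)
    e[0] = 1.0
    for k in range(1, n + 1):
        e[k] = sum((-1) ** (i - 1) * e[k - i] * p[i] for i in range(1, k + 1)) / k
    # det(I - X) = characteristic polynomial of X evaluated at t = 1
    return sum((-1) ** k * e[k] for k in range(n + 1))

rng = np.random.default_rng(0)
X = rng.standard_normal((5, 5))
print(det_I_minus_X(X), np.linalg.det(np.eye(5) - X))   # the two values should agree
```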

Keywords: algorithm, determinant, computation, eigenvalue, time complexity

Procedia PDF Downloads 385
27685 The Revenue Management Implementation and Its Complexity in the Airline Industry: An Empirical Study on the Egyptian Airline Industry

Authors: Amr Sultan, Sara Elgazzar, Breksal Elmiligy

Abstract:

The airline industry is nowadays a growing industry facing severe competition. In this context, it is important to utilize the revenue management (RM) concept and practice in order to develop pricing strategy. There is a pressing need for RM to assist airlines and their associates in reducing costs and recovering revenue, which in turn will boost airline industry performance. The complexity of RM imposes enormous challenges on the airline industry. Several studies have addressed RM adoption in the airline industry, while there is limited work on implementing RM and on its complexity in developing countries such as Egypt. This research presents a research scheme for the implementation of RM in the Egyptian airline industry. The research aims at investigating and demonstrating the complexities faced in implementing RM in the airline industry, upon which the research provides a comprehensive understanding of how to overcome these complexities while adopting RM in the Egyptian airline industry. An empirical study was conducted on the Egyptian airline sector based on a sample of four airlines (EgyptAir, British Airways, KLM, and Lufthansa). The empirical study was conducted using a mix of qualitative and quantitative approaches. First, in-depth interviews were carried out to analyze the status of the Egyptian airline sector and the main challenges faced by the airlines. Then, a structured survey of the three different parties in the airline industry (airlines, airfreight forwarders, and passengers) was conducted in order to investigate the main complexity factors from the different parties' points of view. Finally, a focus group was conducted to develop a best-practice framework to overcome the complexities facing RM adoption in the Egyptian airline industry. The research provides an original contribution to knowledge by creating a framework to overcome the complexities and challenges in adopting RM in the airline industry generally and the Egyptian airline industry particularly. The framework can be used as an RM tool to increase the effectiveness and efficiency of the Egyptian airline industry's performance.

Keywords: revenue management, airline industry, revenue management complexity, Egyptian airline industry

Procedia PDF Downloads 357
27684 Energy Absorption Capacity of Aluminium Foam Manufactured by Kelvin Model Loaded Under Different Biaxial Combined Compression-Torsion Conditions

Authors: H. Solomon, A. Abdul-Latif, R. Baleh, I. Deiab, K. Khanafer

Abstract:

Aluminum foams were developed and tested due to their high energy absorption abilities for multifunctional applications. The aim of this research work was to investigate experimentally the effect of quasi-static biaxial loading complexity (combined compression-torsion) on the energy absorption capacity of highly uniform-architecture open-cell aluminum foam manufactured based on the Kelvin cell model. The two generated aluminum foams have 80% and 85% porosities and spherical pores 11 mm in diameter. These foams were tested by means of several square-section specimens. A patented rig called ACTP (Absorption par Compression-Torsion Plastique) was used to investigate the foam response under quasi-static complex loading paths having different torsional components (i.e., 0°, 37° and 53°). The main mechanical responses of the aluminum foams were studied under simple, intermediate and severe loading conditions. In fact, the key responses examined were the stress plateau and the energy absorption capacity of the two foams with respect to loading complexity. It was concluded that the higher the loading complexity and the higher the relative density, the greater the energy absorption capacity of the foam. The highest energy absorption was thus recorded under the most complicated loading path (i.e., biaxial-53°) for the denser foam (i.e., 80% porosity).

Keywords: open-cell aluminum foams, biaxial loading complexity, foams porosity, energy absorption capacity, characterization

Procedia PDF Downloads 81
27683 A Gradient Orientation Based Efficient Linear Interpolation Method

Authors: S. Khan, A. Khan, Abdul R. Soomrani, Raja F. Zafar, A. Waqas, G. Akbar

Abstract:

This paper proposes a low-complexity image interpolation method. Image interpolation is used to convert a low-dimension video/image to a high-dimension video/image. The objective of a good interpolation method is to upscale an image in such a way that it provides better edge preservation at very low complexity, so that real-time processing of video frames can be made possible. However, low-complexity methods tend to provide real-time interpolation at the cost of blurring, jagging and other artifacts due to errors in slope calculation. Non-linear methods, on the other hand, provide better edge preservation, but at the cost of high complexity, and hence they can be considered very far from real-time interpolation. The proposed method is a linear method that uses gradient orientation for slope calculation, unlike conventional linear methods that use the contrast of nearby pixels. Prewitt edge detection is applied to separate uniform regions and edges. Simple line averaging is applied to unknown uniform regions, whereas unknown edge pixels are interpolated after calculation of slopes using the gradient orientations of neighboring known edge pixels. As a post-processing step, a bilateral filter is applied to interpolated edge regions in order to enhance the interpolated edges.
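
A minimal sketch of the slope-estimation step: Prewitt gradients give a magnitude (edge strength) and an orientation that can steer directional interpolation. Thresholds and the interpolation logic itself are omitted, so this illustrates the idea rather than the proposed method.

```python
import numpy as np

# Standard Prewitt masks; the orientation at each known pixel steers the directional interpolation.
PREWITT_X = np.array([[-1, 0, 1], [-1, 0, 1], [-1, 0, 1]], dtype=float)
PREWITT_Y = PREWITT_X.T

def gradient_orientation(img):
    h, w = img.shape
    gx = np.zeros_like(img, dtype=float)
    gy = np.zeros_like(img, dtype=float)
    for i in range(1, h - 1):
        for j in range(1, w - 1):
            patch = img[i - 1:i + 2, j - 1:j + 2]
            gx[i, j] = np.sum(patch * PREWITT_X)
            gy[i, j] = np.sum(patch * PREWITT_Y)
    magnitude = np.hypot(gx, gy)              # large values mark edge pixels
    orientation = np.arctan2(gy, gx)          # slope direction along which to average
    return magnitude, orientation

img = np.tile(np.arange(16, dtype=float), (16, 1))   # toy ramp image
mag, ori = gradient_orientation(img)
print(mag.max(), ori[8, 8])
```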

Keywords: edge detection, gradient orientation, image upscaling, linear interpolation, slope tracing

Procedia PDF Downloads 228
27682 Using Analytical Hierarchy Process and TOPSIS Approaches in Designing a Finite Element Analysis Automation Program

Authors: Ming Wen, Nasim Nezamoddini

Abstract:

Sophisticated numerical simulations like finite element analysis (FEA) involve a complicated process, from model setup to post-processing tasks, that requires replication of time-consuming steps. Utilizing an FEA automation program simplifies the complexity of the involved steps while minimizing human errors in analysis setup, calculations, and results processing. One of the main challenges in designing FEA automation programs is to identify user requirements and link them to possible design alternatives. This paper presents a decision-making framework to design a Python-based FEA automation program for modal analysis, frequency response analysis, and random vibration fatigue (RVF) analysis procedures. The analytical hierarchy process (AHP) and the technique for order preference by similarity to ideal solution (TOPSIS) are applied to evaluate design alternatives considering the feedback received from experts and program users.
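
A compact sketch of the two techniques, with a made-up pairwise-comparison matrix and decision matrix standing in for the study's actual criteria and expert judgments:

```python
import numpy as np

def ahp_weights(pairwise):
    """Criteria weights from an AHP pairwise-comparison matrix (principal eigenvector)."""
    vals, vecs = np.linalg.eig(pairwise)
    k = np.argmax(vals.real)
    w = np.abs(vecs[:, k].real)
    return w / w.sum()

def topsis(decision, weights, benefit):
    """Rank alternatives: rows = alternatives, cols = criteria; benefit[j] True if larger is better."""
    norm = decision / np.linalg.norm(decision, axis=0)       # vector-normalize each criterion
    v = norm * weights                                        # weighted normalized matrix
    ideal = np.where(benefit, v.max(axis=0), v.min(axis=0))   # best value per criterion
    anti = np.where(benefit, v.min(axis=0), v.max(axis=0))    # worst value per criterion
    d_pos = np.linalg.norm(v - ideal, axis=1)
    d_neg = np.linalg.norm(v - anti, axis=1)
    return d_neg / (d_pos + d_neg)                            # closeness coefficient, higher = better

# Hypothetical example: 3 design alternatives scored on 3 criteria
pairwise = np.array([[1, 3, 5], [1/3, 1, 2], [1/5, 1/2, 1]])    # expert judgments (assumed)
weights = ahp_weights(pairwise)
decision = np.array([[7., 200., 4.], [9., 350., 3.], [6., 150., 5.]])
benefit = np.array([True, False, True])    # e.g., usability (max), setup time (min), flexibility (max)
print(weights, topsis(decision, weights, benefit))
```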

Keywords: finite element analysis, FEA, random vibration fatigue, process automation, analytical hierarchy process, AHP, TOPSIS, multiple-criteria decision-making, MCDM

Procedia PDF Downloads 82
27681 Robust Numerical Scheme for Pricing American Options under Jump Diffusion Models

Authors: Salah Alrabeei, Mohammad Yousuf

Abstract:

The goal of option pricing theory is to help investors manage their money, enhance returns, and control their financial future by theoretically valuing their options. However, most option pricing models have no analytical solution. Furthermore, not all numerical methods are efficient for solving these models because of the non-smooth payoffs or discontinuous derivatives at the exercise price. In this paper, we solve the American option under jump diffusion models by using efficient time-dependent numerical methods. Several techniques are integrated to overcome the computational complexity. The fast Fourier transform (FFT) algorithm is used as a matrix-vector multiplication solver, which reduces the complexity from O(M²) to O(M log M). The partial fraction decomposition technique is applied to rational approximation schemes to overcome the complexity of inverting matrix polynomials. The proposed method is easy to implement in serial or parallel versions. Numerical results are presented to demonstrate the accuracy and efficiency of the proposed method.
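
To illustrate the FFT-based matrix-vector multiplication, the sketch below multiplies a circulant matrix by a vector in O(M log M) using the convolution theorem; the Toeplitz matrices arising from the jump integral are typically embedded in a circulant of twice the size, a detail omitted here.

```python
import numpy as np

def circulant_matvec(c, x):
    """Multiply a circulant matrix (first column c) by a vector x in O(M log M) via the FFT."""
    return np.real(np.fft.ifft(np.fft.fft(c) * np.fft.fft(x)))

# check against the dense O(M^2) product
M = 8
rng = np.random.default_rng(1)
c, x = rng.standard_normal(M), rng.standard_normal(M)
C = np.array([np.roll(c, k) for k in range(M)]).T    # dense circulant matrix with first column c
print(np.allclose(C @ x, circulant_matvec(c, x)))    # True
```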

Keywords: integral differential equations, jump–diffusion model, American options, rational approximation

Procedia PDF Downloads 89
27680 Formulation and Test of a Model to Explain the Complexity of Road Accident Events in South Africa

Authors: Dimakatso Machetele, Kowiyou Yessoufou

Abstract:

Whilst several studies have indicated that road accident events might be more complex than thought, we have a limited scientific understanding of this complexity in South Africa. The present project proposes and tests a more comprehensive metamodel that integrates multiple causality relationships among variables previously linked to road accidents. This was done by fitting a structural equation model (SEM) to data collected from various sources. The study also fitted a GARCH (generalized autoregressive conditional heteroskedasticity) model to forecast future road accidents in the country. The analysis shows that the number of road accidents has been increasing since 1935. The road fatality rate follows a polynomial trend given by the equation y = -0.0114x² + 1.2378x - 2.2627 (R² = 0.76), where y is the death rate and x is the year. This trend results in an average death rate of 23.14 deaths per 100,000 people. Furthermore, the analysis shows that the number of crashes is significantly explained by the total number of vehicles (P < 0.001), the number of registered vehicles (P < 0.001), the number of unregistered vehicles (P = 0.003) and the population of the country (P < 0.001). Contrary to expectation, the number of driver licenses issued and the total distance traveled by vehicles do not correlate significantly with the number of crashes (P > 0.05). Furthermore, the analysis reveals that the number of casualties is significantly linked to the number of registered vehicles (P < 0.001) and the total distance traveled by vehicles (P = 0.03). As for the number of fatal crashes, the analysis reveals that the total number of vehicles (P < 0.001), the number of registered (P < 0.001) and unregistered vehicles (P < 0.001), the population of the country (P < 0.001) and the total distance traveled by vehicles (P < 0.001) correlate significantly with the number of fatal crashes. However, the number of casualties and, again, the number of driver licenses do not seem to determine the number of fatal crashes (P > 0.05). Finally, the number of crashes is predicted to remain roughly constant over time at 617,253 accidents for the next 10 years, with the worst-case scenario suggesting that this number may reach 1,896,667. The number of casualties is also predicted to remain roughly constant at 93,531 over time, although this number may reach 661,531 in the worst-case scenario. Although the number of fatal crashes may decrease over time, it is forecast to reach 11,241 within the next 10 years, with the worst-case scenario estimated at 19,034 within the same period. The number of fatalities is also predicted to remain roughly constant at 14,739 but may reach 172,784 in the worst-case scenario. Overall, the present study reveals the complexity of road accidents and allows us to propose several recommendations aimed at reducing road accidents, casualties, fatal crashes, and deaths in South Africa.
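
For reference, the reported fatality-rate trend can be evaluated directly; the sketch below assumes x counts years since the start of the series (1935), an interpretation not stated explicitly in the abstract.

```python
import numpy as np

# Fatality-rate trend reported above: y = -0.0114 x^2 + 1.2378 x - 2.2627 (R^2 = 0.76).
# Assumption: x is the number of years since 1935, and y is deaths per 100,000 people.
def death_rate(year, start=1935):
    x = year - start
    return -0.0114 * x**2 + 1.2378 * x - 2.2627

for yr in np.arange(1940, 2021, 10):
    print(yr, round(death_rate(yr), 2))
```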

Keywords: road accidents, South Africa, statistical modelling, trends

Procedia PDF Downloads 128
27679 Accountability of Artificial Intelligence: An Analysis Using Edgar Morin’s Complex Thought

Authors: Sylvie Michel, Sylvie Gerbaix, Marc Bidan

Abstract:

Can artificial intelligence (AI) be held accountable for its detrimental impacts? This question gains heightened relevance given AI's pervasive reach across various domains, magnifying its power and potential. The expanding influence of AI raises fundamental ethical inquiries, primarily centering on biases, responsibility, and transparency. This encompasses discriminatory biases arising from algorithmic criteria or data, accidents attributed to autonomous vehicles or other systems, and the imperative of transparent decision-making. This article aims to stimulate reflection on AI accountability, denoting the necessity to elucidate the effects AI generates. Accountability comprises two integral aspects: adherence to legal and ethical standards and the imperative to elucidate the underlying operational rationale. The objective is to initiate a reflection on the obstacles to this "accountability," given the challenges posed by the complexity of artificial intelligence systems and their effects. The article then proposes to mobilize Edgar Morin's complex thought to encompass and face the challenges of this complexity. The first contribution is to point out the challenges posed by the complexity of AI, with accountability fractured among a myriad of human and non-human actors, such as software and equipment, which ultimately contribute to the decisions taken and whose number is multiplied in the case of AI. Accountability faces three challenges resulting from the complexity of the ethical issues combined with the complexity of AI. The challenge of the non-neutrality of algorithmic systems, as fully ethically non-neutral actors, is put forward by a revealing-ethics approach that calls for assigning responsibilities to these systems. The challenge of the dilution of responsibility is induced by the multiplicity of, and distance between, the actors: responsibility is diluted by a split in decision-making between developers, who feel they fulfill their duty by strictly respecting the requests they receive, and management, which does not consider itself responsible for technology-related flaws. Accountability is also confronted with the challenge of the transparency of complex and scalable algorithmic systems, non-human actors that self-learn via big data. A second contribution involves leveraging E. Morin's principles, providing a framework to grasp the multifaceted ethical dilemmas and subsequently paving the way for establishing accountability in AI. When addressing the ethical challenge of biases, the "hologrammatic" principle underscores the imperative of acknowledging that algorithmic systems are not ethically neutral but are inherently imbued with the values and biases of their creators and of society. The "dialogic" principle advocates for the responsible consideration of ethical dilemmas, encouraging the integration of complementary and contradictory elements in solutions from the very inception of the design phase. The principle of organizing recursiveness, akin to the "transparency" of the system, promotes a systemic analysis to account for the induced effects and guides the incorporation of modifications into the system to rectify its deviations and drifts. In conclusion, this contribution serves as a starting point for contemplating the accountability of artificial intelligence systems despite the evident ethical implications and potential deviations. Edgar Morin's principles, providing a lens through which to contemplate this complexity, offer valuable perspectives to address these challenges concerning accountability.

Keywords: accountability, artificial intelligence, complexity, ethics, explainability, transparency, Edgar Morin

Procedia PDF Downloads 31
27678 Lean Implementation Analysis on the Safety Performance of Construction Projects in the Philippines

Authors: Kim Lindsay F. Restua, Jeehan Kyra A. Rivero, Joneka Myles D. Taguba

Abstract:

Lean construction is defined as an approach to construction with the purpose of reducing waste in the process without compromising the value of the project. There are numerous lean construction tools applied in the construction process, which maximize the efficiency of work and the satisfaction of customers while minimizing waste. However, the complexity of and differences among construction projects give rise to challenges in achieving the benefits lean construction can provide, such as improvement in safety performance. The objective of this study is to determine the relationship between lean construction tools and their effects on safety performance. The relationship between the tools applied in construction and safety performance was identified through logistic regression analysis, and correlation analysis was conducted thereafter. Based on the findings, it was concluded that almost 60% of the factors listed in the study, which are different tools and effects of lean construction, have a significant relationship with the level of safety in construction projects.
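
A minimal sketch of the logistic regression step, with made-up binary indicators for tool adoption and a binary safety outcome standing in for the survey data:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# Hypothetical project-level data: whether selected lean tools were applied (e.g., last planner
# system, 5S, visual management) and a binary "good safety performance" outcome.
rng = np.random.default_rng(0)
n = 200
X = rng.integers(0, 2, size=(n, 3)).astype(float)
logit = -0.5 + 1.2 * X[:, 0] + 0.8 * X[:, 1] + 0.1 * X[:, 2]
y = (rng.random(n) < 1 / (1 + np.exp(-logit))).astype(int)

model = LogisticRegression().fit(X, y)
odds_ratios = np.exp(model.coef_[0])          # effect of applying each tool on the odds of good safety
print(odds_ratios, model.score(X, y))
```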

Keywords: correlation analysis, lean construction tools, lean construction, logistic regression analysis, risk management, safety

Procedia PDF Downloads 143
27677 Low Complexity Carrier Frequency Offset Estimation for Cooperative Orthogonal Frequency Division Multiplexing Communication Systems without Cyclic Prefix

Authors: Tsui-Tsai Lin

Abstract:

Cooperative orthogonal frequency division multiplexing (OFDM) transmission, which possesses the advantages of better connectivity, expanded coverage, and resistance to frequency-selective fading, has become a powerful solution for the physical layer in wireless communications. However, such a hybrid scheme suffers from the carrier frequency offset (CFO) effects inherited from OFDM-based systems, which lead to a significant degradation in performance. In addition, insertion of a cyclic prefix (CP) at the head of each symbol block to combat inter-symbol interference leads to a reduction in spectral efficiency. The design of CFO estimation for a cooperative OFDM system without a CP remains an open problem. This motivates us to develop a low-complexity CFO estimator for the cooperative OFDM decode-and-forward (DF) communication system without a CP over the multipath fading channel. Specifically, using a block-type pilot, the CFO estimate is first derived in accordance with the least-squares criterion. Reliable performance can be obtained through an exhaustive two-dimensional (2D) search, at the penalty of heavy computational complexity. As a remedy, an alternative solution realized with an iterative approach is proposed for the CFO estimation. In contrast to the 2D-search estimator, the iterative method enjoys the advantage of substantially reduced implementation complexity without sacrificing estimation performance. Computer simulations are presented to demonstrate the efficacy of the proposed CFO estimator.
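
For intuition, the sketch below shows a generic two-block phase-difference CFO estimator (Moose-style) for a repeated pilot; it is not the authors' least-squares or iterative scheme, but it illustrates how a CFO can be read off the phase rotation between identical pilot blocks.

```python
import numpy as np

def cfo_estimate(rx, N):
    """Estimate the normalized CFO from two identical, back-to-back pilot blocks of length N.

    rx : received complex baseband samples containing the two pilot blocks (length >= 2N).
    Returns the CFO as a fraction of the subcarrier spacing, valid for |cfo| < 0.5.
    """
    r1, r2 = rx[:N], rx[N:2 * N]
    return np.angle(np.sum(np.conj(r1) * r2)) / (2.0 * np.pi)

# Quick self-check with a synthetic pilot and a known offset of 0.12 subcarrier spacings
rng = np.random.default_rng(0)
N, eps = 64, 0.12
pilot = np.exp(1j * 2 * np.pi * rng.random(N))            # unit-modulus pilot block (assumed)
tx = np.tile(pilot, 2)                                     # two identical blocks
n = np.arange(2 * N)
rx = tx * np.exp(1j * 2 * np.pi * eps * n / N)             # apply CFO (no noise, flat channel)
print(cfo_estimate(rx, N))                                 # ~0.12
```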

Keywords: cooperative transmission, orthogonal frequency division multiplexing (OFDM), carrier frequency offset, iteration

Procedia PDF Downloads 238
27676 Parametric Design as an Approach to Respond to Complexity

Authors: Sepideh Jabbari Behnam, Zahrasadat Saide Zarabadi

Abstract:

A city is an intertwined texture formed by the relationships of different components within a unified whole, so designing and planning this complex whole is not an easy matter. Considering that a city is a complex system with countless components and communications, providing flexible layouts that can respond to the unpredictable character of the city, which is a result of its complexity, is inevitable. The parametric design approach, as a new approach, can produce flexible and transformable layouts at any stage of design. This study aimed to introduce parametric design as a modern approach for responding to complex urban issues, using descriptive and analytical methods. The paper first introduces complex systems and then gives a brief characterization of them. Flexibility of design and layout is another matter that should be considered in responding to and simulating complex urban systems, and it is discussed in this study. In this regard, after describing the nature of the parametric approach as a flexible approach, as well as a tool and an appropriate way to respond to features such as limited predictability, reciprocal relations, complex communications, sensitivity to initial conditions, and hierarchy, this paper introduces parametric design.

Keywords: complexity theory, complex system, flexibility, parametric design

Procedia PDF Downloads 333
27675 A Network-Theoretical Perspective on Music Analysis

Authors: Alberto Alcalá-Alvarez, Pablo Padilla-Longoria

Abstract:

The present paper describes a framework for constructing mathematical networks encoding relevant musical information from a music score for structural analysis. These graphs encompass statistical information about music elements such as notes, chords, rhythms, and intervals, and the relations among them, and so become helpful in visualizing and understanding important stylistic features of a music fragment. In order to build such networks, musical data is parsed out of a digital symbolic music file. This data undergoes different analytical procedures from graph theory, such as measuring the centrality of nodes, community detection, and entropy calculation. The resulting networks reflect important structural characteristics of the fragment in question: the predominant elements, the connectivity between them, and the complexity of the information contained in it. Music pieces in different styles are analyzed, and the results are contrasted with the outcome of traditional analysis in order to show the consistency and potential utility of this method for music analysis.
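
A minimal sketch of the graph-theoretic procedures named above (centrality, community detection, entropy), built on a made-up note-transition sequence rather than a parsed score:

```python
import math
import networkx as nx
from networkx.algorithms.community import greedy_modularity_communities

# Hypothetical note-transition sequence parsed from a symbolic score (e.g., MusicXML/MIDI)
sequence = ["C", "E", "G", "C", "E", "G", "A", "F", "G", "C"]

# Directed graph: nodes are pitches, edge weights count how often one pitch follows another
G = nx.DiGraph()
for a, b in zip(sequence, sequence[1:]):
    w = G[a][b]["weight"] + 1 if G.has_edge(a, b) else 1
    G.add_edge(a, b, weight=w)

centrality = nx.degree_centrality(G)                              # predominant elements
communities = greedy_modularity_communities(G.to_undirected())    # groups of related pitches

# Shannon entropy of the transition distribution as a simple complexity measure
total = sum(d["weight"] for _, _, d in G.edges(data=True))
entropy = -sum((d["weight"] / total) * math.log2(d["weight"] / total)
               for _, _, d in G.edges(data=True))
print(centrality, [sorted(c) for c in communities], round(entropy, 3))
```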

Keywords: computational musicology, mathematical music modelling, music analysis, style classification

Procedia PDF Downloads 61