Search results for: societal complexity
1633 Uncovering the Complex Structure of Building Design Process Based on Royal Institute of British Architects Plan of Work
Authors: Fawaz A. Binsarra, Halim Boussabaine
Abstract:
The notion of complexity science has been attracting the interest of researchers and professionals due to the need to better understand the dynamics and interaction structure of complex systems. Complexity analysis has been used to investigate systems in which a large number of components interact with each other to accomplish specific outcomes and produce emergent behavior. The design process is such a complex activity: it involves a large number of interacting components, grouped here as design tasks, the design team, and the components of the design process itself. These three aspects of the building design process each consist of several components that interact as a dynamic system with complex information flow. The goal of this paper is to uncover the complex structure of information interactions in the building design process. The information interactions of the Royal Institute of British Architects Plan of Work 2013 are investigated as a case study, using network analysis software to model the interactions and thereby uncover the structure and complexity of the building design process, with the aim of enhancing the efficiency of its outcomes.
Keywords: complexity, process, building design, RIBA, design complexity, network, network analysis
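The network-analysis approach the abstract describes can be illustrated with a minimal sketch: information-flow links between design tasks are modeled as a directed graph, and simple structural metrics (degree, density) are computed. The task names and edges below are invented placeholders, not the actual RIBA Plan of Work 2013 stages or the authors' data.

```python
# Minimal sketch: modeling design-task information flow as a directed graph
# and computing simple structural metrics. Task names are illustrative only.
from collections import defaultdict

# Hypothetical information-flow edges: (sender task, receiver task)
edges = [
    ("briefing", "concept_design"),
    ("concept_design", "developed_design"),
    ("developed_design", "technical_design"),
    ("concept_design", "technical_design"),
    ("briefing", "developed_design"),
]

out_degree = defaultdict(int)
in_degree = defaultdict(int)
for src, dst in edges:
    out_degree[src] += 1
    in_degree[dst] += 1

nodes = set(out_degree) | set(in_degree)
n = len(nodes)
# Density of a directed graph: actual edges / possible edges n*(n-1)
density = len(edges) / (n * (n - 1))

print(sorted(nodes))
print(density)
```

Dedicated tools (e.g. network analysis software, as used in the paper) add centrality and clustering measures on top of these basics, but the underlying model is the same directed graph.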
Procedia PDF Downloads 527
1632 Determination of Complexity Level in Merged Irregular Transposition Cipher
Authors: Okike Benjamin, Garba Ejd
Abstract:
Today, it has been observed that the security of information along the superhighway is often compromised by those who are not authorized to access it. To ensure security, such information should be encrypted to conceal its real meaning. Many encryption techniques are available; however, some of them are easily decrypted by adversaries. The researcher has developed an encryption technique intended to be more difficult to decrypt: the message to be encrypted is split into parts, each part is encrypted separately, and the positions of the parts are swapped before the message is transmitted along the superhighway. The method is termed the Merged Irregular Transposition Cipher. The research also determines the complexity level with respect to the number of splits of the message.
Keywords: transposition cipher, merged irregular cipher, encryption, complexity level
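A hedged sketch of the scheme as the abstract describes it: the plaintext is split into parts, each part is transposed with its own key, and the encrypted parts are swapped before transmission. The split sizes, column keys, and swap order below are illustrative choices, not the authors' actual parameters.

```python
# Illustrative sketch of a merged irregular transposition cipher: split the
# plaintext into irregular-length parts, apply a columnar transposition to
# each part with its own key, then permute (swap) the parts' positions.

def transpose(part: str, key: list[int]) -> str:
    # Simple columnar transposition: read the columns in key order.
    cols = [part[i::len(key)] for i in range(len(key))]
    return "".join(cols[k] for k in key)

def merged_irregular_encrypt(message: str, splits: list[int],
                             keys: list[list[int]], order: list[int]) -> str:
    # Cut the message into irregular-length parts.
    parts, pos = [], 0
    for size in splits:
        parts.append(message[pos:pos + size])
        pos += size
    # Encrypt each part with its own transposition key.
    enc = [transpose(p, k) for p, k in zip(parts, keys)]
    # Swap the positions of the encrypted parts before "transmission".
    return "".join(enc[i] for i in order)

ciphertext = merged_irregular_encrypt(
    "ATTACKATDAWN", splits=[5, 4, 3],
    keys=[[1, 0], [2, 0, 1], [0, 1]], order=[2, 0, 1])
print(ciphertext)
```

Since a transposition cipher only rearranges characters, the ciphertext is a permutation of the plaintext; the number of splits multiplies the keying possibilities, which is the complexity dimension the paper measures.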
Procedia PDF Downloads 344
1631 Low-Complexity Multiplication Using Complement and Signed-Digit Recoding Methods
Authors: Te-Jen Chang, I-Hui Pan, Ping-Sheng Huang, Shan-Jen Cheng
Abstract:
In this paper, a fast multiplication method utilizing complement representation and the canonical recoding technique is proposed. By applying complements and canonical recoding, the number of partial products can be reduced. Based on these techniques, we propose an algorithm that provides an efficient multiplication method. On average, the proposed algorithm reduces the number of k-bit additions from (0.25k + logk/k + 2.5) to (k/6 + logk/k + 2.5), where k is the bit length of the multiplicand A and the multiplier B, thereby speeding up the overall multiplication. Moreover, if the proposed method is used for common-multiplicand multiplication, the computational complexity can be reduced from (0.5k + 2logk/k + 5) to (k/3 + 2logk/k + 5) k-bit additions.
Keywords: algorithm design, complexity analysis, canonical recoding, public key cryptography, common-multiplicand multiplication
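The canonical (signed-digit) recoding step the abstract relies on can be illustrated with the non-adjacent form (NAF), a standard canonical signed-digit representation. The sketch below shows how recoding cuts the count of nonzero digits, and hence partial products; it is only this standard building block, not the authors' full algorithm.

```python
def naf(n: int) -> list[int]:
    # Non-adjacent form (canonical signed-digit recoding) of n, least
    # significant digit first. Digits are in {-1, 0, 1} and no two adjacent
    # digits are nonzero, so on average only about 1/3 of the digits are
    # nonzero (versus about 1/2 for plain binary), which cuts the number of
    # partial products in add/subtract-and-shift multiplication.
    digits = []
    while n > 0:
        if n % 2 == 1:
            d = 2 - (n % 4)   # 1 if n = 1 (mod 4), -1 if n = 3 (mod 4)
            n -= d
        else:
            d = 0
        digits.append(d)
        n //= 2
    return digits

def value(digits: list[int]) -> int:
    # Reconstruct the integer from its signed-digit expansion.
    return sum(d << i for i, d in enumerate(digits))

d = naf(0b1110111)   # 119: six 1-bits in binary, only three nonzero NAF digits
print(d, value(d))
```

Multiplying by 119 in plain binary needs six additions of shifted multiplicands; with the NAF digits it needs three additions/subtractions, which is the kind of saving the paper quantifies.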
Procedia PDF Downloads 435
1630 Modal Density Influence on Modal Complexity Quantification in Dynamic Systems
Authors: Fabrizio Iezzi, Claudio Valente
Abstract:
The viscous damping in dynamic systems can be proportional or non-proportional. In the first case the mode shapes are real, whereas in the second case they are complex. From an engineering point of view, the complexity of the mode shapes is important for quantifying non-proportional damping. Different indices exist that provide estimates of modal complexity; these indices are zero or nonzero depending on whether the mode shapes are real or complex. The modal density problem arises in experimental identification when dynamic systems have close modal frequencies. Depending on how close the frequencies are, the mode shapes can contain fictitious imaginary quantities that affect the values of the modal complexity indices. The result is a failure to identify whether the mode shapes are real or complex, and hence whether the damping is proportional or non-proportional. The paper shows the influence of modal density on the values of these indices for both proportional and non-proportional damping. Theoretical and pseudo-experimental solutions are compared on an appropriate mechanical system to analyze the problem.
Keywords: complex mode shapes, dynamic systems identification, modal density, non-proportional damping
Procedia PDF Downloads 387
1629 Green Energy, Fiscal Incentives and Conflicting Signals: Analysing the Challenges Faced in Promoting on Farm Waste to Energy Projects
Authors: Hafez Abdo, Rob Ackrill
Abstract:
Renewable energy (RE) promotion in the UK relies on multiple policy instruments, which are required to overcome the path-dependency pressures favouring fossil fuels. These instruments include targeted funding schemes and economy-wide instruments embedded in the tax code. The resulting complexity of incentives raises important questions about the coherence and effectiveness of these instruments for RE generation. This complexity is exacerbated by UK RE policy being nested within EU policy in a multi-level governance (MLG) setting. To gain analytical traction on such complexity, this study analyses policies promoting the on-farm generation of energy for heat and power, from farm and food waste, via anaerobic digestion. Utilising both primary and secondary data, it addresses a particular lacuna in the academic literature. Via a localised, in-depth investigation into the complexity of policy instruments promoting RE, this study contributes to our theoretical understanding of the challenges that MLG and path-dependency pressures present to policymakers of multi-dimensional policies.
Keywords: anaerobic digestion, energy, green, policy, renewable, tax, UK
Procedia PDF Downloads 370
1628 Determination of Complexity Level in Okike's Merged Irregular Transposition Cipher
Authors: Okike Benjamin, Garba Ejd
Abstract:
Today, it has been observed that the security of information along the superhighway is often compromised by those who are not authorized to access it. To ensure security, such information should be encrypted to conceal its real meaning. Many encryption techniques are available; however, some of them are decrypted by adversaries with ease. The researcher has developed an encryption technique intended to be more difficult to decrypt: the message to be encrypted is split into parts, each part is encrypted separately, and the positions of the parts are swapped before the message is transmitted along the superhighway. The method is termed Okike’s Merged Irregular Transposition Cipher. The research also determines the complexity level with respect to the number of splits of the message.
Keywords: transposition cipher, merged irregular cipher, encryption, complexity level
Procedia PDF Downloads 289
1627 Efficient Signal Detection Using QRD-M Based on Channel Condition in MIMO-OFDM System
Authors: Jae-Jeong Kim, Ki-Ro Kim, Hyoung-Kyu Song
Abstract:
In this paper, an efficient signal detector that switches the M parameter of the QRD-M detection scheme is proposed for the MIMO-OFDM system. The proposed scheme calculates a threshold from the 1-norm condition number and then switches the M parameter of QRD-M detection according to the channel information. If the channel condition is bad, M is set to a high value to increase detection accuracy; if the channel condition is good, M is set to a low value to reduce detection complexity. The proposed scheme therefore offers a better trade-off between BER performance and complexity than the conventional scheme. Simulation results show that the complexity of the proposed scheme is lower than that of conventional QRD-M detection with similar BER performance.
Keywords: MIMO-OFDM, QRD-M, channel condition, BER
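Only the switching idea can be sketched from the abstract: estimate the channel's 1-norm condition number and pick the QRD-M survivor count M accordingly. The threshold and the two M values below are made-up illustrations, not the paper's settings.

```python
# Hedged sketch of condition-number-based M switching for QRD-M detection.
# The threshold and M values are illustrative, not the paper's parameters.
import numpy as np

def choose_m(H: np.ndarray, threshold: float = 10.0,
             m_low: int = 2, m_high: int = 8) -> int:
    # cond_1(H) = ||H||_1 * ||H^-1||_1; a large value means an
    # ill-conditioned (bad) channel, where keeping more surviving
    # branches M improves detection accuracy.
    cond = np.linalg.cond(H, p=1)
    return m_high if cond > threshold else m_low

well_conditioned = np.eye(4)                       # cond_1 = 1: good channel
ill_conditioned = np.diag([1.0, 1.0, 1.0, 0.01])   # cond_1 = 100: bad channel
print(choose_m(well_conditioned), choose_m(ill_conditioned))
```

In the full detector, M then controls how many candidate symbol vectors survive each layer of the QR-decomposition tree search, which is where the complexity/BER trade-off comes from.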
Procedia PDF Downloads 370
1626 Effect of Phonological Complexity in Children with Specific Language Impairment
Authors: Irfana M., Priyandi Kabasi
Abstract:
Children with specific language impairment (SLI) have difficulty acquiring and using language despite possessing the cognitive skills needed to support language acquisition. These children have normal non-verbal intelligence, hearing, and oral-motor skills, with no history of social/emotional problems or significant neurological impairment; nevertheless, their language acquisition lags behind their peers. Phonological complexity can be considered a major factor causing inaccurate speech production in this population, so treatment of SLI should employ a range of complex phonological stimuli for a better prognosis of speech accuracy. Hence there is a need to study levels of phonological complexity. The present study included 7 children diagnosed with SLI and 10 typically developing children. All were Hindi speakers of both genders, aged 4 to 5 years. There were 4 sets of stimuli: minimal-contrast vs maximal-contrast nonwords, minimal-coarticulation vs maximal-coarticulation nonwords, minimal-contrast vs maximal-contrast words, and minimal-coarticulation vs maximal-coarticulation words. Each set contained 10 stimuli, and participants were asked to repeat each one. Results showed that production of maximal-contrast stimuli was the most accurate, followed by minimal coarticulation, minimal contrast, and maximal coarticulation, with a similar trend for both word and nonword stimuli. The phonological complexity effect was evident for each participant group. The findings can be applied to the management of SLI, specifically to the selection of stimuli.
Keywords: coarticulation, minimal contrast, phonological complexity, specific language impairment
Procedia PDF Downloads 142
1625 Theoretical Paradigms for Total Quality Environmental Management (TQEM)
Authors: Mohammad Hossein Khasmafkan Nezam, Nader Chavoshi Boroujeni, Mohamad Reza Veshaghi
Abstract:
Quality management is dominated by rational paradigms for the measurement and management of quality, but these paradigms start to break down when faced with the inherent complexity of managing quality in intensely competitive, changing environments. In this article, the various theoretical paradigms employed to manage quality are reviewed, and their advantages and limitations are highlighted. A major implication of this review is that, when faced with complexity, an ideological commitment to any single strategy paradigm for total quality environmental management is ineffective. We suggest that as complexity increases, and as we envisage intensely competitive changing environments, there will be a greater need to consider a multi-paradigm, integrationist view of strategy for TQEM.
Keywords: total quality management (TQM), total quality environmental management (TQEM), ideologies (philosophy), theoretical paradigms
Procedia PDF Downloads 317
1624 Exploring Leadership Adaptability in the Private Healthcare Organizations in the UK in Times of Crises
Authors: Sade Ogundipe
Abstract:
The private healthcare sector in the United Kingdom has experienced unprecedented challenges during times of crisis, necessitating effective leadership adaptability. This qualitative study delves into the dynamic landscape of leadership within the sector, particularly during crises, employing the lenses of complexity theory and institutional theory to unravel the intricate mechanisms at play. Through in-depth interviews with 25 leaders at various levels of the UK private healthcare sector, this research explores how leaders in UK private healthcare organizations navigate complex and often chaotic environments, shedding light on their adaptive strategies and decision-making processes during crises. Complexity theory is used to analyze the complicated, volatile nature of healthcare crises, emphasizing the need for adaptive leadership in such contexts. Institutional theory, on the other hand, provides insights into how external and internal institutional pressures influence leadership behavior. Findings from this study highlight the multifaceted nature of leadership adaptability, emphasizing the significance of leaders' abilities to embrace uncertainty, engage in sensemaking, and leverage the institutional environment to enact meaningful changes. Furthermore, this research sheds light on the challenges and opportunities that leaders face when adapting to crises within the UK private healthcare sector. The study's insights contribute to the growing body of literature on leadership in healthcare, offering practical implications for leaders, policymakers, and stakeholders within the UK private healthcare sector.
By employing the dual perspectives of complexity theory and institutional theory, this research provides a holistic understanding of leadership adaptability in the face of crises, offering valuable guidance for enhancing the resilience and effectiveness of healthcare leadership within this vital sector.
Keywords: leadership, adaptability, decision-making, complexity, complexity theory, institutional theory, organizational complexity, complex adaptive system (CAS), crises, healthcare
Procedia PDF Downloads 50
1623 Walking the Tightrope: Balancing Project Governance, Complexity, and Servant Leadership for Megaproject Success
Authors: Muhammad Shoaib Iqbal, Shih Ping Ho
Abstract:
Megaprojects are large-scale, complex ventures with significant financial investments, numerous stakeholders, and extended timelines, requiring meticulous management for successful completion. This study explores the interplay between project governance, project complexity, and servant leadership, and their combined effects on project success, specifically within the context of Pakistani megaprojects. The primary objectives are to examine the direct impact of project governance on project success, understand the negative influence of project complexity, assess the positive role of servant leadership, explore the moderating effect of servant leadership on the relationship between governance and success, and investigate how servant leadership mitigates the adverse effects of complexity. Using a quantitative approach, survey data were collected from project managers and team members involved in Pakistani megaprojects. Based on a comprehensive empirical model, 257 valid responses were analyzed, and the hypothesized relationships and interaction effects were tested with multiple regression analysis using PLS-SEM. Findings reveal that project governance significantly enhances project success, emphasizing the need for robust governance structures. Conversely, project complexity negatively impacts success, highlighting the challenges of managing complex projects. Servant leadership significantly boosts project success by prioritizing team support and empowerment. Although the interaction between governance and servant leadership is not significant, suggesting no significant change in project success, servant leadership significantly mitigates the negative effects of project complexity, enhancing team resilience and adaptability. These results underscore the necessity of a balanced approach integrating strong governance with flexible, supportive leadership.
The study offers valuable insights for practitioners, recommending adaptive governance frameworks and promoting servant leadership to improve the management and success rates of megaprojects. This research contributes to the broader understanding of effective project management practices in complex environments.
Keywords: project governance, project complexity, servant leadership, project success, megaprojects, Pakistan
Procedia PDF Downloads 34
1622 Configuring Systems to Be Viable in a Crisis: The Role of Intuitive Decision-Making
Authors: Ayham Fattoum, Simos Chari, Duncan Shaw
Abstract:
Volatile, uncertain, complex, and ambiguous (VUCA) conditions threaten system viability with emerging and novel events that require immediate, localized responses. Such responsiveness is only possible through devolved freedom and emancipated decision-making. The Viable System Model (VSM) recognizes this need and suggests maximizing autonomy to localize decision-making and minimize residual complexity. However, exercising delegated autonomy under VUCA requires confidence and knowledge to use intuition, and guidance to maintain systemic coherence. This paper explores the role of intuition as an enabler of emancipated decision-making and autonomy under VUCA. Intuition allows decision-makers to use their knowledge and experience to respond rapidly to novel events. The paper offers three contributions to VSM. First, it designs a system model that illustrates the role of intuitive decision-making in managing complexity and maintaining viability. Second, it takes a black-box approach to theory development in VSM to model the role of autonomy and intuition. Third, the study uses a multi-stage discovery-oriented approach (DOA) to develop theory, with each stage combining literature, data analysis, and model/theory development, and identifying further questions for the subsequent stage. We synthesize literature (e.g., VSM, complexity management) with seven months of field-based insights (interviews, workshops, and observation of a live disaster exercise) to develop an intuitive complexity-management framework and VSM models. The results have practical implications for enhancing the resilience of organizations and communities.
Keywords: intuition, complexity management, decision-making, viable system model
Procedia PDF Downloads 67
1621 Predicting Stack Overflow Accepted Answers Using Features and Models with Varying Degrees of Complexity
Authors: Osayande Pascal Omondiagbe, Sherlock a Licorish
Abstract:
Stack Overflow is a popular community question-and-answer portal used by practitioners to solve technology-related challenges during software development. Previous studies have shown that this forum is becoming a substitute for official programming language documentation. While tools have aided developers by presenting interfaces to explore Stack Overflow, developers often face challenges searching through many possible answers to their questions, which extends development time. To this end, researchers have provided ways of predicting acceptable Stack Overflow answers using various modeling techniques. However, less attention has been devoted to examining the performance and quality of the typically used modeling methods, especially in relation to the complexity of the models and features. Such insights could be of practical significance to the many practitioners who use Stack Overflow. This study examines the performance and quality of various modeling methods used for predicting acceptable answers on Stack Overflow, drawn from 2014, 2015, and 2016. Our findings reveal significant differences in models' performance and quality given the type of features and the complexity of the models used. Researchers examining classifiers' performance and quality and features' complexity may leverage these findings when selecting suitable techniques for developing prediction models.
Keywords: feature selection, modeling and prediction, neural network, random forest, stack overflow
Procedia PDF Downloads 132
1620 Functional Decomposition Based Effort Estimation Model for Software-Intensive Systems
Authors: Nermin Sökmen
Abstract:
An effort estimation model is needed for software-intensive projects that consist of hardware, embedded software, or some combination of the two, as well as high-level software solutions. This paper first focuses on functional decomposition techniques for measuring the functional complexity of a computer system and investigates its impact on system development effort. It then examines the effects of technical difficulty and design team capability in order to construct the best effort estimation model. Using traditional regression analysis, the study develops a system development effort estimation model that takes functional complexity, technical difficulty, and design team capability as input parameters. Finally, the assumptions of the model are tested.
Keywords: functional complexity, functional decomposition, development effort, technical difficulty, design team capability, regression analysis
Procedia PDF Downloads 293
1619 Visual Analytics in K 12 Education: Emerging Dimensions of Complexity
Authors: Linnea Stenliden
Abstract:
The aim of this paper is to understand the learning conditions that emerge when visual analytics is implemented and used in K-12 education. To date, little attention has been paid to the role visual analytics (digital media and technology that highlight visual data communication in order to support analytical tasks) can play in education, and to the extent to which these tools can process actionable data for young students. This study was conducted in three public K-12 schools, in four social science classes with students aged 10 to 13 years, over a period of two to four weeks at each school. Empirical data were generated using video observations and analyzed with the help of Latour's metaphors. The learning conditions are found to be distinguished by broad complexity, characterized by four dimensions that emerge from the actors' deeply intertwined relations in the activities. In relation to these dimensions, the paper argues that novel approaches to teaching and learning could benefit students' knowledge building as they work with visual analytics, analyzing visualized data.
Keywords: analytical reasoning, complexity, data use, problem space, visual analytics, visual storytelling, translation
Procedia PDF Downloads 376
1618 Examining the Changes in Complexity, Accuracy, and Fluency in Japanese L2 Writing Over an Academic Semester
Authors: Robert Long
Abstract:
This study presents the results of a semester-long investigation of the evolution of complexity, accuracy, and fluency (CAF) in the compositions of Japanese L2 university students. One goal was to determine whether writing abilities improved over the academic term; another was to examine methods of editing. Participants had 30 minutes to write each essay, with an additional 10 minutes allotted for editing. For the editing phase, participants were divided into two groups: one utilized an online grammar checker, while the other self-edited their initial manuscripts. There was a total of 159 students from three different institutions. The research questions focused on determining whether CAF had evolved over the term, identifying potential variations in editing techniques, and describing the connections between the CAF dimensions. According to the findings, there was some improvement in accuracy (fewer errors in all three measures), whereas there was a marked decline in complexity and fluency. As for the second research aim, concerning the interaction among the three dimensions (CAF) and whether increases in fluency are offset by decreases in grammatical accuracy, results showed a logically high correlation between clauses and word counts, between mean length of T-unit (MLT) and coordinate phrases per T-unit (CP/T), and between MLT and clauses per T-unit (C/T); furthermore, word counts and the errors-per-100-words ratio correlated highly with error-free clause totals (EFCT). Syntactic complexity correlated negatively with EFCT, indicating that greater syntactic complexity is associated with decreased accuracy.
Concerning the difference in error correction between those who self-edited and those who used an online grammar correction tool, results indicated that the error-free clause ratio (EFCR) showed the greatest difference in accuracy, with fewer errors noted for writers using the online grammar checker. As for differences between the first and second (edited) drafts regarding CAF, results indicated positive changes in accuracy, with the most significant change seen in complexity (CP/T and MLT) and relatively insignificant changes in fluency. Results also indicated significant differences among the three institutions, with Fujian University of Technology showing the most fluency and accuracy. These findings suggest that, to raise students' awareness of their overall writing development, teachers should support them in developing more complex syntactic structures, improving their fluency, and making more effective use of online grammar checkers.
Keywords: complexity, accuracy, fluency, writing
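The complexity and accuracy ratios named in the abstract (MLT, C/T, CP/T, and the error-free clause ratio) are simple quotients of per-essay counts; the sketch below uses invented counts, not the study's data.

```python
# Illustrative computation of common CAF (complexity/accuracy) ratios from
# per-essay counts. The counts below are made-up example values.

def caf_ratios(words, clauses, t_units, coord_phrases, error_free_clauses):
    mlt = words / t_units                 # mean length of T-unit (MLT)
    c_per_t = clauses / t_units           # clauses per T-unit (C/T)
    cp_per_t = coord_phrases / t_units    # coordinate phrases per T-unit (CP/T)
    efcr = error_free_clauses / clauses   # error-free clause ratio (EFCR)
    return mlt, c_per_t, cp_per_t, efcr

mlt, ct, cpt, efcr = caf_ratios(words=240, clauses=30, t_units=20,
                                coord_phrases=8, error_free_clauses=24)
print(mlt, ct, cpt, efcr)
```

Because MLT and C/T share the T-unit denominator and word counts drive MLT, the high correlations the study reports among these measures are partly built into the definitions.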
Procedia PDF Downloads 39
1617 MASCOT: Design and Development of an Interactive Self-Evaluation Tool for Students’ Thinking Complexity
Abstract:
‘In Dialogue with Humanity’ and ‘In Dialogue with Nature’ are two compulsory General Education Foundation (GEF) courses for all undergraduates at the Chinese University of Hong Kong (CUHK). These courses aim to enrich students’ intellectual pursuits and enhance their thinking capabilities through classic readings. To better understand and evaluate students’ thinking habits and abilities, GEF introduced Narrative Qualitative Analysis (NQA) in 2014 and has continued the study since then. Through the NQA study, a two-way evaluation scheme has been developed, comprising both student self-evaluation and teacher evaluation. This study first introduces the theoretical background and research framework of the NQA study and then focuses on student self-evaluation. An interactive online application, MASCOT, has been developed to facilitate students’ self-evaluation of their own thinking complexity. In this presentation, the design and development of MASCOT are explained, and the main results of applying it in classroom teaching are reported. An obvious discrepancy has been observed between students’ self-evaluations and teachers’ evaluations.
Keywords: narrative qualitative analysis, thinking complexity, student self-evaluation, interactive online application
Procedia PDF Downloads 47
1616 Reliability of Self-Reported Language Proficiency Measures in L1 Attrition Research: A Closer Look at the Can-Do-Scales
Authors: Anastasia Sorokina
Abstract:
Self-reported language proficiency measures have been widely used by researchers and have proven to be an accurate tool for assessing actual language proficiency. L1 attrition researchers also rely on self-reported measures; more specifically, can-do scales have gained popularity in the discipline of L1 attrition research. Can-do scales usually contain statements about language (e.g., “I can write e-mails”); participants are asked to rate each statement on a scale from 1 (I cannot do it at all) to 5 (I can do it without any difficulties). Despite their popularity, no studies have examined the reliability of can-do scales at measuring the actual level of L1 attrition. Do can-do scales positively correlate with lexical diversity, syntactic complexity, and fluency? The present study analyzed speech samples of 35 Russian-English attriters to examine whether their self-reported proficiency correlates with their actual L1 proficiency. The results of a Pearson correlation demonstrated that can-do scales correlated with lexical diversity, syntactic complexity, and fluency. These findings provide a valuable contribution to L1 attrition research by demonstrating that can-do scales can be used as a reliable tool to measure L1 attrition.
Keywords: L1 attrition, can-do scales, lexical diversity, syntactic complexity
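The reliability check reduces to computing Pearson's r between self-ratings and each objective measure; the sketch below implements the textbook formula with invented numbers, not the study's data.

```python
# Pearson correlation between self-reported can-do ratings and an objective
# lexical-diversity score. All numbers are invented illustrations.
import math

def pearson_r(x, y):
    # Textbook Pearson product-moment correlation coefficient.
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

can_do = [2, 3, 3, 4, 5]                            # 1-5 self-ratings
lexical_diversity = [0.41, 0.48, 0.52, 0.60, 0.71]  # e.g. a type-token measure
print(round(pearson_r(can_do, lexical_diversity), 3))
```

A value near +1 would support the scales' validity for that measure; in practice one would also report a significance test, which a library routine such as SciPy's pearsonr provides alongside r.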
Procedia PDF Downloads 245
1615 Nonlinear Analysis in Investigating the Complexity of Neurophysiological Data during Reflex Behavior
Authors: Juliana A. Knocikova
Abstract:
Methods of nonlinear signal analysis are based on the finding that random behavior can arise in deterministic nonlinear systems with only a few degrees of freedom. In dynamical systems, entropy is usually understood as a rate of information production. Changes in the temporal dynamics of physiological data indicate the evolution of the system in time, and thus the rate at which new signal patterns are generated. During the last decades, many algorithms have been introduced to assess patterns of physiological responses to external stimuli. However, reflex responses are usually characterized by short periods of time, which greatly limits the usual methods of nonlinear analysis. To solve the problems of short recordings, the approximate entropy parameter was introduced as a measure of system complexity. A low value of this parameter reflects regularity and predictability in the analyzed time series, while an increasing value means unpredictability and random behavior, hence higher system complexity. Reduced complexity of neurophysiological data has been observed repeatedly when analyzing electroneurogram and electromyogram activities during defence reflex responses. Quantitative phrenic neurogram changes are also evident during severe hypoxia, as well as during airway reflex episodes. In conclusion, the approximate entropy parameter serves as a convenient tool for the analysis of reflex behavior characterized by short-lasting time series.
Keywords: approximate entropy, neurophysiological data, nonlinear dynamics, reflex
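Approximate entropy is a standard, well-documented algorithm (in the Pincus style), so it can be sketched directly: a strictly periodic series should yield a value near zero, reflecting the regularity the abstract associates with reduced complexity.

```python
import math

def approximate_entropy(series, m=2, r=0.2):
    # Approximate entropy ApEn(m, r) of a short time series: low values
    # indicate regularity and predictability, higher values indicate
    # unpredictability and hence higher system complexity.
    n = len(series)

    def phi(m):
        templates = [series[i:i + m] for i in range(n - m + 1)]
        total = 0.0
        for t1 in templates:
            # Count templates within Chebyshev distance r (self-match included).
            c = sum(1 for t2 in templates
                    if max(abs(a - b) for a, b in zip(t1, t2)) <= r)
            total += math.log(c / len(templates))
        return total / len(templates)

    return phi(m) - phi(m + 1)

regular = [0, 1] * 50   # strictly periodic series: ApEn close to zero
print(round(approximate_entropy(regular, m=2, r=0.2), 3))
```

The self-match in the count is what makes the estimator usable on the short recordings typical of reflex episodes, which is exactly the property the abstract highlights.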
Procedia PDF Downloads 300
1614 Fast and Efficient Algorithms for Evaluating Uniform and Nonuniform Lagrange and Newton Curves
Authors: Taweechai Nuntawisuttiwong, Natasha Dejdumrong
Abstract:
Newton-Lagrange interpolations are widely used in numerical analysis; however, they require quadratic computational time for their construction. In computer-aided geometric design (CAGD), there are polynomial curves (Wang-Ball, DP, and Dejdumrong curves) that have linear-time complexity algorithms. Thus, the computational time for Newton-Lagrange interpolation can be reduced by applying the algorithms of Wang-Ball, DP, and Dejdumrong curves. To use these algorithms, it is first necessary to convert Newton-Lagrange polynomials into Wang-Ball, DP, or Dejdumrong polynomials. In this work, algorithms for converting both uniform and non-uniform Newton-Lagrange polynomials into Wang-Ball, DP, and Dejdumrong polynomials are investigated, so that the computational time for representing Newton-Lagrange polynomials can be reduced to linear complexity. In addition, other CAGD-based operations for modifying Newton-Lagrange curves become available.
Keywords: Lagrange interpolation, linear complexity, monomial matrix, Newton interpolation
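The cost split the abstract targets can be illustrated with standard Newton interpolation: building the divided-difference table is quadratic, but evaluating the resulting Newton form is linear per point via Horner-style nesting. This sketch shows only that standard construction/evaluation split, not the Wang-Ball, DP, or Dejdumrong conversions themselves.

```python
# Standard Newton interpolation: O(n^2) divided-difference construction,
# O(n) Horner-style evaluation of the resulting Newton form.

def divided_differences(xs, ys):
    # Returns the Newton coefficients c[0..n-1] (in-place table update).
    coef = list(ys)
    n = len(xs)
    for j in range(1, n):
        for i in range(n - 1, j - 1, -1):
            coef[i] = (coef[i] - coef[i - 1]) / (xs[i] - xs[i - j])
    return coef

def newton_eval(coef, xs, x):
    # Horner-style nested evaluation: linear in the number of points.
    result = coef[-1]
    for i in range(len(coef) - 2, -1, -1):
        result = result * (x - xs[i]) + coef[i]
    return result

xs = [0.0, 1.0, 2.0, 3.0]
ys = [1.0, 2.0, 5.0, 10.0]      # samples of f(x) = x^2 + 1
c = divided_differences(xs, ys)
print(newton_eval(c, xs, 1.5))  # 3.25 = 1.5^2 + 1
```

The conversions the paper studies go one step further: once the curve is expressed in a Wang-Ball, DP, or Dejdumrong basis, the whole construction, not just the evaluation, admits linear-time algorithms.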
Procedia PDF Downloads 234
1613 A Less Complexity Deep Learning Method for Drones Detection
Authors: Mohamad Kassab, Amal El Fallah Seghrouchni, Frederic Barbaresco, Raed Abu Zitar
Abstract:
Detecting objects such as drones is a challenging task, as their relative size and maneuvering capabilities deceive machine learning models and cause them to misclassify drones as birds or other objects. In this work, we investigate applying several deep learning techniques to benchmark real data sets of flying drones. A deep learning paradigm is proposed for the purpose of mitigating the complexity of those systems: a hybrid between the AdderNet deep learning paradigm and the Single Shot Detector (SSD) paradigm. The goal was to minimize the number of multiplication operations in the filtering layers of the proposed system and, hence, reduce complexity. Standard machine learning techniques, such as SVM, are also tested and compared to the deep learning systems. The data sets used for training and testing were either complete or filtered to remove images with small objects, and the data were either RGB or IR. Comparisons were made between all these types, and conclusions are presented.
Keywords: drones detection, deep learning, birds versus drones, precision of detection, AdderNet
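The multiplication-free idea borrowed from AdderNet can be sketched in isolation: a filter responds with the negative L1 distance between an input patch and its weights, so the layer needs only additions, subtractions, and absolute values. Shapes and values below are illustrative, not the paper's architecture.

```python
# Core AdderNet idea in miniature: a filter's response is the negative L1
# distance between the input patch and the filter weights, so no
# multiplications are needed in the filtering step.
import numpy as np

def adder_response(patch: np.ndarray, weights: np.ndarray) -> float:
    # -||patch - weights||_1 : maximal (zero) when the patch matches the filter.
    return -np.abs(patch - weights).sum()

weights = np.array([[1.0, 0.0], [0.0, 1.0]])
matching_patch = np.array([[1.0, 0.0], [0.0, 1.0]])   # identical to the filter
other_patch = np.array([[0.0, 1.0], [1.0, 0.0]])      # opposite pattern
print(adder_response(matching_patch, weights))
print(adder_response(other_patch, weights))
```

In the hybrid the abstract proposes, responses of this form would stand in for the convolution outputs feeding the SSD detection head, which is where the multiplication savings accumulate.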
Procedia PDF Downloads 181
1612 Towards a Simulation Model to Ensure the Availability of Machines in Maintenance Activities
Authors: Maryam Gallab, Hafida Bouloiz, Youness Chater, Mohamed Tkiouat
Abstract:
The aim of this paper is to present a model based on multi-agent systems in order to manage maintenance activities and to ensure the reliability and availability of machines with only the required resources (operators, tools). The interest of simulation is to cope with the complexity of the system and to obtain results without cost or wasted time. An implementation of the model is carried out on the AnyLogic platform to display the defined performance indicators. Keywords: maintenance, complexity, simulation, multi-agent systems, AnyLogic platform
Procedia PDF Downloads 305
1611 A Time-Reducible Approach to Compute Determinant |I-X|
Authors: Wang Xingbo
Abstract:
Computation of determinants of the form |I-X| is primary and fundamental because it can help to compute many other determinants. This article puts forward a time-reducible approach to computing the determinant |I-X|. The approach is derived from Newton's identities, and its time complexity is no more than that of computing the eigenvalues of the square matrix X. Mathematical deductions and a numerical example are presented in detail. By comparison with classical approaches, the new approach is shown to be superior, and its computational time naturally decreases as the efficiency of computing the eigenvalues of the square matrix improves. Keywords: algorithm, determinant, computation, eigenvalue, time complexity
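A plausible reconstruction of the Newton's-identity route (an illustrative sketch from the abstract's description, not the author's exact scheme): since det(I - X) = prod(1 - lambda_i), the elementary symmetric polynomials e_k of the eigenvalues can be recovered from the power sums p_k = tr(X^k) via Newton's identities, k*e_k = sum_{i=1..k} (-1)^(i-1) e_{k-i} p_i, with no explicit eigenvalue factorization.

```python
import numpy as np

# Sketch: det(I - X) from traces of powers via Newton's identities.
# Illustrative reconstruction, not the author's exact scheme.

def det_I_minus_X(X):
    n = X.shape[0]
    # power sums p_k = tr(X^k), k = 1..n
    p = []
    Xk = np.eye(n)
    for _ in range(n):
        Xk = Xk @ X
        p.append(np.trace(Xk))
    # Newton's identities: k*e_k = sum_{i=1..k} (-1)^(i-1) * e_{k-i} * p_i
    e = [1.0]
    for k in range(1, n + 1):
        s = sum((-1) ** (i - 1) * e[k - i] * p[i - 1] for i in range(1, k + 1))
        e.append(s / k)
    # det(I - X) = prod(1 - lambda_i) = sum_k (-1)^k e_k
    return sum((-1) ** k * e[k] for k in range(n + 1))

X = np.array([[0.5, 0.1], [0.2, 0.3]])
print(det_I_minus_X(X))  # agrees with np.linalg.det(np.eye(2) - X)
```

For a 2x2 matrix this reduces to the familiar det(I - X) = 1 - tr(X) + det(X).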
Procedia PDF Downloads 415
1610 The Influence of Grammatical Gender on Socially Constructed Gender in English, Dutch, and German
Authors: Noah Brandon
Abstract:
Grammatical gender can create a restrictive roadblock to the use of gender-inclusive language. This research describes the grammatical gender structures used in English, Dutch, and German and considers how these structures restrict the implementation of gender inclusivity in spoken and written discourse. This restriction is measured by the frequency with which gender-inclusive and generic masculine forms are used and by the morphosyntactic complexity of the gender-inclusive forms available in these languages. These languages form a continuum of grammatical gender structures, with English having the least articulated structures and German the most. This motivates a comparative analysis intended to establish a correlation between the complexity of a gender structure and the difficulty of using gender-inclusive forms. English, at one end of the continuum, maintains only remnants of a formal grammatical gender system and imposes the fewest restrictions on the creation of neo-pronouns and the use of gender-inclusive alternatives to gendered agentive nouns. Next, Dutch has a functionally two-gender system with less freedom in using gender-neutral forms. Lastly, German, at the other end, has a three-gender system requiring a plethora of morphosyntactic and orthographic alternatives to avoid the generic masculine. The paper argues that the complexity of grammatical gender structures correlates with hindered use of gender-inclusive forms. Going forward, efforts will focus on gathering further data on the usage of gender-inclusive and generic masculine forms within these languages. The end goal of this research is to establish a definitive objective correlation between grammatical gender complexity and impediments in expressing socially constructed gender. Keywords: sociolinguistics, language and gender, gender, Germanic linguistics, grammatical gender, German, Dutch, English
Procedia PDF Downloads 78
1609 The Revenue Management Implementation and Its Complexity in the Airline Industry: An Empirical Study on the Egyptian Airline Industry
Authors: Amr Sultan, Sara Elgazzar, Breksal Elmiligy
Abstract:
The airline industry is nowadays a growing industry facing severe competition. In this context, it is important to utilize the revenue management (RM) concept and practice in order to develop pricing strategy. There is a pressing need for RM to assist airlines and their associates in reducing costs and recovering revenue, which in turn will boost airline industry performance. The complexity of RM imposes enormous challenges on the airline industry. Several studies have addressed RM adoption in the airline industry, while there is limited work on implementing RM and its complexity in developing countries such as Egypt. This research presents a research schema for the implementation of RM in the Egyptian airline industry. The research aims at investigating and demonstrating the complexities faced when implementing RM in the airline industry, upon which the research provides a comprehensive understanding of how to overcome these complexities while adopting RM in the Egyptian airline industry. An empirical study was conducted on the Egyptian airline sector based on a sample of four airlines (Egyptair, Britishair, KLM, and Lufthansa). The empirical study used a mix of qualitative and quantitative approaches. First, in-depth interviews were carried out to analyze the status of the Egyptian airline sector and the main challenges faced by the airlines. Then, a structured survey of the three different parties of the airline industry (airlines, airfreight forwarders, and passengers) was conducted in order to investigate the main complexity factors from each party's point of view. Finally, a focus group was conducted to develop a best-practice framework for overcoming the complexities faced in RM adoption in the Egyptian airline industry.
The research provides an original contribution to knowledge by creating a framework to overcome the complexities and challenges of adopting RM in the airline industry generally and in the Egyptian airline industry particularly. The framework can be used as an RM tool to increase the effectiveness and efficiency of Egyptian airline industry performance. Keywords: revenue management, airline industry, revenue management complexity, Egyptian airline industry
Procedia PDF Downloads 403
1608 Energy Absorption Capacity of Aluminium Foam Manufactured by Kelvin Model Loaded Under Different Biaxial Combined Compression-Torsion Conditions
Authors: H. Solomon, A. Abdul-Latif, R. Baleh, I. Deiab, K. Khanafer
Abstract:
Aluminum foams have been developed and tested for their high energy absorption abilities in multifunctional applications. The aim of this research work was to investigate experimentally the effect of quasi-static biaxial loading complexity (combined compression-torsion) on the energy absorption capacity of a highly uniform, open-cell aluminum foam architecture manufactured on the Kelvin cell model. The two generated aluminum foams have porosities of 80% and 85%, with spherical pores 11 mm in diameter. These foams were tested by means of several square-section specimens. A patented rig called ACTP (Absorption par Compression-Torsion Plastique) was used to investigate the foam response under quasi-static complex loading paths having different torsional components (i.e., 0°, 37°, and 53°). The main mechanical responses of the aluminum foams were studied under simple, intermediate, and severe loading conditions. The key responses examined were the stress plateau and the energy absorption capacity of the two foams with respect to loading complexity. It was concluded that the higher the loading complexity and the higher the relative density, the greater the energy absorption capacity of the foam. The highest energy absorption was thus recorded under the most complicated loading path (i.e., biaxial-53°) for the denser foam (i.e., 80% porosity). Keywords: open-cell aluminum foams, biaxial loading complexity, foams porosity, energy absorption capacity, characterization
Procedia PDF Downloads 130
1607 Analysis of Cardiac Health Using Chaotic Theory
Authors: Chandra Mukherjee
Abstract:
The prevalent knowledge of biological systems is based on the standard scientific perception of natural equilibrium, determinism, and predictability. Recently, a rethinking of these concepts was presented, and a new scientific perspective has emerged that involves complexity theory together with deterministic chaos theory, nonlinear dynamics, and the theory of fractals. The unpredictability of chaotic processes will probably change our understanding of diseases and their management. Mathematically, chaos is defined as deterministic behavior with irregular patterns that obey equations critically dependent on initial conditions. Chaos theory is the branch of science concerned with nonlinear dynamics, fractals, bifurcations, periodic oscillations, and complexity. Recently, biomedical interest in this field has made these mathematical concepts available to medical researchers and practitioners. Any biological network system is considered to have a nominal state, recognized as a homeostatic state. In reality, the different physiological systems are not normally in a stable state of homeostatic balance; rather, they are in a dynamically stable state with chaotic behavior and complexity. Biological systems such as the heart rhythm and the brain's electrical activity are dynamical systems that can be classified as chaotic systems with sensitive dependence on initial conditions. In biological systems, the state of disease is characterized by a loss of complexity and chaotic behavior, and by the presence of pathological periodicity and regular behavior. The failure or collapse of nonlinear dynamics is an indication of disease rather than a characteristic of health. Keywords: HRV, HRVI, LF, HF, DII
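The sensitive dependence on initial conditions that defines chaos in the abstract above can be demonstrated with the textbook logistic map (an illustrative example, not taken from the paper): two orbits started a millionth apart diverge to order-one separation within a few dozen iterations.

```python
# Sketch (illustrative, not from the paper): sensitive dependence on initial
# conditions in the logistic map x_{n+1} = r * x_n * (1 - x_n) at r = 4,
# the textbook example of deterministic chaos.

def logistic_orbit(x0, r=4.0, steps=50):
    xs = [x0]
    for _ in range(steps):
        xs.append(r * xs[-1] * (1 - xs[-1]))
    return xs

a = logistic_orbit(0.300000)
b = logistic_orbit(0.300001)   # initial condition perturbed by 1e-6
gap = max(abs(x - y) for x, y in zip(a, b))
print(gap)                     # the orbits diverge to order-one separation
```

The same qualitative behavior, deterministic equations whose trajectories cannot be predicted far ahead, is what the chaotic-systems view attributes to heart-rate and EEG signals.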
Procedia PDF Downloads 425
1606 A Gradient Orientation Based Efficient Linear Interpolation Method
Authors: S. Khan, A. Khan, Abdul R. Soomrani, Raja F. Zafar, A. Waqas, G. Akbar
Abstract:
This paper proposes a low-complexity image interpolation method. Image interpolation is used to convert a low-dimension video/image to a high-dimension video/image. The objective of a good interpolation method is to upscale an image in such a way that it provides good edge preservation at very low complexity, so that real-time processing of video frames is possible. However, low-complexity methods tend to provide real-time interpolation at the cost of blurring, jagging, and other artifacts due to errors in slope calculation. Non-linear methods, on the other hand, provide better edge preservation but at the cost of high complexity, and hence they are far from achieving real-time interpolation. The proposed method is a linear method that uses gradient orientation for slope calculation, unlike conventional linear methods that use the contrast of nearby pixels. Prewitt edge detection is applied to separate uniform regions from edges. Simple line averaging is applied to unknown uniform regions, whereas unknown edge pixels are interpolated after calculating slopes from the gradient orientations of neighboring known edge pixels. As a post-processing step, a bilateral filter is applied to the interpolated edge regions in order to enhance the interpolated edges. Keywords: edge detection, gradient orientation, image upscaling, linear interpolation, slope tracing
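The Prewitt-based orientation step described above might look roughly like the following sketch (an assumed implementation for illustration; the slope-tracing and directional interpolation details are the authors' own): the two Prewitt kernels yield a gradient magnitude, used to separate edges from uniform regions, and an orientation angle, used as the interpolation slope at edge pixels.

```python
import numpy as np

# Sketch (assumed details, not the authors' code): Prewitt gradients give
# an edge magnitude and orientation at each pixel; the orientation is what
# a gradient-based linear interpolator would use as the slope along an edge.

PREWITT_X = np.array([[-1, 0, 1], [-1, 0, 1], [-1, 0, 1]], dtype=float)
PREWITT_Y = PREWITT_X.T

def prewitt_orientation(img):
    h, w = img.shape
    gx = np.zeros((h - 2, w - 2))
    gy = np.zeros((h - 2, w - 2))
    for i in range(h - 2):
        for j in range(w - 2):
            patch = img[i:i + 3, j:j + 3]
            gx[i, j] = (patch * PREWITT_X).sum()
            gy[i, j] = (patch * PREWITT_Y).sum()
    mag = np.hypot(gx, gy)          # large on edges, ~0 on uniform regions
    theta = np.arctan2(gy, gx)      # gradient orientation in radians
    return mag, theta

# A vertical step edge: the gradient points along +x, so theta is ~0 there
img = np.hstack([np.zeros((5, 3)), np.ones((5, 3))])
mag, theta = prewitt_orientation(img)
```

Thresholding `mag` would reproduce the paper's split: line averaging on the low-magnitude (uniform) pixels, orientation-guided interpolation on the rest.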
Procedia PDF Downloads 260
1605 Robust Numerical Scheme for Pricing American Options under Jump Diffusion Models
Authors: Salah Alrabeei, Mohammad Yousuf
Abstract:
The goal of option pricing theory is to help investors manage their money, enhance returns, and control their financial future by theoretically valuing their options. However, most option pricing models have no analytical solution. Furthermore, not all numerical methods are efficient for solving these models, because the models have non-smooth payoffs or discontinuous derivatives at the exercise price. In this paper, we solve American options under jump-diffusion models using efficient time-dependent numerical methods. Several techniques are integrated to overcome the computational complexity. The Fast Fourier Transform (FFT) algorithm is used as a matrix-vector multiplication solver, which reduces the complexity from O(M²) to O(M log M). A partial fraction decomposition technique is applied to the rational approximation schemes to overcome the complexity of inverting polynomials of matrices. The proposed method is easy to implement in serial or parallel versions. Numerical results are presented to demonstrate the accuracy and efficiency of the proposed method. Keywords: integral differential equations, jump-diffusion model, American options, rational approximation
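The O(M log M) matrix-vector product via FFT can be illustrated with a circulant operator (an assumption for illustration; the paper's discretized jump integral is presumably Toeplitz-like, which embeds in a circulant): a circulant matrix is diagonalized by the DFT, so multiplying by it reduces to two forward FFTs and one inverse FFT instead of an O(M²) dense product.

```python
import numpy as np

# Sketch: O(M log M) circulant matrix-vector product via the convolution
# theorem. Illustrative of the FFT solver idea, not the paper's exact
# discretized jump operator (assumed Toeplitz/circulant structure).

def circulant_matvec(c, v):
    """Multiply the circulant matrix with first column c by vector v."""
    return np.real(np.fft.ifft(np.fft.fft(c) * np.fft.fft(v)))

# verify against the dense O(M^2) product
M = 8
c = np.random.default_rng(0).standard_normal(M)
v = np.random.default_rng(1).standard_normal(M)
C = np.array([[c[(i - j) % M] for j in range(M)] for i in range(M)])
print(np.allclose(circulant_matvec(c, v), C @ v))  # True
```

A Toeplitz matrix of size M can be padded into a circulant of size 2M, so the same trick applies to the jump-integral term at each time step.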
Procedia PDF Downloads 118
1604 Analysis of Diabetes Patients Using Pearson, Cost Optimization, Control Chart Methods
Authors: Devatha Kalyan Kumar, R. Poovarasan
Abstract:
In this paper, we take certain important factors and health parameters of diabetes patients, especially children diabetic from birth (pediatric congenital), and, using the above three methods, assess the importance of each attribute in the dataset, thereby determining the attribute most highly responsible for and correlated with diabetes among young patients. We use the cost optimization, control chart, and Spearman methodologies for the real-time application of finding data efficiency in this diabetes dataset. The Spearman methodology is a correlation methodology used in the software development process to identify the complexity between the various modules of the software. Identifying the complexity is important because higher complexity means a higher chance of risk occurring in the software. With the use of a control chart, the mean, variance, and standard deviation of the data are calculated. With the use of a cost optimization model, we optimize the variables. Hence we choose the Spearman, control chart, and cost optimization methods to assess data efficiency in diabetes datasets. Keywords: correlation, congenital diabetics, linear relationship, monotonic function, ranking samples, pediatric
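The Spearman and control-chart computations named above can be sketched as follows, with made-up sample values standing in for the dataset's attributes (illustrative only; the attribute names and data are assumptions, not the paper's dataset).

```python
# Sketch (hypothetical sample data): Spearman rank correlation between two
# attributes, plus the control-chart statistics (mean, variance, standard
# deviation, 3-sigma limits) mentioned in the abstract.

def ranks(xs):
    order = sorted(range(len(xs)), key=lambda i: xs[i])
    r = [0.0] * len(xs)
    for rank, i in enumerate(order, start=1):
        r[i] = float(rank)
    return r

def spearman(xs, ys):
    # rho = 1 - 6 * sum(d_i^2) / (n * (n^2 - 1)); valid when there are no ties
    n = len(xs)
    d2 = sum((a - b) ** 2 for a, b in zip(ranks(xs), ranks(ys)))
    return 1 - 6 * d2 / (n * (n ** 2 - 1))

def control_limits(xs):
    n = len(xs)
    mean = sum(xs) / n
    var = sum((x - mean) ** 2 for x in xs) / n
    sd = var ** 0.5
    return mean, var, (mean - 3 * sd, mean + 3 * sd)  # 3-sigma chart limits

glucose = [90, 110, 130, 150, 170]   # hypothetical attribute values
hba1c = [5.0, 5.6, 6.4, 7.1, 8.0]    # hypothetical attribute values
print(spearman(glucose, hba1c))      # 1.0: a perfectly monotonic relationship
```

Ranking attribute pairs by |rho| is one simple way to surface the most strongly co-related attribute the abstract aims to identify.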
Procedia PDF Downloads 256