Search results for: process complexity
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 16074

15984 Towards a Simulation Model to Ensure the Availability of Machines in Maintenance Activities

Authors: Maryam Gallab, Hafida Bouloiz, Youness Chater, Mohamed Tkiouat

Abstract:

The aim of this paper is to present a model based on multi-agent systems to manage maintenance activities and to ensure the reliability and availability of machines using only the required resources (operators, tools). The value of simulation is that it handles the complexity of the system and produces results without incurring cost or wasting time. An implementation of the model is carried out on the AnyLogic platform to display the defined performance indicators.

Keywords: maintenance, complexity, simulation, multi-agent systems, AnyLogic platform

Procedia PDF Downloads 284
15983 A Time-Reducible Approach to Compute Determinant |I-X|

Authors: Wang Xingbo

Abstract:

Computation of determinants of the form |I-X| is primary and fundamental because it helps to compute many other determinants. This article puts forward a time-reducible approach to computing the determinant |I-X|. The approach is derived from Newton's identity, and its time complexity is no more than that of computing the eigenvalues of the square matrix X. Mathematical deductions and a numerical example are presented in detail. Comparison with classical approaches shows the new approach to be superior, and its computational time naturally decreases as the efficiency of computing the eigenvalues of the square matrix improves.
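
As a minimal numeric illustration of the underlying identity (not the paper's Newton-identity derivation), det(I - X) equals the product of (1 - λᵢ) over the eigenvalues λᵢ of X, so the cost is dominated by the eigenvalue computation. The sketch below assumes NumPy is available.

```python
import numpy as np

def det_i_minus_x(X):
    """Compute det(I - X) from the eigenvalues of X.

    det(I - X) = prod_i (1 - lambda_i), so the cost is dominated by the
    eigenvalue computation, as the abstract argues.
    """
    eigenvalues = np.linalg.eigvals(X)
    return np.prod(1.0 - eigenvalues)

# Sanity check against direct evaluation on a random matrix.
rng = np.random.default_rng(0)
X = rng.standard_normal((5, 5))
assert np.isclose(det_i_minus_x(X).real, np.linalg.det(np.eye(5) - X))
```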

Keywords: algorithm, determinant, computation, eigenvalue, time complexity

Procedia PDF Downloads 395
15982 The Influence of Grammatical Gender on Socially Constructed Gender in English, Dutch, and German

Authors: Noah Brandon

Abstract:

Grammatical gender can create a restrictive roadblock for the usage of gender-inclusive language. This research describes grammatical gender structures used in English, Dutch, and German and considers how these structures restrict the implementation of gender inclusivity in spoken and written discourse. This restriction is measured by the frequency with which gender-inclusive and generic masculine forms are used and by the morphosyntactic complexity of the gender-inclusive forms available in these languages. These languages form a continuum of grammatical gender structures, with English having the least articulated structures and German having the most. This leads to a comparative analysis intended to establish a correlation between the complexity of gender structure and the difficulty of using gender-inclusive forms. English, on one side of the continuum, maintains only remnants of a formal grammatical gender system and imposes the fewest restrictions on the creation of neo-pronouns and the use of gender-inclusive alternatives to gendered agentive nouns. Dutch, next on the continuum, has a functionally two-gender system with less freedom in the use of gender-neutral forms. Lastly, German, on the other end, has a three-gender system requiring a plethora of morphosyntactic and orthographic alternatives to avoid the generic masculine. The paper argues that the complexity of grammatical gender structures correlates with hindered use of gender-inclusive forms. Going forward, efforts will focus on gathering further data on the usage of gender-inclusive and generic masculine forms within these languages. The end goal of this research is to establish a definitive objective correlation between grammatical gender complexity and impediments in expressing socially constructed gender.

Keywords: sociolinguistics, language and gender, gender, Germanic linguistics, grammatical gender, German, Dutch, English

Procedia PDF Downloads 56
15981 The Revenue Management Implementation and Its Complexity in the Airline Industry: An Empirical Study on the Egyptian Airline Industry

Authors: Amr Sultan, Sara Elgazzar, Breksal Elmiligy

Abstract:

The airline industry is growing rapidly and faces severe competition. In this context, applying the revenue management (RM) concept and practice to develop pricing strategy is an influential issue. RM is essential to help airlines and their associates reduce costs and recover revenue, which in turn boosts airline industry performance. The complexity of RM imposes enormous challenges on the airline industry. Several studies have addressed RM adoption in the airline industry, but there is limited work on implementing RM and handling its complexity in developing countries such as Egypt. This research presents a schema for implementing RM in the Egyptian airline industry. The research aims at investigating and demonstrating the complexities faced when implementing RM in the airline industry, upon which it provides a comprehensive understanding of how to overcome these complexities while adopting RM in the Egyptian airline industry. An empirical study was conducted on the Egyptian airline sector based on a sample of four airlines (Egyptair, Britishair, KLM, and Lufthansa), using a mix of qualitative and quantitative approaches. First, in-depth interviews were carried out to analyze the status of the Egyptian airline sector and the main challenges faced by the airlines. Then, a structured survey of three different parties in the airline industry (airlines, airfreight forwarders, and passengers) was conducted to investigate the main complexity factors from each party's point of view. Finally, a focus group was conducted to develop a best-practice framework for overcoming the complexities faced in adopting RM in the Egyptian airline industry. The research provides an original contribution to knowledge by creating a framework to overcome the complexities and challenges of adopting RM in the airline industry generally and the Egyptian airline industry particularly. The framework can be used as an RM tool to increase the effectiveness and efficiency of the Egyptian airline industry's performance.

Keywords: revenue management, airline industry, revenue management complexity, Egyptian airline industry

Procedia PDF Downloads 370
15980 Applying Concurrent Development Process for the Web Using Aspect-Oriented Approach

Authors: Hiroaki Fukuda

Abstract:

This paper presents a concurrent development process for modern web applications, known as Rich Internet Applications (RIAs), and describes its effect using a non-trivial application development. In recent years, RIAs built with technologies such as Ajax and Flex have become popular, driven mainly by high-speed networks. An RIA provides sophisticated interfaces and user experiences; therefore, its development requires two kinds of engineer: a developer who implements business logic, and a designer who designs the interface and experience. Although collaborative work is becoming important for the development of RIAs, shared resources such as source code make it difficult. For example, if an interface design is modified after developers have finished implementing the business logic, they need to repeat the same implementations, as well as the tests that verify the application's behavior. MVC architecture and object-oriented programming (OOP) make it possible to divide an application into modules such as interfaces and logic; however, developers and/or designers still have to write pieces of code (e.g., event handlers) that make these modules work together as an application. Aspect-oriented programming (AOP), on the other hand, is expected to reduce the complexity of application software development. AOP provides methods to separate crosscutting concerns, which are scattered pieces of code, from primary concerns. In this paper, we provide a concurrent development process for RIAs by introducing the AOP concept. This process makes it possible to reduce the resources shared between developers and designers, so they can perform their tasks concurrently. In addition, we describe our experience developing a practical application using the proposed process to show its applicability.
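
The crosscutting-concern separation the abstract refers to can be sketched in a few lines. The example below uses a Python decorator as a stand-in for an aspect (the paper targets RIAs built with Flex/Ajax, so this is an illustrative analogy, not the proposed process); all names are invented.

```python
import functools

# A tiny "aspect": logging is a crosscutting concern woven around
# primary-concern functions without editing their bodies.
def log_calls(func):
    @functools.wraps(func)
    def advice(*args, **kwargs):          # "around" advice
        print(f"enter {func.__name__}")
        result = func(*args, **kwargs)
        print(f"exit  {func.__name__} -> {result!r}")
        return result
    return advice

@log_calls                                # join point declared, not hand-wired
def calculate_total(prices):              # primary concern: business logic only
    return sum(prices)

calculate_total([9.99, 4.50])
```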

Keywords: aspect-oriented programming, concurrent, development process, rich internet application

Procedia PDF Downloads 284
15979 Energy Absorption Capacity of Aluminium Foam Manufactured by Kelvin Model Loaded Under Different Biaxial Combined Compression-Torsion Conditions

Authors: H. Solomon, A. Abdul-Latif, R. Baleh, I. Deiab, K. Khanafer

Abstract:

Aluminum foams were developed and tested for multifunctional applications due to their high energy absorption abilities. The aim of this research work was to investigate experimentally the effect of quasi-static biaxial loading complexity (combined compression-torsion) on the energy absorption capacity of open-cell aluminum foam with a highly uniform architecture manufactured based on the Kelvin cell model. The two generated aluminum foams have porosities of 80% and 85%, with spherical pores 11 mm in diameter. These foams were tested by means of several square-section specimens. A patented rig called ACTP (Absorption par Compression-Torsion Plastique) was used to investigate the foam response under quasi-static complex loading paths having different torsional components (i.e., 0°, 37° and 53°). The main mechanical responses of the aluminum foams were studied under simple, intermediate and severe loading conditions. The key responses examined were the plateau stress and the energy absorption capacity of the two foams with respect to loading complexity. It was concluded that the higher the loading complexity and the higher the relative density, the greater the energy absorption capacity of the foam. The highest energy absorption was thus recorded under the most complicated loading path (i.e., biaxial-53°) for the denser foam (i.e., 80% porosity).
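
Energy absorption capacity per unit volume is commonly taken as the area under the compressive stress-strain curve. A minimal sketch of that calculation follows; the stress-strain values are placeholders, not the paper's measurements.

```python
import numpy as np

# Placeholder compressive stress-strain curve (NOT the paper's measurements):
# strain is dimensionless, stress in MPa, shaped like a foam plateau followed
# by the onset of densification.
strain = np.linspace(0.0, 0.5, 200)
stress = 2.0 + 0.5 * strain + 60.0 * strain**6

# Energy absorption capacity per unit volume = area under the curve
# (trapezoidal rule); 1 MPa x unit strain = 1 MJ/m^3.
energy = np.sum(0.5 * (stress[1:] + stress[:-1]) * np.diff(strain))
print(f"energy absorbed up to 50% strain: {energy:.2f} MJ/m^3")
```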

Keywords: open-cell aluminum foams, biaxial loading complexity, foams porosity, energy absorption capacity, characterization

Procedia PDF Downloads 99
15978 Analysis of Cardiac Health Using Chaotic Theory

Authors: Chandra Mukherjee

Abstract:

The prevalent knowledge of biological systems is based on the standard scientific perception of natural equilibrium, determinism and predictability. Recently, a rethinking of these concepts was presented, and a new scientific perspective emerged that involves complexity theory together with deterministic chaos theory, nonlinear dynamics and the theory of fractals. The unpredictability of chaotic processes is likely to change our understanding of diseases and their management. Mathematically, chaos is defined as deterministic behavior with irregular patterns that obey equations critically dependent on initial conditions. Chaos theory is the branch of science concerned with nonlinear dynamics, fractals, bifurcations, periodic oscillations and complexity. Recently, biomedical interest in this field has made these mathematical concepts available to medical researchers and practitioners. Any biological network system is considered to have a nominal state, recognized as a homeostatic state. In reality, the different physiological systems are not in a stable state of homeostatic balance under normal conditions, but in a dynamically stable state with chaotic behavior and complexity. Biological systems like heart rhythm and brain electrical activity are dynamical systems that can be classified as chaotic systems with sensitive dependence on initial conditions. In biological systems, the state of disease is characterized by a loss of complexity and chaotic behavior, and by the presence of pathological periodicity and regular behavior. The failure or collapse of nonlinear dynamics is thus an indication of disease rather than a characteristic of health.
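
The sensitive dependence on initial conditions mentioned above can be made concrete with the classic logistic map, a standard textbook chaotic system (not a cardiac model):

```python
# Sensitive dependence on initial conditions with the logistic map
# x_{n+1} = r * x_n * (1 - x_n) in its chaotic regime (r = 4).
def logistic_orbit(x0, r=4.0, steps=50):
    xs = [x0]
    for _ in range(steps):
        xs.append(r * xs[-1] * (1.0 - xs[-1]))
    return xs

a = logistic_orbit(0.200000)
b = logistic_orbit(0.200001)   # initial condition perturbed by 1e-6

for n in (0, 10, 20, 30, 40, 50):
    print(f"n={n:2d}  |a-b| = {abs(a[n] - b[n]):.6f}")
# The trajectories diverge to order-one differences within a few dozen steps.
```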

Keywords: HRV, HRVI, LF, HF, DII

Procedia PDF Downloads 399
15977 Automated Manual Handling Risk Assessments: Practitioner Experienced Determinants of Automated Risk Analysis and Reporting Being a Benefit or Distraction

Authors: S. Cowley, M. Lawrance, D. Bick, R. McCord

Abstract:

Technology that automates manual handling (musculoskeletal disorder, or MSD) risk assessments is increasingly available to ergonomists, engineers, and generalist health and safety practitioners alike. The risk assessment process is generally based on wearable motion sensors that capture information about worker movements for real-time or post-hoc analysis. Traditionally, MSD risk assessment is undertaken with the assistance of a checklist, such as that from the SafeWork Australia code of practice, with the expert assessor observing the task and ideally engaging with the worker in a discussion about the detail. Automation enables the non-expert to complete assessments and does not always require the assessor to be present. This clearly has cost and time benefits for the practitioner, but is it an improvement on assessment by a human? Human risk assessments draw on the knowledge and expertise of the assessor but, like all risk assessments, are highly subjective. The complexity of the checklists and models used in the process can be off-putting and can sometimes lead to the assessment becoming the focus and the end rather than a means to an end; the focus on risk control is lost. Automated risk assessment handles the complexity of the assessment for the assessor and delivers a simple risk score that enables decision-making about risk control. Being machine-based, it is objective and will deliver the same result each time it assesses an identical task. However, the WHS professional needs to know that this emergent technology asks the right questions and delivers the right answers, and whether it improves the risk assessment process and its results or simply distances the professional from the task and the worker. They need clarity as to whether automation of manual task risk analysis and reporting leads to risk control or to a loss of focus on the worker. Critically, they need evidence as to whether automation in this area of hazard management leads to better risk control or just a bigger collection of assessments. This study of practitioner-experienced determinants of automated manual task risk analysis and reporting being a benefit or a distraction addresses an understanding of emergent risk assessment technology, its use, and the considerations involved in deciding whether to adopt and apply these technologies.

Keywords: automated, manual-handling, risk-assessment, machine-based

Procedia PDF Downloads 99
15976 A Gradient Orientation Based Efficient Linear Interpolation Method

Authors: S. Khan, A. Khan, Abdul R. Soomrani, Raja F. Zafar, A. Waqas, G. Akbar

Abstract:

This paper proposes a low-complexity image interpolation method. Image interpolation is used to convert a low-resolution video/image to a high-resolution video/image. The objective of a good interpolation method is to upscale an image in a way that preserves edges well at very low complexity, so that real-time processing of video frames becomes possible. However, low-complexity methods tend to provide real-time interpolation at the cost of blurring, jagging and other artifacts caused by errors in slope calculation. Non-linear methods, on the other hand, provide better edge preservation, but at the cost of high complexity, and hence are far from achieving real-time interpolation. The proposed method is a linear method that uses gradient orientation for slope calculation, unlike conventional linear methods that use the contrast of nearby pixels. Prewitt edge detection is applied to separate uniform regions from edges. Simple line averaging is applied to unknown uniform regions, whereas unknown edge pixels are interpolated after calculating slopes from the gradient orientations of neighboring known edge pixels. As a post-processing step, a bilateral filter is applied to the interpolated edge regions in order to enhance the interpolated edges.
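
A rough sketch of the first stage only, Prewitt gradients, gradient orientation, and an edge/uniform mask, is given below; the directional interpolation and bilateral filtering steps are omitted, and the threshold value is illustrative rather than taken from the paper.

```python
import numpy as np
from scipy.ndimage import convolve

def prewitt_orientation(image, edge_threshold=30.0):
    """Stage-one sketch: Prewitt gradients, gradient orientation, and an
    edge/uniform mask. The threshold is illustrative, not from the paper."""
    img = image.astype(float)
    kx = np.array([[-1, 0, 1], [-1, 0, 1], [-1, 0, 1]], dtype=float)
    ky = kx.T
    gx = convolve(img, kx, mode="nearest")
    gy = convolve(img, ky, mode="nearest")
    magnitude = np.hypot(gx, gy)
    orientation = np.arctan2(gy, gx)          # slope direction for edge pixels
    edge_mask = magnitude > edge_threshold    # True: interpolate along orientation
    return orientation, edge_mask             # False: simple line averaging

# Toy image with a vertical edge.
img = np.zeros((8, 8)); img[:, 4:] = 255.0
orientation, edges = prewitt_orientation(img)
print(edges.astype(int))
```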

Keywords: edge detection, gradient orientation, image upscaling, linear interpolation, slope tracing

Procedia PDF Downloads 244
15975 A Study on Unix Process Crash Based on Efficient Process Management Method

Authors: Guo Haonan, Chen Peiyu, Zhao Hanyu, Burra Venkata Durga Kumar

Abstract:

Unix and Unix-like operating systems are widely used due to their high stability, but they are constrained by the parent-child process structure: because a child process depends on its parent process, the crash of a single process may cause the entire process group, or even the entire system, to fail. Another cause of unexpected process termination is a system administrator inadvertently closing the terminal or pseudo-terminal from which the application was launched, terminating the application process unexpectedly. This paper analyzes the reasons for these problems and proposes two solutions.
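
The abstract does not spell out its two solutions; a classic remedy for the terminal-closure case is to detach the process from its controlling terminal via a double fork and setsid, as sketched below (a generic Unix illustration, not necessarily the paper's proposal).

```python
import os
import sys
import time

def daemonize():
    """Classic double-fork: detach from the parent and the controlling
    terminal so closing the (pseudo-)terminal no longer kills the process.
    A generic illustration, not the paper's specific method."""
    if os.fork() > 0:          # first fork: original parent exits
        sys.exit(0)
    os.setsid()                # become session leader, drop controlling tty
    if os.fork() > 0:          # second fork: can never re-acquire a tty
        sys.exit(0)
    os.chdir("/")
    os.umask(0)

if __name__ == "__main__":
    daemonize()
    with open("/tmp/daemon_demo.log", "a") as log:
        log.write(f"daemon running as pid {os.getpid()}\n")
        time.sleep(5)
```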

Keywords: process management, daemon, login-bash and non-login bash, process group

Procedia PDF Downloads 113
15974 Robust Numerical Scheme for Pricing American Options under Jump Diffusion Models

Authors: Salah Alrabeei, Mohammad Yousuf

Abstract:

The goal of option pricing theory is to help investors manage their money, enhance returns and control their financial future by theoretically valuing their options. However, most option pricing models have no analytical solution. Furthermore, not all numerical methods solve these models efficiently, because the models have non-smooth payoffs or discontinuous derivatives at the exercise price. In this paper, we price American options under jump diffusion models using efficient time-dependent numerical methods. Several techniques are integrated to overcome the computational complexity. The Fast Fourier Transform (FFT) algorithm is used as a matrix-vector multiplication solver, which reduces the complexity from O(M²) to O(M log M). A partial fraction decomposition technique is applied to rational approximation schemes to overcome the complexity of inverting polynomials of matrices. The proposed method is easy to implement in serial or parallel versions. Numerical results are presented to demonstrate the accuracy and efficiency of the proposed method.
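
The O(M²)-to-O(M log M) reduction comes from FFT-based multiplication with structured matrices. A minimal circulant matrix-vector product, not the paper's full pricing scheme, illustrates the idea:

```python
import numpy as np

def circulant_matvec_fft(first_column, x):
    """y = C @ x for a circulant matrix C defined by its first column,
    computed in O(M log M) via the FFT instead of O(M^2)."""
    return np.real(np.fft.ifft(np.fft.fft(first_column) * np.fft.fft(x)))

# Check against the dense O(M^2) product.
rng = np.random.default_rng(1)
c = rng.standard_normal(8)
x = rng.standard_normal(8)
C = np.array([np.roll(c, k) for k in range(8)]).T   # dense circulant matrix
assert np.allclose(circulant_matvec_fft(c, x), C @ x)
```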

Keywords: integral differential equations, jump–diffusion model, American options, rational approximation

Procedia PDF Downloads 94
15973 Healthcare Big Data Analytics Using Hadoop

Authors: Chellammal Surianarayanan

Abstract:

The healthcare industry is generating large amounts of data driven by various needs such as record keeping, physicians' prescriptions, medical imaging, sensor data, Electronic Patient Records (EPR), laboratory, pharmacy, etc. Healthcare data is so big and complex that it cannot be managed by conventional hardware and software. The complexity of healthcare big data arises from the large volume of data, the velocity with which the data is accumulated, and its different varieties, such as structured, semi-structured and unstructured data. Despite this complexity, if the trends and patterns that exist within big data are uncovered and analyzed, higher-quality healthcare can be provided at lower cost. Hadoop is an open-source software framework for distributed processing of large data sets across clusters of commodity hardware using a simple programming model. The core components of Hadoop include the Hadoop Distributed File System, which offers a way to store large amounts of data across multiple machines, and MapReduce, which offers a way to process large data sets with a parallel, distributed algorithm on a cluster. The Hadoop ecosystem also includes various other tools such as Hive (a SQL-like query language), Pig (a higher-level query language for MapReduce), HBase (a columnar data store), etc. In this paper, an analysis is presented of how healthcare big data can be processed and analyzed using the Hadoop ecosystem.
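
A minimal Hadoop Streaming-style mapper/reducer pair (runnable as one Python script, usable as the -mapper and -reducer of a streaming job) illustrates the MapReduce model described above; the record layout and the task of counting cases per diagnosis are assumptions for illustration, not the paper's pipeline.

```python
#!/usr/bin/env python3
"""Hadoop Streaming-style sketch: count patient records per diagnosis.
Assumes tab-separated input lines of the form  patient_id<TAB>diagnosis<TAB>...
(the field layout is an assumption, not taken from the paper)."""
import sys

def mapper(lines):
    for line in lines:
        fields = line.rstrip("\n").split("\t")
        if len(fields) >= 2:
            print(f"{fields[1]}\t1")          # emit (diagnosis, 1)

def reducer(lines):
    current, total = None, 0
    for line in lines:                         # streaming input arrives sorted by key
        key, value = line.rstrip("\n").split("\t")
        if key != current:
            if current is not None:
                print(f"{current}\t{total}")
            current, total = key, 0
        total += int(value)
    if current is not None:
        print(f"{current}\t{total}")

if __name__ == "__main__":
    (mapper if sys.argv[1:] == ["map"] else reducer)(sys.stdin)
```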

Keywords: big data analytics, Hadoop, healthcare data, towards quality healthcare

Procedia PDF Downloads 385
15972 The Design and Implementation of an Enhanced 2D Mesh Switch

Authors: Manel Langar, Riad Bourguiba, Jaouhar Mouine

Abstract:

In this paper, we propose the design and implementation of an enhanced wormhole virtual-channel on-chip router. It is the heart of a mesh NoC using the XY deterministic routing algorithm. It is characterized by a simple virtual channel allocation strategy that reduces the area and the complexity of connections without affecting performance. We implemented our router on a Tezzaron process to validate its performance. This router is a basic element that will later be used to design a 3D mesh NoC.
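
The XY deterministic routing rule the router implements can be stated in a few lines; the sketch below is a behavioural illustration, not the RTL design.

```python
def xy_route(src, dst):
    """XY deterministic routing on a 2D mesh NoC: travel along X until the
    destination column is reached, then along Y. Returns the port sequence."""
    x, y = src
    dst_x, dst_y = dst
    ports = []
    while x != dst_x:                      # X dimension first
        step = 1 if dst_x > x else -1
        ports.append("EAST" if step == 1 else "WEST")
        x += step
    while y != dst_y:                      # then Y dimension
        step = 1 if dst_y > y else -1
        ports.append("NORTH" if step == 1 else "SOUTH")
        y += step
    return ports

print(xy_route((0, 0), (2, 3)))   # ['EAST', 'EAST', 'NORTH', 'NORTH', 'NORTH']
```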

Keywords: NoC, mesh, router, 3D NoC

Procedia PDF Downloads 538
15971 Metropolitan Governance in Statutory Plan Making Process

Authors: Vibhore Bakshi

Abstract:

This research paper is a step towards understanding the role of governance in the plan preparation process. It addresses the complexities of the peri-urban, historical constructions, the politics and policies of sustainability, and legislative frameworks. The paper reflects on the Delhi NCT as a classical case that has witnessed different structural changes in the master plan around 1981, 2001, 2021, and the Proposed Draft 2041. Landsat imagery of Delhi for 1989 and 2018 shows an increase in built-up areas around the periphery of the NCT. This peri-urbanization is a result of increasing in-migration to the peri-urban areas of Delhi. The built-up extraction for the years 1981, 1991, 2001, 2011, and 2018 highlights growing peri-urbanization on scarce land; it therefore becomes equally important to research the history of the land and its legislative measures. It is instructive to trace the changes that have occurred in Delhi's land in accordance with the different master plans and land legislative policies. The master planning process in Delhi has experienced many complexities compared with other metropolitan regions of the world. The paper identifies the shortcomings of the current master planning approach with regard to the stage of the planning process, the traditional planning approach, and lagging ICT-based interventions. Metropolitan governance systems across the globe and in India show diversity in organizational setup and a varied distribution of functions.

Keywords: governance, land provisions, built-up areas, in migration, built up extraction, master planning process, legislative policies, metropolitan governance systems

Procedia PDF Downloads 154
15970 Low Complexity Carrier Frequency Offset Estimation for Cooperative Orthogonal Frequency Division Multiplexing Communication Systems without Cyclic Prefix

Authors: Tsui-Tsai Lin

Abstract:

Cooperative orthogonal frequency division multiplexing (OFDM) transmission, which offers better connectivity, expanded coverage, and resistance to frequency-selective fading, has become a powerful solution for the physical layer in wireless communications. However, such a hybrid scheme suffers from the carrier frequency offset (CFO) effects inherited from OFDM-based systems, which lead to a significant degradation in performance. In addition, inserting a cyclic prefix (CP) at the head of each symbol block to combat inter-symbol interference reduces spectral efficiency. The design of CFO estimation for cooperative OFDM systems without a CP remains an open problem. This motivates us to develop a low-complexity CFO estimator for the cooperative OFDM decode-and-forward (DF) communication system without a CP over a multipath fading channel. Specifically, using a block-type pilot, the CFO estimate is first derived according to the least-squares criterion. Reliable performance can be obtained through an exhaustive two-dimensional (2D) search, at the penalty of heavy computational complexity. As a remedy, an alternative solution realized with an iterative approach is proposed for the CFO estimation. In contrast to the 2D-search estimator, the iterative method enjoys substantially reduced implementation complexity without sacrificing estimation performance. Computer simulations are presented to demonstrate the efficacy of the proposed CFO estimation.

Keywords: cooperative transmission, orthogonal frequency division multiplexing (OFDM), carrier frequency offset, iteration

Procedia PDF Downloads 249
15969 Parametric Design as an Approach to Respond to Complexity

Authors: Sepideh Jabbari Behnam, Zahrasadat Saide Zarabadi

Abstract:

A city is a texture intertwined from the relationships of its different components within a unified whole, so designing and planning such a complex entity is not an easy matter. Given that a city is a complex system with countless components and interactions, providing flexible layouts that can respond to the unpredictable character of the city, a result of its complexity, is unavoidable. The parametric design approach, as a new approach, can produce flexible and transformable layouts at any stage of design. This study aimed to introduce parametric design as a modern approach for responding to complex urban issues, using descriptive and analytical methods. The paper first introduces complex systems and then gives a brief characterization of them. Design and layout flexibility, another consideration in responding to and simulating complex urban systems, is also discussed. In this regard, after describing the nature of the parametric approach as a flexible approach, as well as a tool and an appropriate way to respond to features such as limited predictability, reciprocal behavior, complex interactions, sensitivity to initial conditions, and hierarchy, this paper introduces parametric design.

Keywords: complexity theory, complex system, flexibility, parametric design

Procedia PDF Downloads 342
15968 Modified Model-Based Systems Engineering Driven Approach for Defining Complex Energy Systems

Authors: Akshay S. Dalvi, Hazim El-Mounayri

Abstract:

The internal and external interactions between the complex structural and behavioral characteristics of a complex energy system result in unpredictable emergent behaviors. These emergent behaviors are not well understood, especially when modeled using the traditional top-down systems engineering approach. The intrinsic nature of current complex energy systems calls for an elegant solution that provides an integrated framework in Model-Based Systems Engineering (MBSE). This paper presents an MBSE-driven approach to defining and handling the complexity that arises from emergent behaviors. The approach provides guidelines for developing a system architecture that helps predict the complexity index of the system at different levels of abstraction. A framework that integrates indefinite and definite modeling aspects is developed to determine the complexity that arises during the development phase of the system. This framework provides a workflow for modeling complex systems using the Systems Modeling Language (SysML) that captures the system's requirements, behavior, structure, and analytical aspects at both the problem-definition and solution levels. A system architecture for a district cooling plant is presented, which demonstrates the ability to predict the complexity index. The result suggests that complex energy systems like district cooling plants can be defined in an elegant manner using the unconventional modified MBSE-driven approach, which helps in estimating development time and cost.

Keywords: district cooling plant, energy systems, framework, MBSE

Procedia PDF Downloads 112
15967 Improving Student Programming Skills in Introductory Computer and Data Science Courses Using Generative AI

Authors: Genady Grabarnik, Serge Yaskolko

Abstract:

Generative Artificial Intelligence (AI) has significantly expanded its applicability with the incorporation of Large Language Models (LLMs) and has become a technology that promises to automate areas that were previously very difficult to automate. This paper describes the introduction of generative AI into introductory computer and data science courses and analyzes the effect of that introduction. Generative AI is incorporated into the educational process in two ways. For instructors, we create prompt templates for generating tasks and for grading students' work, including feedback on submitted assignments. For students, we introduce basic prompt engineering, which is then used to generate test cases from problem descriptions, to generate code snippets for single-block programming tasks, and to partition average-complexity programs into such blocks. The classes are run using Large Language Models, and feedback from instructors and students as well as course outcomes are collected. The analysis shows a statistically significant positive effect and a preference for the approach among both stakeholder groups.

Keywords: introductory computer and data science education, generative AI, large language models, application of LLMS to computer and data science education

Procedia PDF Downloads 40
15966 2D Hexagonal Cellular Automata: The Complexity of Forms

Authors: Vural Erdogan

Abstract:

We created two-dimensional hexagonal cellular automata to obtain complexity using simple rules, much like Conway's Game of Life. Considering the Game of Life rules, Wolfram's work on life-like structures, and John von Neumann's self-replication, self-maintenance and self-reproduction problems, we developed 2-state and 3-state hexagonal growing algorithms that reach large populations from random initial states. Unlike the Game of Life, we used a six-neighbour cellular automaton instead of eight or four neighbours. The first simulations examined whether we were able to obtain oscillators, blinkers, and gliders. Inspired by the complexity and life-like structures of Wolfram's 1D cellular automata, we simulated 2D synchronous, discrete, deterministic cellular automata to reach life-like forms with 2-state cells. We explain how the life-like formations and oscillators contribute to initiating self-maintenance together with self-reproduction and self-replication. After comparing simulation results, we developed the algorithm one step further: appending a new state to the same algorithm used to reach life-like structures led us to experiment with new branching and fractal forms. All of these studies attempt to demonstrate that complex life-like forms can arise from uncomplicated rules.
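
A minimal two-state hexagonal update step can be written over an axial-coordinate grid with six neighbour offsets; the birth/survival thresholds below are placeholders, since the abstract does not give the paper's exact rules.

```python
import numpy as np

# Axial-coordinate hexagonal neighbourhood: six neighbours per cell.
HEX_NEIGHBOURS = [(1, 0), (-1, 0), (0, 1), (0, -1), (1, -1), (-1, 1)]

def hex_step(grid, birth=(2,), survive=(2, 3)):
    """One synchronous update of a 2-state hexagonal CA stored on an axial
    grid with wrap-around. Birth/survival sets are illustrative placeholders."""
    counts = np.zeros_like(grid)
    for dq, dr in HEX_NEIGHBOURS:
        counts += np.roll(np.roll(grid, dq, axis=0), dr, axis=1)
    born = (grid == 0) & np.isin(counts, birth)
    keeps = (grid == 1) & np.isin(counts, survive)
    return (born | keeps).astype(grid.dtype)

rng = np.random.default_rng(42)
grid = (rng.random((16, 16)) < 0.3).astype(int)   # random initial state
for _ in range(10):
    grid = hex_step(grid)
print("living cells after 10 steps:", int(grid.sum()))
```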

Keywords: hexagonal cellular automata, self-replication, self-reproduction, self-maintenance

Procedia PDF Downloads 130
15965 Mining Diagnostic Investigation Process

Authors: Sohail Imran, Tariq Mahmood

Abstract:

In complex healthcare diagnostic investigation processes, medical practitioners have to focus on ways to standardize their processes in order to deliver high-quality care and optimize time and costs. Process mining techniques can be applied to extract process-related knowledge from data without considering causal and dynamic dependencies in the business domain and its processes. The application of process mining is effective in diagnostic investigation, and it is particularly helpful where no dispositive evidence favors a given treatment. In this paper, we apply process mining to discover the important process flows of diagnostic investigation for hepatitis patients. This approach has benefits that can enhance the quality and efficiency of diagnostic investigation processes.
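
Process discovery commonly starts from a directly-follows graph over an event log; the sketch below uses an invented toy hepatitis-investigation log, not the paper's dataset or mining algorithm.

```python
from collections import Counter

# Toy event log: one trace (ordered list of activities) per patient case.
# Activity names are invented placeholders, not the paper's data.
event_log = [
    ["registration", "blood_test", "ultrasound", "diagnosis"],
    ["registration", "blood_test", "biopsy", "diagnosis"],
    ["registration", "ultrasound", "blood_test", "diagnosis"],
]

def directly_follows(log):
    """Count how often activity a is immediately followed by activity b."""
    pairs = Counter()
    for trace in log:
        pairs.update(zip(trace, trace[1:]))
    return pairs

for (a, b), n in directly_follows(event_log).most_common():
    print(f"{a} -> {b}: {n}")
```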

Keywords: process mining, healthcare, diagnostic investigation process, process flow

Procedia PDF Downloads 498
15964 The Influence of Design Complexity of a Building Structure on the Expected Performance

Authors: Ormal Lishi

Abstract:

This research presents a computationally efficient probabilistic method to assess the performance of compartmentation walls with similar Fire Resistance Levels (FRL) but varying complexity. Specifically, a masonry brick wall and a light-steel framed (LSF) wall with comparable insulation performance are analyzed. A Monte Carlo technique, employing Latin Hypercube Sampling (LHS), is utilized to quantify uncertainties and determine the probability of failure for both walls exposed to standard and parametric fires, following ISO 834 and Eurocodes guidelines. Results show that the probability of failure for the brick masonry wall under standard fire exposure is estimated at 4.8%, while the LSF wall is 7.6%. These probabilities decrease to 0.4% and 4.8%, respectively, when subjected to parametric fires. Notably, the complex LSF wall exhibits higher variability in predicting time to failure for specific criteria compared to the less complex brick wall, especially at higher temperatures. The proposed approach highlights the need for Probabilistic Risk Assessment (PRA) to accurately evaluate the reliability and safety levels of complex designs.
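
A minimal Latin Hypercube Sampling routine with a placeholder limit-state comparison illustrates the Monte Carlo set-up described above; the distributions and numbers are illustrative only, not the walls' actual thermal models.

```python
import numpy as np
from scipy.stats import norm

def latin_hypercube(n_samples, n_dims, rng):
    """Latin Hypercube Sampling on [0,1): each dimension is split into
    n_samples equal strata, sampled once per stratum, then shuffled."""
    u = (rng.random((n_samples, n_dims)) + np.arange(n_samples)[:, None]) / n_samples
    for d in range(n_dims):
        rng.shuffle(u[:, d])
    return u

rng = np.random.default_rng(0)
n = 10_000
u = latin_hypercube(n, 2, rng)

# Placeholder limit state (illustrative numbers, not the paper's wall models):
# failure when fire demand exceeds wall resistance, both in minutes.
resistance = norm.ppf(u[:, 0], loc=95.0, scale=8.0)    # wall time-to-failure
severity = norm.ppf(u[:, 1], loc=75.0, scale=12.0)     # fire demand
p_fail = np.mean(severity > resistance)
print(f"estimated probability of failure: {p_fail:.3%}")
```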

Keywords: design complexity, probability of failure, Monte Carlo analysis, compartmentation walls, insulation

Procedia PDF Downloads 42
15963 Enhanced Face Recognition with Daisy Descriptors Using 1BT Based Registration

Authors: Sevil Igit, Merve Meric, Sarp Erturk

Abstract:

In this paper, it is proposed to improve Daisy-descriptor-based face recognition using a novel One-Bit Transform (1BT) based pre-registration approach. The 1BT-based pre-registration procedure is fast and has low computational complexity. It is shown that face recognition accuracy is improved with the proposed approach. The proposed approach facilitates highly accurate face recognition using the Daisy descriptor with simple matching, and thereby provides a low-complexity solution.
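
The One-Bit Transform is typically computed by comparing each pixel with a filtered version of the image, after which registration minimises the Hamming distance between the binary images over candidate shifts. A simplified sketch follows (a box filter stands in for the published multi-band-pass kernel, and only integer translations are searched).

```python
import numpy as np
from scipy.ndimage import uniform_filter

def one_bit_transform(image, kernel_size=17):
    """Simplified 1BT: 1 where the pixel exceeds its local mean, else 0.
    (The published 1BT uses a specific multi-band-pass kernel; a box filter
    is used here only to keep the sketch short.)"""
    local_mean = uniform_filter(image.astype(float), size=kernel_size)
    return (image > local_mean).astype(np.uint8)

def register_translation_1bt(ref, moving, max_shift=4):
    """Pick the integer (dy, dx) shift minimising the Hamming distance
    between the two binary images - cheap XOR-style matching."""
    b_ref, b_mov = one_bit_transform(ref), one_bit_transform(moving)
    best, best_shift = None, (0, 0)
    for dy in range(-max_shift, max_shift + 1):
        for dx in range(-max_shift, max_shift + 1):
            shifted = np.roll(np.roll(b_mov, dy, axis=0), dx, axis=1)
            cost = np.count_nonzero(b_ref ^ shifted)
            if best is None or cost < best:
                best, best_shift = cost, (dy, dx)
    return best_shift

rng = np.random.default_rng(5)
face = rng.random((64, 64)) * 255
shifted_face = np.roll(face, (2, -3), axis=(0, 1))
print(register_translation_1bt(face, shifted_face))   # should recover roughly (-2, 3)
```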

Keywords: face recognition, Daisy descriptor, One-Bit Transform, image registration

Procedia PDF Downloads 344
15962 Taleb's Complexity Theory Concept of 'Antifragility' Has a Significant Contribution to Make to Positive Psychology as Applied to Wellbeing

Authors: Claudius Peter Van Wyk

Abstract:

Given the increasingly manifest phenomena described in complexity theory as volatility, uncertainty, complexity and ambiguity (VUCA), Taleb's notion of 'antifragility' has a significant contribution to make to positive psychology as applied to wellbeing. Antifragility is argued to be fundamentally different from the concepts of resiliency, the ability to recover from failure, and robustness, the ability to resist failure. Rather, it describes the capacity to reorganise in the face of stress in such a way as to cope more effectively with systemic challenges. The concept, which has been applied in disciplines ranging from physics and molecular biology to planning, engineering, and computer science, can now be considered for application to individual human and social wellbeing. There are strong correlations with Antonovsky's model of 'salutogenesis', in which an attitude and competencies are developed for transforming burdening factors into greater resourcefulness. We demonstrate, from the perspective of neuroscience, how technology measuring nervous system coherence can be coupled with acquired psychodynamic approaches not only to identify contextual stressors and use biofeedback instruments to facilitate greater coherence, but also to apply these insights to the specific life stressors that compromise wellbeing. Employing an ongoing case study with BMW South Africa, the neurological mapping is demonstrated together with 'reframing' and emotional-anchoring techniques from neurolinguistic programming. The argument is contextualised within psychoneuroimmunology, which describes the stress pathways from the CNS and endocrine systems, their impact on immune function, and the capacity to restore homeostasis.

Keywords: antifragility, complexity, neuroscience, psychoneuroimmunology, salutogenesis, volatility

Procedia PDF Downloads 350
15961 An Approach to Automate the Modeling of Life Cycle Inventory Data: Case Study on Electrical and Electronic Equipment Products

Authors: Axelle Bertrand, Tom Bauer, Carole Charbuillet, Martin Bonte, Marie Voyer, Nicolas Perry

Abstract:

The complexity of Life Cycle Assessment (LCA) can be identified as the ultimate obstacle to its massification. Because of this obstacle, the diffusion of eco-design and LCA methods in the manufacturing sectors could prove impossible. This article addresses the research question: how can the LCA method be adapted so that it generalizes massively and its performance improves? The paper aims to develop an approach for automating LCA in order to carry out assessments on a massive scale. To answer this, we proceeded in three steps. First, the literature was analyzed to identify existing automation methods; given the constraints of large-scale manual processing, it was necessary to define a new approach, drawing inspiration from certain methods and combining them with new ideas and improvements. Second, our development of automated model construction (data reconciliation and implementation) is presented. Finally, an LCA case study of a conduit is presented to demonstrate the feature-based approach offered by the developed tool. A computerized environment supports effective and efficient decision-making related to materials and processes, facilitating data mapping and hence product modeling. The method is also able to complete the LCA process on its own within minutes; the calculations and the LCA report are thus generated automatically. The tool developed shows that automation by code is a viable solution for meeting LCA's massification objectives. It has major advantages over the traditional LCA method and overcomes the complexity of LCA. Indeed, the case study demonstrated the time savings associated with the methodology and, therefore, the opportunity to increase the number of LCA reports generated and meet regulatory requirements. Moreover, the approach shows the potential of the proposed method for a wide range of applications.

Keywords: automation, EEE, life cycle assessment, life cycle inventory, massively

Procedia PDF Downloads 63
15960 Membrane Bioreactor versus Activated Sludge Process for Aerobic Wastewater Treatment and Recycling

Authors: Sarra Kitanou

Abstract:

Membrane bioreactor (MBR) systems are among the most widely used wastewater treatment processes for various municipal and industrial waste streams. The process is based on complex interactions between biological processes, the filtration process, and the rheological properties of the liquid to be treated. This complexity makes system operation and optimization harder to understand, and traditional methods based on experimental analysis are costly and time-consuming. The present study was based on a pilot-scale external membrane bioreactor with ceramic membranes, compared to a conventional activated sludge process (ASP) plant. Both systems received their influent from domestic wastewater. The MBR produced an effluent of much better quality than the ASP in terms of total suspended solids (TSS), organic matter such as biological oxygen demand (BOD) and chemical oxygen demand (COD), total phosphorus, and total nitrogen. Other effluent quality parameters also indicate substantial differences between the ASP and the MBR. This study leads to the conclusion that, in the case of domestic wastewater, MBR treatment yields excellent effluent quality. Hence, replacing the ASP with MBRs may be justified on the basis of their improved removal of solids, nutrients, and micropollutants. Furthermore, in terms of reuse, the high quality of the treated water allows it to be reused for irrigation.

Keywords: aerobic wastewater treatment, conventional activated sludge process, membrane bioreactor, reuse for irrigation

Procedia PDF Downloads 57
15959 Potentials of Additive Manufacturing: An Approach to Increase the Flexibility of Production Systems

Authors: A. Luft, S. Bremen, N. Balc

Abstract:

The task of flexibility planning and design, just like factory planning, for example, is to create the long-term systemic framework that constitutes the restriction for short-term operational management. This is a strategic challenge since, due to the decision defect character of the underlying flexibility problem, multiple types of flexibility need to be considered over the course of various scenarios, production programs, and production system configurations. In this context, an evaluation model has been developed that integrates both conventional and additive resources on a basic task level and allows the quantification of flexibility enhancement in terms of mix and volume flexibility, complexity reduction, and machine capacity. The model helps companies to decide in early decision-making processes about the potential gains of implementing additive manufacturing technologies on a strategic level. For companies, it is essential to consider both additive and conventional manufacturing beyond pure unit costs. It is necessary to achieve an integrative view of manufacturing that incorporates both additive and conventional manufacturing resources and quantifies their potential with regard to flexibility and manufacturing complexity. This also requires a structured process for the strategic production systems design that spans the design of various scenarios and allows for multi-dimensional and comparative analysis. A respective guideline for the planning of additive resources on a strategic level is being laid out in this paper.

Keywords: additive manufacturing, production system design, flexibility enhancement, strategic guideline

Procedia PDF Downloads 100
15958 Reduced Complexity Iterative Solution For I/Q Imbalance Problem in DVB-T2 Systems

Authors: Karim S. Hassan, Hisham M. Hamed, Yassmine A. Fahmy, Ahmed F. Shalash

Abstract:

The mismatch between the in-phase and quadrature signals in orthogonal frequency division multiplexing (OFDM) systems, such as DVB-T2, results in a severe degradation in performance. Several general solutions have been proposed in the past, but these are largely computationally intensive, leading to complex implementations. In this paper, we propose a relatively simple iterative solution that provides good results in relatively few iterations using fixed-precision arithmetic. An additional advantage is that complex digital blocks, such as dividers and square-root units, are not required. Thus, the proposed solution may be implemented in relatively simple hardware.

Keywords: OFDM, DVB-T2, I/Q imbalance, I/Q mismatch, iterative method, fixed point, reduced complexity

Procedia PDF Downloads 516
15957 Transferring of Digital DIY Potentialities through a Co-Design Tool

Authors: Marita Canina, Carmen Bruno

Abstract:

Digital Do It Yourself (DIY) is a contemporary socio-technological phenomenon enabled by technological tools. The nature and potential long-term effects of this phenomenon have been widely studied within the framework of the EU-funded project 'Digital Do It Yourself', in which the authors created and experimented with a specific Digital Do It Yourself (DiDIY) co-design process. The phenomenon was first studied through a literature review to understand its multiple dimensions and complexity. Co-design workshops were then used to investigate the phenomenon by involving people, in order to achieve a complete understanding of DiDIY practices and their enabling factors. These analyses allowed the definition of the fundamental DiDIY factors, which were then translated into a design tool. The objective of the tool is to shape design concepts by transferring these factors into different environments to achieve innovation. The aim of this paper is to present the 'DiDIY Factor Stimuli' tool, describing the research path and the findings behind it.

Keywords: co-design process, digital DIY, innovation, toolkit

Procedia PDF Downloads 157
15956 The Biosphere as a Supercomputer Directing and Controlling Evolutionary Processes

Authors: Igor A. Krichtafovitch

Abstract:

Evolutionary processes are not linear: long periods of quiet, slow development give way to rather rapid emergences of new species and even phyla. During the Cambrian explosion, 22 new phyla were added to the 3 previously existing phyla. Contrary to common belief, natural selection, or survival of the fittest, cannot account for the dominant evolutionary vector, namely the steady and accelerating advent of more complex and more intelligent living organisms. Neither Darwinism nor alternative concepts, including panspermia and intelligent design, offer a satisfactory explanation for these phenomena. The proposed hypothesis offers a logical and plausible explanation of evolutionary processes in general. It is based on two postulates: a) the biosphere is a single living organism, all parts of which are interconnected, and b) the biosphere acts as a giant biological supercomputer, storing and processing information in digital and analog forms. Such a supercomputer surpasses all human-made computers by many orders of magnitude. Living organisms are the product of the intelligent creative action of the biosphere supercomputer. Biological evolution is driven by the growing amount of information stored in living organisms and by the increasing complexity of the biosphere as a single organism. The main evolutionary vector is not survival of the fittest but the accelerated growth of the computational complexity of living organisms. The following postulates summarize the proposed hypothesis: biological evolution, as a natural origin and development of life, is a reality. Evolution is a coordinated and controlled process. One of evolution's main development vectors is the growing computational complexity of living organisms and the biosphere's intelligence. The intelligent matter that conducts and controls global evolution is a gigantic bio-computer combining all living organisms on Earth. Information acts like software stored in and controlled by the biosphere. Random mutations trigger this software, as stipulated by Darwinian evolution theories, and it is further stimulated by the growing demand for the biosphere's global memory storage and computational complexity. A greater memory volume requires a greater number of more intellectually advanced organisms to store and handle it. More intricate organisms require greater computational complexity of the biosphere in order to keep control over the living world. This is an endless recursive endeavor with an accelerating evolutionary dynamic. New species emerge when two conditions are met: a) crucial environmental changes occur and/or the global memory storage volume reaches its limit, and b) the biosphere's computational complexity reaches a critical mass capable of producing more advanced creatures. The hypothesis presented here is a naturalistic concept of life's creation and evolution. It logically resolves many puzzling problems of current evolutionary theory, such as speciation as a result of GM purposeful design, the evolutionary development vector as a need for growing global intelligence, punctuated equilibrium occurring when the two conditions a) and b) above are met, the Cambrian explosion, and mass extinctions occurring when more intelligent species should replace outdated creatures.

Keywords: supercomputer, biological evolution, Darwinism, speciation

Procedia PDF Downloads 143
15955 Gender Based Variability Time Series Complexity Analysis

Authors: Ramesh K. Sunkaria, Puneeta Marwaha

Abstract:

Nonlinear methods of heart rate variability (HRV) analysis are becoming more popular. It has been observed that complexity measures quantify the regularity and uncertainty of cardiovascular RR-interval time series. In the present work, SampEn has been evaluated in healthy Normal Sinus Rhythm (NSR) male and female subjects for different data lengths and tolerance levels r. It is demonstrated that SampEn is small for higher values of the tolerance r. Also, the SampEn value of the healthy female group is higher than that of the healthy male group for short data lengths; with increasing data length, the two groups overlap and become difficult to distinguish. SampEn gives inaccurate results by assigning a higher value to the female group, because male subjects have a more complex HRV pattern than female subjects. Therefore, this traditional algorithm exhibits higher complexity for healthy female subjects than for healthy male subjects, which is a misleading observation. This may be because SampEn does not account for the multiple time scales inherent in physiologic time series, so the hidden spatial and temporal fluctuations remain unexplored.
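
A direct O(N²) sample entropy implementation with the common defaults m = 2 and r = 0.2·SD is sketched below; the RR series is synthetic and illustrative, not the study's recordings.

```python
import numpy as np

def sample_entropy(series, m=2, r_factor=0.2):
    """SampEn(m, r, N) computed directly in O(N^2): the negative log of the
    ratio of (m+1)-length template matches to m-length matches, with
    tolerance r = r_factor * SD and self-matches excluded."""
    x = np.asarray(series, dtype=float)
    r = r_factor * np.std(x)

    def count_matches(length):
        n_templates = len(x) - m          # same template count for m and m+1
        templates = np.array([x[i:i + length] for i in range(n_templates)])
        count = 0
        for i in range(n_templates):
            dist = np.max(np.abs(templates - templates[i]), axis=1)  # Chebyshev
            count += np.count_nonzero(dist <= r) - 1                 # drop self-match
        return count

    b, a = count_matches(m), count_matches(m + 1)
    return -np.log(a / b) if a > 0 and b > 0 else np.inf

# Illustrative RR-interval series (seconds), NOT the study's recordings.
rng = np.random.default_rng(3)
rr = 0.8 + 0.05 * np.sin(np.arange(300) / 10.0) + 0.02 * rng.standard_normal(300)
print(f"SampEn(m=2, r=0.2*SD): {sample_entropy(rr):.3f}")
```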

Keywords: heart rate variability, normal sinus rhythm group, RR interval time series, sample entropy

Procedia PDF Downloads 264