Search results for: computational complexity theory
7922 Identifying Chaotic Architecture: Origins of Nonlinear Design Theory
Authors: Mohammadsadegh Zanganehfar
Abstract:
Since the modernism movement and the appearance of modern architecture, an aggressive desire for a general design theory has emerged in the theoretical works of architects in the form of books and essays. Since Robert Venturi and Denise Scott Brown published Complexity and Contradiction in Architecture in 1966, the discourse of complexity and volumetric composition has been an important and controversial issue in the discipline. Ever since, various theories and essays have engaged in this discourse. This paper attempts to identify chaos theory as a scientific model of complexity and its relation to architectural design theory by conducting a qualitative analysis with a multidisciplinary critical approach through architecture and basic science resources. As a result, we identify chaotic architecture, the correlation of chaos theory and architecture, as an independent nonlinear design theory with specific characteristics and properties.
Keywords: architecture complexity, chaos theory, fractals, nonlinear dynamic systems, nonlinear ontology
Procedia PDF Downloads 374
7921 A Subband BSS Structure with Reduced Complexity and Fast Convergence
Authors: Salah Al-Din I. Badran, Samad Ahmadi, Ismail Shahin
Abstract:
A blind source separation method is proposed; in this method, we use a non-uniform filter bank and a novel normalisation. This method provides reduced computational complexity and increased convergence speed compared to the full-band algorithm. Recently, adaptive sub-band schemes have been recommended to solve two problems: reducing the computational complexity and increasing the convergence speed of the adaptive algorithm for correlated input signals. In this work, the reduction in computational complexity is achieved by using adaptive filters of orders lower than those of the full-band adaptive filters, operating at a sampling rate lower than that of the input signal. The signals decomposed by the analysis filter bank are less correlated in each subband than the input signal at full bandwidth, and this can promote better rates of convergence.
Keywords: blind source separation, computational complexity, subband, convergence speed, mixture
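As a rough illustration of why sub-band processing reduces complexity (a generic sketch, not the authors' filter bank or normalisation): an order-N full-band adaptive filter costs about N multiply-accumulates per input sample, whereas M decimated sub-band filters of order N/M each run at 1/M of the rate. The NLMS routine and the back-of-the-envelope count below assume white excitation and a toy system-identification setup.

```python
import numpy as np

def nlms(x, d, order, mu=0.5, eps=1e-8):
    """Normalised LMS adaptive filter: adapts w so that w.x tracks d."""
    w = np.zeros(order)
    e = np.zeros(len(x))
    for n in range(order, len(x)):
        xn = x[n - order:n][::-1]              # most recent samples first
        e[n] = d[n] - w @ xn
        w += mu * e[n] * xn / (xn @ xn + eps)  # normalised update
    return w, e

# Toy system-identification run at the full band.
rng = np.random.default_rng(0)
x = rng.standard_normal(5000)
h = rng.standard_normal(64)                    # unknown system
d = np.convolve(x, h)[:len(x)]
w, e = nlms(x, d, order=64)
print("residual error power:", np.mean(e[-1000:] ** 2))

# Back-of-the-envelope complexity: an order-N full-band filter costs ~N MACs per
# input sample; M sub-band filters of order N/M running at rate fs/M cost about
# M * (N/M) * (1/M) = N/M MACs per full-rate sample (ignoring the filter bank).
N, M = 64, 4
print("full-band MACs/sample:", N, " sub-band MACs/sample:", N // M)
```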
Procedia PDF Downloads 580
7920 Rapid Algorithm for GPS Signal Acquisition
Authors: Fabricio Costa Silva, Samuel Xavier de Souza
Abstract:
A Global Positioning System (GPS) receiver is responsible for determining position, velocity, and timing information by using satellite information. To obtain this information, it is necessary to combine an incoming and a locally generated signal. The procedure, called acquisition, needs to find two quantities: the frequency and the phase of the incoming signal. This is very time consuming, so there are several techniques to reduce the computational complexity, but each of them puts project requirements in conflict. In this paper, we present a method that can reduce the computational complexity by reducing the search space and parallelizing the search.
Keywords: GPS, acquisition, complexity, parallelism
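One common way to shrink the acquisition search (a generic illustration, not necessarily the authors' scheme): for each Doppler-frequency bin, an FFT-based circular correlation evaluates every code-phase hypothesis at once, replacing the per-bin sliding correlation with a few FFTs. The sketch below uses a made-up random code and assumed sampling parameters.

```python
import numpy as np

rng = np.random.default_rng(1)
N = 1023                       # samples per code period (assumed 1 sample/chip)
fs = 1.023e6                   # assumed sampling rate, Hz
code = rng.choice([-1.0, 1.0], size=N)        # stand-in for a PRN code

# Simulated incoming signal: delayed code with a Doppler offset plus noise.
true_delay, true_doppler = 347, 2500.0        # samples, Hz
t = np.arange(N) / fs
incoming = (np.roll(code, true_delay) * np.exp(2j * np.pi * true_doppler * t)
            + 0.5 * (rng.standard_normal(N) + 1j * rng.standard_normal(N)))

code_fft = np.conj(np.fft.fft(code))
best = (0.0, None, None)
for doppler in np.arange(-5000, 5001, 500):   # coarse Doppler bins
    wiped = incoming * np.exp(-2j * np.pi * doppler * t)   # remove trial Doppler
    corr = np.fft.ifft(np.fft.fft(wiped) * code_fft)       # all code phases at once
    peak = np.abs(corr)
    k = int(np.argmax(peak))
    if peak[k] > best[0]:
        best = (peak[k], doppler, k)

print("estimated Doppler %.0f Hz, code phase %d samples" % (best[1], best[2]))
# Expected output is close to the simulated values (2500 Hz, 347 samples).
```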
Procedia PDF Downloads 539
7919 Rapid Parallel Algorithm for GPS Signal Acquisition
Authors: Fabricio Costa Silva, Samuel Xavier de Souza
Abstract:
A Global Positioning System (GPS) receiver is responsible for determining position, velocity, and timing information by using satellite information. To obtain this information, it is necessary to combine an incoming and a locally generated signal. The procedure, called acquisition, needs to find two quantities: the frequency and the phase of the incoming signal. This is very time consuming, so there are several techniques to reduce the computational complexity, but each of them puts project requirements in conflict. In this paper, we present a method that can reduce the computational complexity by reducing the search space and parallelizing the search.
Keywords: GPS, acquisition, low complexity, parallelism
Procedia PDF Downloads 501
7918 A Fast Convergence Subband BSS Structure
Authors: Salah Al-Din I. Badran, Samad Ahmadi, Ismail Shahin
Abstract:
A blind source separation method is proposed; in this method, we use a non-uniform filter bank and a novel normalisation. This method provides reduced computational complexity and increased convergence speed compared to the full-band algorithm. Recently, adaptive sub-band schemes have been recommended to solve two problems: reducing the computational complexity and increasing the convergence speed of the adaptive algorithm for correlated input signals. In this work, the reduction in computational complexity is achieved by using adaptive filters of orders lower than those of the full-band adaptive filters, operating at a sampling rate lower than that of the input signal. The signals decomposed by the analysis filter bank are less correlated in each sub-band than the input signal at full bandwidth, and this can promote better rates of convergence.
Keywords: blind source separation, computational complexity, subband, convergence speed, mixture
Procedia PDF Downloads 555
7917 Exploring Leadership Adaptability in the Private Healthcare Organizations in the UK in Times of Crises
Authors: Sade Ogundipe
Abstract:
The private healthcare sector in the United Kingdom has experienced unprecedented challenges during times of crisis, necessitating effective leadership adaptability. This qualitative study delves into the dynamic landscape of leadership within the sector, particularly during crises, employing the lenses of complexity theory and institutional theory to unravel the intricate mechanisms at play. Through in-depth interviews with 25 leaders at various levels in the UK private healthcare sector, this research explores how leaders in UK private healthcare organizations navigate complex and often chaotic environments, shedding light on their adaptive strategies and decision-making processes during crises. Complexity theory is used to analyze the complicated, volatile nature of healthcare crises, emphasizing the need for adaptive leadership in such contexts. Institutional theory, on the other hand, provides insights into how external and internal institutional pressures influence leadership behavior. Findings from this study highlight the multifaceted nature of leadership adaptability, emphasizing the significance of leaders' abilities to embrace uncertainty, engage in sensemaking, and leverage the institutional environment to enact meaningful changes. Furthermore, this research sheds light on the challenges and opportunities that leaders face when adapting to crises within the UK private healthcare sector. The study's insights contribute to the growing body of literature on leadership in healthcare, offering practical implications for leaders, policymakers, and stakeholders within the UK private healthcare sector. By employing the dual perspectives of complexity theory and institutional theory, this research provides a holistic understanding of leadership adaptability in the face of crises, offering valuable guidance for enhancing the resilience and effectiveness of healthcare leadership within this vital sector.
Keywords: leadership, adaptability, decision-making, complexity, complexity theory, institutional theory, organizational complexity, complex adaptive system (CAS), crises, healthcare
Procedia PDF Downloads 51
7916 Fast and Efficient Algorithms for Evaluating Uniform and Nonuniform Lagrange and Newton Curves
Authors: Taweechai Nuntawisuttiwong, Natasha Dejdumrong
Abstract:
Newton-Lagrange interpolations are widely used in numerical analysis. However, they require quadratic computational time for their construction. In computer aided geometric design (CAGD), there are some polynomial curves, the Wang-Ball, DP and Dejdumrong curves, which have linear time complexity algorithms. Thus, the computational time for Newton-Lagrange interpolations can be reduced by applying the algorithms of the Wang-Ball, DP and Dejdumrong curves. In order to use the Wang-Ball, DP and Dejdumrong algorithms, it is first necessary to convert Newton-Lagrange polynomials into Wang-Ball, DP or Dejdumrong polynomials. In this work, the algorithms for converting both uniform and non-uniform Newton-Lagrange polynomials into Wang-Ball, DP and Dejdumrong polynomials are investigated. Thus, the computational time for representing Newton-Lagrange polynomials can be reduced to linear complexity. In addition, other utilizations of CAGD curves to modify Newton-Lagrange curves can be pursued.
Keywords: Lagrange interpolation, linear complexity, monomial matrix, Newton interpolation
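For context on the quadratic construction cost mentioned above, here is a generic sketch of standard Newton interpolation (not the conversion algorithms studied in the paper): building the divided-difference table takes O(n²) operations, after which each evaluation is O(n) via nested, Horner-like multiplication.

```python
def newton_coefficients(xs, ys):
    """Divided-difference coefficients; O(n^2) construction."""
    coef = list(ys)
    n = len(xs)
    for j in range(1, n):
        for i in range(n - 1, j - 1, -1):
            coef[i] = (coef[i] - coef[i - 1]) / (xs[i] - xs[i - j])
    return coef

def newton_eval(coef, xs, x):
    """Nested evaluation of the Newton form; O(n) per point."""
    result = coef[-1]
    for i in range(len(coef) - 2, -1, -1):
        result = result * (x - xs[i]) + coef[i]
    return result

# Interpolate f(x) = x^3 - 2x + 1 at four nodes and check an off-node point.
xs = [0.0, 1.0, 2.0, 4.0]
ys = [x**3 - 2*x + 1 for x in xs]
coef = newton_coefficients(xs, ys)
print(newton_eval(coef, xs, 3.0))   # exact value is 3**3 - 2*3 + 1 = 22.0
```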
Procedia PDF Downloads 234
7915 Analysis of Cardiac Health Using Chaotic Theory
Authors: Chandra Mukherjee
Abstract:
The prevalent knowledge of biological systems is based on the standard scientific perception of natural equilibrium, determinism, and predictability. Recently, a rethinking of concepts was presented, and a new scientific perspective emerged that involves complexity theory together with deterministic chaos theory, nonlinear dynamics, and the theory of fractals. The unpredictability of chaotic processes would probably change our understanding of diseases and their management. Mathematically, chaos is defined as deterministic behavior with irregular patterns that obey equations which are critically dependent on initial conditions. Chaos theory is the branch of science concerned with nonlinear dynamics, fractals, bifurcations, periodic oscillations, and complexity. Recently, the biomedical interest in this scientific field has made these mathematical concepts available to medical researchers and practitioners. Any biological network system is considered to have a nominal state, which is recognized as a homeostatic state. In reality, the different physiological systems are not, under normal conditions, in a stable state of homeostatic balance, but in a dynamically stable state with chaotic behavior and complexity. Biological systems like heart rhythm and brain electrical activity are dynamical systems that can be classified as chaotic systems with sensitive dependence on initial conditions. In biological systems, the state of a disease is characterized by a loss of complexity and chaotic behavior, and by the presence of pathological periodicity and regulatory behavior. The failure or collapse of nonlinear dynamics is an indication of disease rather than a characteristic of health.
Keywords: HRV, HRVI, LF, HF, DII
Procedia PDF Downloads 426
7914 Development of the Ontology of Engineering Design Complexity
Authors: Victor E. Lopez, L. Dale Thomas
Abstract:
As engineered systems become more complex, the difficulty associated with predicting, developing, and operating them also increases, resulting in increased costs, failure rates, and unexpected consequences. Successfully managing the complexity of a system should reduce these negative consequences. The study of complexity in the context of engineering development has suffered due to the ambiguity of the nature of complexity: what makes a system complex, and how complexity translates into real-world engineering attributes and consequences. This paper argues that the use of an ontology of engineering design complexity would i) improve the clarity of the research being performed by allowing researchers to use a common conceptualization of complexity, with more precise terminology, and ii) elucidate the connections between certain types of complexity and their consequences for system development. The ontology comprises the concepts of complexity found in the literature and the different relations that exist between them. The ontology maps different complexity concepts, such as structural complexity, creation complexity, and information entropy, and then relates them to system aspects such as interfaces, development effort, and modularity. The ontology is represented using the Web Ontology Language (OWL). This paper presents the current status of the ontology of engineering design complexity, the main challenges encountered, and the future plans for the ontology.
Keywords: design complexity, ontology, design effort, complexity ontology
Procedia PDF Downloads 187
7913 Production Line Layout Planning Based on Complexity Measurement
Authors: Guoliang Fan, Aiping Li, Nan Xie, Liyun Xu, Xuemei Liu
Abstract:
Mass customization production increases the difficulty of production line layout planning. The material distribution process for a variety of parts is very complex, which greatly increases the cost of material handling and logistics. In response to this problem, this paper presents an approach to production line layout planning based on complexity measurement. Firstly, by analyzing the influencing factors of equipment layout, the complexity model of the production line is established by using information entropy theory. Then, the cost of part logistics is derived considering the different varieties of parts. Furthermore, an optimization function with two objectives, the lowest cost and the least configuration complexity, is built. Finally, the validity of the function is verified in a case study. The results show that the proposed approach may find the layout scheme with the lowest logistics cost and the least complexity. Optimized production line layout planning can effectively improve production efficiency and equipment utilization with the lowest cost and complexity.
Keywords: production line, layout planning, complexity measurement, optimization, mass customization
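As a small illustration of the information-entropy idea mentioned above (generic Shannon entropy over hypothetical part-flow frequencies; the paper's specific complexity model and logistics cost terms are not reproduced): the more evenly material flows are spread across stations, the higher the entropy, which can serve as one complexity term in a layout objective.

```python
import math

def shannon_entropy(frequencies):
    """Shannon entropy (bits) of a discrete distribution given as raw counts."""
    total = sum(frequencies)
    probs = [f / total for f in frequencies if f > 0]
    return -sum(p * math.log2(p) for p in probs)

# Hypothetical part-flow counts routed through four stations under two layouts.
layout_a = [90, 5, 3, 2]     # flows concentrated on one station -> low entropy
layout_b = [25, 25, 25, 25]  # flows spread evenly               -> high entropy
print("layout A complexity (bits):", round(shannon_entropy(layout_a), 3))
print("layout B complexity (bits):", round(shannon_entropy(layout_b), 3))
```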
Procedia PDF Downloads 393
7912 Robust Numerical Scheme for Pricing American Options under Jump Diffusion Models
Authors: Salah Alrabeei, Mohammad Yousuf
Abstract:
The goal of option pricing theory is to help investors manage their money, enhance returns, and control their financial future by theoretically valuing their options. However, most option pricing models have no analytical solution. Furthermore, not all numerical methods are efficient for solving these models, because the models have non-smooth payoffs or discontinuous derivatives at the exercise price. In this paper, we solve the American option under jump diffusion models by using efficient time-dependent numerical methods. Several techniques are integrated to overcome the computational complexity. The Fast Fourier Transform (FFT) algorithm is used as a matrix-vector multiplication solver, which reduces the complexity from O(M²) to O(M log M). A partial fraction decomposition technique is applied to rational approximation schemes to overcome the complexity of inverting polynomials of matrices. The proposed method is easy to implement in serial or parallel versions. Numerical results are presented to prove the accuracy and efficiency of the proposed method.
Keywords: integral differential equations, jump-diffusion model, American options, rational approximation
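To make the O(M²) to O(M log M) step concrete (a generic illustration; the paper's jump-diffusion operator matrices are not reproduced here): when a matrix is circulant, its action on a vector is a circular convolution, so the product can be computed with three FFTs instead of a dense multiplication. A Toeplitz matrix, as typically arises from discretized integro-differential operators, can be embedded in a circulant of twice the size and handled the same way.

```python
import numpy as np

def circulant_matvec(c, x):
    """Multiply the circulant matrix C (first column c) by x in O(M log M)."""
    return np.real(np.fft.ifft(np.fft.fft(c) * np.fft.fft(x)))

rng = np.random.default_rng(0)
M = 256
c = rng.standard_normal(M)            # first column of the circulant matrix
x = rng.standard_normal(M)

# Dense O(M^2) reference: build C explicitly, column j is c rolled down by j.
C = np.column_stack([np.roll(c, j) for j in range(M)])
y_dense = C @ x
y_fft = circulant_matvec(c, x)
print("max abs difference:", np.max(np.abs(y_dense - y_fft)))  # ~1e-13
```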
Procedia PDF Downloads 120
7911 A Study on Game Theory Approaches for Wireless Sensor Networks
Authors: M. Shoukath Ali, Rajendra Prasad Singh
Abstract:
Game theory approaches and their application in improving the performance of Wireless Sensor Networks (WSNs) are discussed in this paper. The mathematical modeling and analysis of WSNs may have a low success rate due to the complexity of topology, modeling, link quality, etc. However, game theory is a field that can be used efficiently to analyze WSNs. Game theory is a branch of applied mathematics that describes and analyzes interactive decision situations. It has the ability to model independent, individual decision makers whose actions affect the surrounding decision makers. The outcome of complex interactions among rational entities can be predicted by a set of analytical tools. However, rationality demands stringent observance of a strategy based on measured or perceived results. Researchers are adopting game theory approaches to model and analyze leading wireless communication networking issues, including QoS, power control, resource sharing, etc.
Keywords: wireless sensor network, game theory, cooperative game theory, non-cooperative game theory
Procedia PDF Downloads 434
7910 Configuring Systems to Be Viable in a Crisis: The Role of Intuitive Decision-Making
Authors: Ayham Fattoum, Simos Chari, Duncan Shaw
Abstract:
Volatile, uncertain, complex, and ambiguous (VUCA) conditions threaten system viability with emerging and novel events requiring immediate and localized responses. Such responsiveness is only possible through devolved freedom and emancipated decision-making. The Viable System Model (VSM) recognizes the need and suggests maximizing autonomy to localize decision-making and minimize residual complexity. However, exercising delegated autonomy in VUCA requires confidence and knowledge to use intuition and guidance to maintain systemic coherence. This paper explores the role of intuition as an enabler of emancipated decision-making and autonomy under VUCA. Intuition allows decision-makers to use their knowledge and experience to respond rapidly to novel events. This paper offers three contributions to VSM. First, it designs a system model that illustrates the role of intuitive decision-making in managing complexity and maintaining viability. Second, it takes a black-box approach to theory development in VSM to model the role of autonomy and intuition. Third, the study uses a multi-stage discovery-oriented approach (DOA) to develop theory, with each stage combining literature, data analysis, and model/theory development and identifying further questions for the subsequent stage. We synthesize literature (e.g., VSM, complexity management) with seven months of field-based insights (interviews, workshops, and observation of a live disaster exercise) to develop an intuitive complexity management framework and VSM models. The results have practical implications for enhancing the resilience of organizations and communities.
Keywords: intuition, complexity management, decision-making, viable system model
Procedia PDF Downloads 67
7909 A Comparative Study of High Order Rotated Group Iterative Schemes on Helmholtz Equation
Authors: Norhashidah Hj. Mohd Ali, Teng Wai Ping
Abstract:
In this paper, we present a high order group explicit method for solving the two-dimensional Helmholtz equation. The presented method is derived from a nine-point fourth order finite difference approximation formula obtained from a 45-degree rotation of the standard grid, which makes it possible to construct an iterative procedure with reduced complexity. The developed method will be compared with the existing group iterative schemes available in the literature in terms of computational time, iteration counts, and computational complexity. The comparative performance of the methods will be discussed and reported.
Keywords: explicit group method, finite difference, Helmholtz equation, rotated grid, standard grid
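For reference, one common form of the underlying problem and the standard (unrotated) second-order five-point discretization on a uniform grid of spacing h are given below; the paper's nine-point fourth-order rotated formula is not reproduced here.

\[
\frac{\partial^2 u}{\partial x^2} + \frac{\partial^2 u}{\partial y^2} + k^2 u = f(x, y),
\]
\[
\frac{u_{i+1,j} - 2u_{i,j} + u_{i-1,j}}{h^2} + \frac{u_{i,j+1} - 2u_{i,j} + u_{i,j-1}}{h^2} + k^2 u_{i,j} = f_{i,j}.
\]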
Procedia PDF Downloads 456
7908 Modified RSA in Mobile Communication
Authors: Nagaratna Rajur, J. D. Mallapur, Y. B. Kirankumar
Abstract:
The security in mobile communication is very different from that in internet or telecommunication systems, because of the poor user interface and limited processing capacity of mobile devices, as well as the combination of complex network protocols. Hence, it calls for a security system based on low memory usage and low computation speed. Security involves all the activities that are undertaken to protect the value and ongoing usability of assets and the integrity and continuity of operations. An effective network security strategy requires identifying threats and then choosing the most effective set of tools to combat them. Cryptography is a simple and efficient way to provide security in communication. RSA is an asymmetric key approach that is highly reliable and widely used in internet communication. However, it has not been efficiently implemented in mobile communication due to its computational complexity and large memory utilization. The proposed algorithm modifies the current RSA to be useful in mobile communication by reducing its computational complexity and memory utilization.
Keywords: M-RSA, sensor networks, sensor applications, security
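For reference, a minimal textbook RSA sketch is shown below as a baseline only; it is not the proposed M-RSA modification, and the tiny primes are for illustration rather than security (real deployments use 2048-bit or larger moduli).

```python
from math import gcd

def modinv(a, m):
    """Modular inverse via the extended Euclidean algorithm."""
    old_r, r, old_s, s = a, m, 1, 0
    while r:
        q = old_r // r
        old_r, r = r, old_r - q * r
        old_s, s = s, old_s - q * s
    return old_s % m

# Key generation with toy primes.
p, q = 61, 53
n = p * q                      # modulus
phi = (p - 1) * (q - 1)
e = 17                         # public exponent, must be coprime to phi
assert gcd(e, phi) == 1
d = modinv(e, phi)             # private exponent

message = 42
ciphertext = pow(message, e, n)        # encryption: m^e mod n
recovered = pow(ciphertext, d, n)      # decryption: c^d mod n
print(ciphertext, recovered)           # recovered == 42
```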
Procedia PDF Downloads 342
7907 The Biosphere as a Supercomputer Directing and Controlling Evolutionary Processes
Authors: Igor A. Krichtafovitch
Abstract:
The evolutionary processes are not linear. Long periods of quiet and slow development turn into rather rapid emergences of new species and even phyla. During the Cambrian explosion, 22 new phyla were added to the previously existing 3 phyla. Contrary to common credence, natural selection, or survival of the fittest, cannot account for the dominant evolution vector, which is the steady and accelerated advent of more complex and more intelligent living organisms. Neither Darwinism nor alternative concepts, including panspermia and intelligent design, proposes a satisfactory solution for these phenomena. The proposed hypothesis offers a logical and plausible explanation of the evolutionary processes in general. It is based on two postulates: a) the Biosphere is a single living organism, all parts of which are interconnected, and b) the Biosphere acts as a giant biological supercomputer, storing and processing information in digital and analog forms. Such a supercomputer surpasses all human-made computers by many orders of magnitude. Living organisms are the product of the intelligent creative action of the biosphere supercomputer. Biological evolution is driven by the growing amount of information stored in living organisms and the increasing complexity of the biosphere as a single organism. The main evolutionary vector is not survival of the fittest but an accelerated growth of the computational complexity of living organisms. The following postulates may summarize the proposed hypothesis: biological evolution, as a natural origin and development of life, is a reality. Evolution is a coordinated and controlled process. One of evolution's main development vectors is the growing computational complexity of living organisms and the biosphere's intelligence. The intelligent matter which conducts and controls global evolution is a gigantic bio-computer combining all living organisms on Earth. The information acts like software stored in and controlled by the biosphere. Random mutations trigger this software, as is stipulated by Darwinian evolution theories, and it is further stimulated by the growing demand for the Biosphere's global memory storage and computational complexity. A greater memory volume requires a greater number of more intellectually advanced organisms for storing and handling it. More intricate organisms require greater computational complexity of the biosphere in order to keep control over the living world. This is an endless recursive endeavor with accelerated evolutionary dynamics. New species emerge when two conditions are met: a) crucial environmental changes occur and/or the global memory storage volume reaches its limit, and b) the biosphere's computational complexity reaches a critical mass capable of producing more advanced creatures. The hypothesis presented here is a naturalistic concept of life creation and evolution. The hypothesis logically resolves many puzzling problems of the current state of evolution theory, such as speciation as a result of GM purposeful design, the evolution development vector as a need for growing global intelligence, punctuated equilibrium happening when the two conditions a) and b) above are met, the Cambrian explosion, and mass extinctions happening when more intelligent species should replace outdated creatures.
Keywords: supercomputer, biological evolution, Darwinism, speciation
Procedia PDF Downloads 165
7906 A Look at the Quantum Theory of Atoms in Molecules from the Discrete Morse Theory
Authors: Dairo Jose Hernandez Paez
Abstract:
The quantum theory of atoms in molecules (QTAIM) allows us to obtain topological information on the electronic density in quantum mechanical systems. QTAIM starts by considering the electron density as a continuous mathematical object. On the other hand, the discretized electron density is also a mathematical object which, from the standpoint of discrete mathematics, allows a new approach to its topological study. From this point of view, it is necessary to develop a series of steps that provide the theoretical support that guarantees its application. Some of the steps that we consider most important are mentioned below: (1) obtain good representations of the electron density through computational calculations; (2) design a methodology for the discretization of the electron density and construct the simplicial complex; (3) analyze the discrete vector field associated with the simplicial complex; (4) finally, in this research, we propose to use discrete Morse theory as a mathematical tool to carry out studies of electron density topology.
Keywords: discrete mathematics, discrete Morse theory, electronic density, computational calculations
Procedia PDF Downloads 104
7905 Cognitive Weighted Polymorphism Factor: A New Cognitive Complexity Metric
Authors: T. Francis Thamburaj, A. Aloysius
Abstract:
Polymorphism is one of the main pillars of the object-oriented paradigm. It induces hidden forms of class dependencies which may impact software quality, resulting in a higher cost factor for comprehending, debugging, testing, and maintaining the software. In this paper, a new cognitive complexity metric called Cognitive Weighted Polymorphism Factor (CWPF) is proposed. Apart from the software structural complexity, it includes cognitive complexity on the basis of type. The cognitive weights are calibrated based on 27 empirical studies with 120 persons. A case study and experimentation with the new software metric show positive results. Further, a comparative study is made, and the correlation test has proved that the CWPF complexity metric is a better, more comprehensive, and more realistic indicator of software complexity than Abreu's Polymorphism Factor (PF) complexity metric.
Keywords: cognitive complexity metric, object-oriented metrics, polymorphism factor, software metrics
Procedia PDF Downloads 459
7904 Low Complexity Deblocking Algorithm
Authors: Jagroop Singh Sidhu, Buta Singh
Abstract:
A low-computation deblocking filter including three frequency-related modes (smooth mode, intermediate mode, and non-smooth mode for low-frequency, mid-frequency, and high-frequency regions, respectively) is proposed. The suggested approach requires zero additions, zero subtractions, zero multiplications (for the intermediate region), no divisions (for the non-smooth region), and no comparisons. The suggested method thus keeps the computation low and is therefore suitable for block-based image coding systems. A comparison of the average number of operations for the smooth, non-smooth, and intermediate regions (per pixel vector for each block) between the filter suggested by Chen and the proposed filter shows that the proposed filter keeps the computation lower and is thus suitable for fast processing algorithms.
Keywords: blocking artifacts, computational complexity, non-smooth, intermediate, smooth
Procedia PDF Downloads 464
7903 Conceptualising Project Complexity in Ghana’s Construction Industry: A Qualitative Study
Authors: Kwasi Dartey-Baah, Mias De Klerk
Abstract:
Project complexity has been cited as one of the essential areas of project management. It can be observed from environmental, social, technological, and organisational viewpoints, and its handling is critical to project success. Although the concept has been explored in varied industries, this paper seeks to ascertain the meaning and understanding of project complexity within the Ghanaian construction industry, based on the three dimensions of complexity (faith, fact, and interaction) and using experts' opinions. Taking the form of a focus group discussion, the paper sought to gain an in-depth understanding of project complexity issues in Ghana's construction industry. The method used data obtained from experts (a purposively selected group) comprising project leaders and project management academics. The findings indicated that the experts broadly agreed with the complexity items but offered varied reasons for their agreement. In the composite assessment of the complexity dimensions (faith, fact, and interaction), it emerged that there was some agreement with the complexity dimensions of fact and interaction within Ghana's construction industry. On the other hand, for the dimension of complexity by faith, it was noted that the experts in Ghana's construction industry construed complexity by faith not as the absence of evidence but as evidence that hinges on at least one member of the project team. It is expected that other research on project complexity will focus on other industries to enhance knowledge of the same within the field of project management.
Keywords: project complexity, complexity by faith, complexity by fact, complexity by interaction, construction industry, Ghana
Procedia PDF Downloads 161
7902 Low-Complexity Multiplication Using Complement and Signed-Digit Recoding Methods
Authors: Te-Jen Chang, I-Hui Pan, Ping-Sheng Huang, Shan-Jen Cheng
Abstract:
In this paper, a fast multiplication method utilizing the complement representation method and the canonical recoding technique is proposed. By performing complementation and canonical recoding, the number of partial products can be reduced. Based on these techniques, we propose an algorithm that provides an efficient multiplication method. On average, our proposed algorithm reduces the number of k-bit additions from (0.25k + log k/k + 2.5) to (k/6 + log k/k + 2.5), where k is the bit-length of the multiplicand A and the multiplier B. We can therefore efficiently speed up the overall performance of the multiplication. Moreover, if we use the new proposal to compute common-multiplicand multiplication, the computational complexity can be reduced from (0.5k + 2 log k/k + 5) to (k/3 + 2 log k/k + 5) k-bit additions.
Keywords: algorithm design, complexity analysis, canonical recoding, public key cryptography, common-multiplicand multiplication
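A hedged sketch of the canonical (non-adjacent form) recoding idea the abstract relies on, not the authors' complement-based variant: recoding the multiplier into signed digits {-1, 0, 1} with no two adjacent nonzeros reduces the expected number of nonzero digits, and hence of k-bit additions or subtractions, from about k/2 to about k/3.

```python
def naf(n):
    """Non-adjacent form of n, least-significant digit first; digits in {-1, 0, 1}."""
    digits = []
    while n > 0:
        if n & 1:
            d = 2 - (n % 4)    # +1 if n = 1 (mod 4), -1 if n = 3 (mod 4)
            n -= d
        else:
            d = 0
        digits.append(d)
        n //= 2
    return digits

def multiply_via_naf(a, b):
    """Compute a*b using shift-and-add/subtract over the NAF digits of b."""
    acc = 0
    for i, d in enumerate(naf(b)):
        if d == 1:
            acc += a << i
        elif d == -1:
            acc -= a << i
    return acc

b = 0b101110111011                                   # multiplier with many 1-bits
print(multiply_via_naf(1234, b) == 1234 * b)         # True
print("1-bits:", bin(b).count("1"),
      "nonzero NAF digits:", sum(d != 0 for d in naf(b)))
```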
Procedia PDF Downloads 435
7901 Fluid-Structure Interaction Study of Fluid Flow past Marine Turbine Blade Designed by Using Blade Element Theory and Momentum Theory
Authors: Abu Afree Andalib, M. Mezbah Uddin, M. Rafiur Rahman, M. Abir Hossain, Rajia Sultana Kamol
Abstract:
This paper deals with the analysis of flow past a marine turbine blade which is designed by using blade element theory and momentum theory, for use in the field of renewable energy. The designed blade is analyzed for various parameters using the FSI module of Ansys. Computational fluid dynamics is used for the study of fluid flow past the blade and other fluidic phenomena such as lift, drag, pressure differentials, and energy dissipation in water. The Finite Element Analysis (FEA) module of Ansys was used to analyze structural parameters such as stress and stress density, localization points, deflection, and force propagation. A fine mesh is considered in every case for greater accuracy of the results, within the limits of the available computational power. The relevance of design, search, and optimization with respect to complex fluid flow and structural modeling is considered and analyzed. The relevance of design and optimization with respect to complex fluids for minimum drag force, using the Ansys Adjoint Solver module, is analyzed as well. A graphical comparison of the above-mentioned parameters using CFD and FEA, and subsequently the FSI technique, is illustrated, and significant conformity between the results is found.
Keywords: blade element theory, computational fluid dynamics, finite element analysis, fluid-structure interaction, momentum theory
Procedia PDF Downloads 301
7900 The Complexity of Testing Cryptographic Devices on Input Faults
Authors: Alisher Ikramov, Gayrat Juraev
Abstract:
The production of logic devices is subject to the occurrence of faults during manufacturing. This work analyses the complexity of testing a special type of logic device on inverse, adhesion, and constant input faults. The focus of this work is on devices that implement cryptographic functions. The complexity values for general-case faults and for some frequently occurring subsets were determined and proved in this work. For a special case, when the length of the text block is equal to the length of the key block, the complexity of testing is proven to be asymptotically half the complexity of testing all logic devices on the same types of input faults.
Keywords: complexity, cryptographic devices, input faults, testing
Procedia PDF Downloads 226
7899 Reduced Complexity of ML Detection Combined with DFE
Authors: Jae-Hyun Ro, Yong-Jun Kim, Chang-Bin Ha, Hyoung-Kyu Song
Abstract:
In multiple input multiple output-orthogonal frequency division multiplexing (MIMO-OFDM) systems, many detection schemes have been developed to improve the error performance and to reduce the complexity. Maximum likelihood (ML) detection has optimal error performance, but it has very high complexity. Thus, this paper proposes a reduced-complexity ML detection combined with a decision feedback equalizer (DFE). The error performance of the proposed detection scheme is better than that of the conventional DFE, while the complexity of the proposed scheme is lower than that of conventional ML detection.
Keywords: detection, DFE, MIMO-OFDM, ML
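For orientation, a minimal brute-force ML detector for a toy 2x2 MIMO system with QPSK is sketched below (an assumed setup for illustration; the paper's reduced-complexity combination with DFE is not reproduced): ML search enumerates every candidate symbol vector and picks the one minimising the Euclidean distance, which is why its cost grows exponentially with the number of transmit antennas.

```python
import numpy as np
from itertools import product

qpsk = np.array([1 + 1j, 1 - 1j, -1 + 1j, -1 - 1j]) / np.sqrt(2)

rng = np.random.default_rng(0)
H = (rng.standard_normal((2, 2)) + 1j * rng.standard_normal((2, 2))) / np.sqrt(2)
x_true = qpsk[rng.integers(0, 4, size=2)]            # transmitted symbol vector
noise = 0.02 * (rng.standard_normal(2) + 1j * rng.standard_normal(2))
y = H @ x_true + noise                               # received vector

# Exhaustive ML: 4^2 = 16 candidate vectors for 2 antennas (grows as 4^Nt).
best_x, best_metric = None, np.inf
for cand in product(qpsk, repeat=2):
    x = np.array(cand)
    metric = np.linalg.norm(y - H @ x) ** 2
    if metric < best_metric:
        best_metric, best_x = metric, x

# At this low noise level the detected vector should match the transmitted one.
print("detected == transmitted:", np.allclose(best_x, x_true))
```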
Procedia PDF Downloads 610
7898 Student Learning and Motivation in an Interculturally Inclusive Classroom
Authors: Jonathan H. Westover, Jacque P. Westover, Maureen S. Andrade
Abstract:
Though learning theories vary in complexity and usefulness, a thorough understanding of foundational learning theories is a necessity in today’s educational environment. Additionally, learning theories lead to approaches in instruction that can affect student motivation and learning. The combination of a learning theory and elements to enhance student motivation can create a learning context where the student can thrive in their educational pursuits. This paper will provide an overview of three main learning theories: (1) Behavioral Theory, (2) Cognitive Theory, and (3) Constructivist Theory, and explore their connection to elements of student learning motivation. Finally, we apply these learning theories and elements of student motivation to the following two contexts: (1) The FastStart Program at the Community College of Denver, and (2) An Online Academic English Language Course. We discuss the potential of the program and course to increase student success outcomes.
Keywords: learning theory, student motivation, inclusive pedagogy, developmental education
Procedia PDF Downloads 256
7897 Software Engineering Revolution Driven by Complexity Science
Abstract:
This paper introduces a new software engineering paradigm based on complexity science, called NSE (Nonlinear Software Engineering paradigm). The purpose of establishing NSE is to help software development organizations double their productivity, halve their cost, and increase the quality of their products by several orders of magnitude simultaneously. NSE complies with the essential principles of complexity science. NSE brings revolutionary changes to almost all aspects of software engineering. NSE has been fully implemented with its support platform Panorama++.
Keywords: complexity science, software development, software engineering, software maintenance
Procedia PDF Downloads 265
7896 Visualizing the Commercial Activity of a City by Analyzing the Data Information in Layers
Authors: Taras Agryzkov, Jose L. Oliver, Leandro Tortosa, Jose Vicent
Abstract:
This paper aims to demonstrate how network models can be used to understand and deal with some aspects of urban complexity. As is well known, the Theory of Architecture and Urbanism has been using intellectual tools based on the ‘sciences of complexity’ for decades as a strategy to propose theoretical approaches about cities and about architecture. In this sense, it is possible to find a vast literature in which, for instance, network theory is used as an instrument to understand very diverse questions about cities: from their commercial activity to their heritage condition. The contribution of this research consists in adding one step of complexity to this process: instead of working with one single primal graph, as is usually done, we will show how new network models arise from the consideration of two different primal graphs interacting in two layers. When we model an urban network through a mathematical structure like a graph, the city is usually represented by a set of nodes and edges that reproduce its topology, with the data generated or extracted from the city embedded in it. All this information is normally displayed in a single layer. Here, we propose to separate the information into two layers so that we can evaluate the interaction between them. Besides, the two layers may be composed of structures that do not have to coincide: from this bi-layer system, groups of interactions emerge, suggesting reflections and, in consequence, possible actions.
Keywords: graphs, mathematics, networks, urban studies
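A minimal sketch of the two-layer idea follows (hypothetical node names and attributes; the authors' specific layers and embedded urban data are not reproduced): each layer is a primal graph over its own node set, and inter-layer edges carry the interaction between them.

```python
import networkx as nx

G = nx.Graph()

# Layer 1: a street-intersection graph (hypothetical nodes and links).
G.add_nodes_from(["s1", "s2", "s3", "s4"], layer="streets")
G.add_edges_from([("s1", "s2"), ("s2", "s3"), ("s3", "s4")], layer="streets")

# Layer 2: a graph of commercial premises (hypothetical activity values).
G.add_nodes_from([("c1", {"activity": 12}), ("c2", {"activity": 7}),
                  ("c3", {"activity": 20})], layer="commerce")
G.add_edge("c2", "c3", layer="commerce")          # e.g. premises in the same arcade

# Inter-layer edges: each premise attached to its nearest street intersection.
G.add_edges_from([("c1", "s1"), ("c2", "s3"), ("c3", "s4")], layer="coupling")

# Interactions emerge from paths that cross layers, e.g. a shop-to-shop route
# that has to pass through the street layer.
print(nx.shortest_path(G, "c1", "c3"))
```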
Procedia PDF Downloads 181
7895 Hardware Implementation and Real-time Experimental Validation of a Direction of Arrival Estimation Algorithm
Authors: Nizar Tayem, AbuMuhammad Moinuddeen, Ahmed A. Hussain, Redha M. Radaydeh
Abstract:
This research paper introduces an approach for estimating the direction of arrival (DOA) of multiple RF noncoherent sources in a uniform linear array (ULA). The proposed method utilizes a Capon-like estimation algorithm and incorporates LU decomposition to enhance the accuracy of DOA estimation while significantly reducing computational complexity compared to existing methods like the Capon method. Notably, the proposed method does not require prior knowledge of the number of sources. The proposed method is validated through both software simulations and practical experimentation on a prototype testbed constructed using a software-defined radio (SDR) platform and GNU Radio software. The results obtained from MATLAB simulations and real-time experiments provide compelling evidence of the proposed method's efficacy.
Keywords: DOA estimation, real-time validation, software defined radio, computational complexity, Capon's method, GNU Radio
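A generic Capon (MVDR) spatial-spectrum sketch for a ULA is given below for orientation, assuming half-wavelength element spacing and a covariance matrix estimated from snapshots; the paper's LU-based reduced-complexity variant is not reproduced here.

```python
import numpy as np

def steering(theta_deg, n_elements, d=0.5):
    """ULA steering vector, element spacing d in wavelengths."""
    k = np.arange(n_elements)
    return np.exp(-2j * np.pi * d * k * np.sin(np.deg2rad(theta_deg)))

rng = np.random.default_rng(0)
n_el, n_snap = 8, 400
true_doas = [-20.0, 35.0]

# Simulated snapshots: two uncorrelated unit-power sources plus noise.
A = np.column_stack([steering(t, n_el) for t in true_doas])
S = (rng.standard_normal((2, n_snap)) + 1j * rng.standard_normal((2, n_snap))) / np.sqrt(2)
W = 0.1 * (rng.standard_normal((n_el, n_snap)) + 1j * rng.standard_normal((n_el, n_snap)))
X = A @ S + W

R = X @ X.conj().T / n_snap                 # sample covariance
R_inv = np.linalg.inv(R)                    # a triangular-factor solve could replace this

grid = np.arange(-90, 90.5, 0.5)
spectrum = np.array([1.0 / np.real(steering(t, n_el).conj() @ R_inv @ steering(t, n_el))
                     for t in grid])

# Crude peak picking: local maxima of the spectrum, keep the two largest.
is_peak = (spectrum[1:-1] > spectrum[:-2]) & (spectrum[1:-1] > spectrum[2:])
peak_idx = np.where(is_peak)[0] + 1
top2 = peak_idx[np.argsort(spectrum[peak_idx])[-2:]]
print("estimated DOAs (deg):", sorted(grid[top2]))   # close to [-20, 35]
```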
Procedia PDF Downloads 75
7894 Taleb's Complexity Theory Concept of 'Antifragility' Has a Significant Contribution to Make to Positive Psychology as Applied to Wellbeing
Authors: Claudius Peter Van Wyk
Abstract:
Given the increasingly manifest phenomena, as described in complexity theory, of volatility, uncertainty, complexity, and ambiguity (VUCA), Taleb's notion of 'antifragility' has a significant contribution to make to positive psychology applied to wellbeing. Antifragility is argued to be fundamentally different from the concepts of resiliency, as the ability to recover from failure, and robustness, as the ability to resist failure. Rather, it describes the capacity to reorganise in the face of stress in such a way as to cope more effectively with systemic challenges. The concept, which has been applied in disciplines ranging from physics, molecular biology, planning, and engineering to computer science, can now be considered for its application to individual human and social wellbeing. There are strong correlations to Antonovsky's model of 'salutogenesis', in which an attitude and competencies are developed for transforming burdening factors into greater resourcefulness. We demonstrate, from the perspective of neuroscience, how technology measuring nervous system coherence can be coupled with acquired psychodynamic approaches to not only identify contextual stressors and utilise biofeedback instruments for facilitating greater coherence, but also apply these insights to specific life stressors that compromise wellbeing. Employing an ongoing case study with BMW South Africa, the neurological mapping is demonstrated together with 'reframing' and emotional anchoring techniques from neurolinguistic programming. The argument is contextualised in the discipline of psychoneuroimmunology, which describes the stress pathways from the CNS and endocrine systems and their impact on immune function and the capacity to restore homeostasis.
Keywords: antifragility, complexity, neuroscience, psychoneuroimmunology, salutogenesis, volatility
Procedia PDF Downloads 376
7893 Examining the Development of Complexity, Accuracy and Fluency in L2 Learners' Writing after L2 Instruction
Authors: Khaled Barkaoui
Abstract:
Research on second-language (L2) learning tends to focus on comparing students with different levels of proficiency at one point in time. However, to understand L2 development, we need more longitudinal research. In this study, we adopt a longitudinal approach to examine changes in three indicators of L2 ability, namely complexity, accuracy, and fluency (CAF), as reflected in the writing of L2 learners on different tasks before and after a period of L2 instruction. Each of 85 Chinese learners of English at three levels of English language proficiency responded to two writing tasks (independent and integrated) before and after nine months of English-language study in China. Each essay (N = 276) was analyzed in terms of numerous CAF indices using both computer coding and human rating: number of words written, number of errors per 100 words, ratings of error severity, global syntactic complexity (MLS), complexity by coordination (T/S), complexity by subordination (C/T), clausal complexity (MLC), phrasal complexity (NP density), syntactic variety, lexical density, lexical variation, lexical sophistication, and lexical bundles. Results were then compared statistically across tasks, L2 proficiency levels, and time. Overall, task type had significant effects on fluency, on some syntactic complexity indices (complexity by coordination, structural variety, clausal complexity, phrase complexity), and on lexical density, sophistication, and bundles, but not on accuracy. L2 proficiency had significant effects on fluency, accuracy, and lexical variation, but not on syntactic complexity. Finally, fluency, frequency of errors (but not accuracy ratings), syntactic complexity indices (clausal complexity, global complexity, complexity by subordination, phrase complexity, structural variety), and lexical complexity (lexical density, variation, and sophistication) exhibited significant changes after instruction, particularly for the independent task. We discuss the findings and their implications for assessment, instruction, and research on CAF in the context of L2 writing.
Keywords: second language writing, fluency, accuracy, complexity, longitudinal
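For readers unfamiliar with the syntactic-complexity indices named above, a hedged sketch of how such ratios are typically computed from per-essay counts is given below (illustrative definitions only; the study's actual coding scheme and human-rated measures are not reproduced).

```python
def caf_indices(words, sentences, clauses, t_units):
    """Common length/complexity ratios used in CAF research (illustrative)."""
    return {
        "MLS (mean length of sentence)": words / sentences,
        "MLC (mean length of clause)":   words / clauses,
        "C/T (clauses per T-unit)":      clauses / t_units,
        "T/S (T-units per sentence)":    t_units / sentences,
    }

# Hypothetical counts for one essay.
for name, value in caf_indices(words=310, sentences=18, clauses=41, t_units=25).items():
    print(f"{name}: {value:.2f}")
```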
Procedia PDF Downloads 153