Search results for: time complexity measurements
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 20892

20862 Latitudinal Patterns of Pre-industrial Human Cultural Diversity and Societal Complexity

Authors: Xin Chen

Abstract:

Pre-industrial old-world human cultural diversity and societal complexity exhibit remarkable geographic regularities. Along the latitudinal axis from the equator to the arctic, a descending trend of human ethno-cultural diversity coincides with a descending trend of biological diversity. Along the same latitudinal axis, pre-industrial human societal complexity peaks at intermediate latitudes. It is postulated that human cultural diversity and societal complexity are strongly influenced by collective learning, and that collective learning is positively related to human population size, social interactions, and environmental challenges. Under these postulations, the relationship between collective learning and important geographical-environmental factors, including climate and biodiversity/bio-productivity, is examined. A hypothesis of intermediate bio-productivity is formulated to account for these latitudinal patterns of pre-industrial human societal complexity.

Keywords: cultural diversity, societal complexity, latitudinal patterns, biodiversity, bio-productivity, collective learning

Procedia PDF Downloads 64
20861 A FE-Based Scheme for Computing Wave Interaction with Nonlinear Damage and Generation of Harmonics in Layered Composite Structures

Authors: R. K. Apalowo, D. Chronopoulos

Abstract:

A Finite Element (FE) based scheme is presented for quantifying guided wave interaction with Localised Nonlinear Structural Damage (LNSD) within structures of arbitrary layering and geometric complexity. The through-thickness mode-shape of the structure is obtained through a Wave and Finite Element (WFE) method. This is applied in a time-domain FE simulation in order to generate time-harmonic excitation for a specific wave mode. Interaction of the wave with LNSD within the system is computed through an element activation and deactivation iteration. The scheme is validated against experimental measurements and a WFE-FE methodology for calculating wave interaction with damage. Case studies for guided wave interaction with crack and delamination are presented to verify the robustness of the proposed method in classifying and identifying damage.

Keywords: layered structures, nonlinear ultrasound, wave interaction with nonlinear damage, wave finite element, finite element

Procedia PDF Downloads 139
20860 The Use of Boosted Multivariate Trees in Medical Decision-Making for Repeated Measurements

Authors: Ebru Turgal, Beyza Doganay Erdogan

Abstract:

Machine learning aims to model the relationship between the response and the features. Medical decision-making researchers would like to make decisions about patients’ course and treatment by examining repeated measurements over time. The boosting approach is now used in the machine learning area as an influential tool for these aims. The aim of this study is to show the use of multivariate tree boosting in this field. The main reason for utilizing this approach in the field of decision-making is the ease with which it handles complex relationships. To show how the multivariate tree boosting method can be used to identify important features and feature-time interactions, we used data collected retrospectively from Ankara University Chest Diseases Department records. The dataset includes repeated PF ratio measurements. The follow-up time is planned for 120 hours. A set of different models is tested. In conclusion, classification with a weighted combination of classifiers is a reliable method, as has been shown several times in simulation studies. Furthermore, time-varying variables are taken into consideration within this framework, and it becomes possible to make accurate decisions about regression and survival problems.
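A minimal sketch of the idea, assuming a gradient-boosted tree classifier over summary features of repeated measurements. The column names, synthetic data and outcome label below are illustrative assumptions, not the authors' Ankara University dataset, and scikit-learn's GradientBoostingClassifier stands in for multivariate tree boosting.

```python
# Hedged sketch: boosted trees on repeated-measurement (longitudinal) features.
# Feature names, synthetic data and the label rule are assumptions for illustration.
import numpy as np
import pandas as pd
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n_patients, n_visits = 200, 6                       # repeated PF-ratio measurements per patient
hours = np.linspace(0, 120, n_visits)               # 120-hour follow-up, as in the abstract

# Simulated per-patient trajectories (running means of noisy draws), purely illustrative.
pf = rng.normal(300, 60, size=(n_patients, n_visits)).cumsum(axis=1) / np.arange(1, n_visits + 1)

# One row per patient: summaries plus a feature-time interaction (slope over follow-up).
slope = np.polyfit(hours, pf.T, deg=1)[0]
X = pd.DataFrame({"pf_mean": pf.mean(axis=1),
                  "pf_last": pf[:, -1],
                  "pf_slope": slope})
y = (pf[:, -1] < 300).astype(int)                   # assumed outcome label for the demo

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
model = GradientBoostingClassifier(n_estimators=200, max_depth=3, learning_rate=0.05)
model.fit(X_tr, y_tr)
print("test accuracy:", model.score(X_te, y_te))
print("feature importances:", dict(zip(X.columns, model.feature_importances_.round(3))))
```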

Keywords: boosted multivariate trees, longitudinal data, multivariate regression tree, panel data

Procedia PDF Downloads 187
20859 Comparison of Petrophysical Relationships for Soil Water Content Estimation in a Peat Soil Area Using GPR Common-Offset Measurements

Authors: Nurul Izzati Abd Karim, Samira Albati Kamaruddin, Rozaimi Che Hasan

Abstract:

An appropriate petrophysical relationship is needed for Soil Water Content (SWC) estimation, especially when using Ground Penetrating Radar (GPR). Ground penetrating radar is a geophysical tool that provides the SWC parameter indirectly. This paper examines the performance of a few published petrophysical relationships used to obtain SWC estimates from in-situ GPR common-offset survey measurements, compared with gravimetric measurements, in a peat soil area. Gravimetric measurements were conducted to support the GPR measurements for the accuracy assessment. Further, GPR with dual frequencies (250 MHz and 700 MHz) was used in the survey measurements to obtain the dielectric permittivity. Three empirical equations (i.e., Roth’s equation, Schaap’s equation and Idi’s equation) were selected for the study and used to compute the soil water content from the dielectric permittivity of the GPR profile. The results indicate that Schaap’s equation provides a strong correlation with SWC as measured by the GPR data sets and gravimetric measurements.
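A minimal sketch of how a petrophysical relationship converts GPR-derived permittivity to SWC. The Roth, Schaap and Idi equations compared in the abstract are not reproduced here; the widely used Topp et al. (1980) polynomial is shown as a stand-in, and the sample velocities are assumed.

```python
# Hedged sketch: converting GPR-derived relative dielectric permittivity to
# volumetric soil water content (SWC). Topp et al. (1980) is used as a stand-in
# petrophysical relationship; input velocities are assumed for illustration.
import numpy as np

def swc_topp(eps_r):
    """Volumetric water content (m^3/m^3) from relative permittivity (Topp et al., 1980)."""
    eps_r = np.asarray(eps_r, dtype=float)
    return -5.3e-2 + 2.92e-2 * eps_r - 5.5e-4 * eps_r**2 + 4.3e-6 * eps_r**3

def permittivity_from_velocity(v):
    """Relative permittivity from GPR wave velocity v (m/ns), assuming a low-loss medium."""
    c = 0.2998  # speed of light in m/ns
    return (c / np.asarray(v, dtype=float)) ** 2

# Example: velocities picked from common-offset profiles, converted to permittivity and SWC.
eps = permittivity_from_velocity([0.06, 0.08, 0.12])
print("permittivity:", eps.round(1))
print("SWC estimate:", swc_topp(eps).round(3))
```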

Keywords: common-offset measurements, ground penetrating radar, petrophysical relationship, soil water content

Procedia PDF Downloads 235
20858 Hardware Implementation and Real-time Experimental Validation of a Direction of Arrival Estimation Algorithm

Authors: Nizar Tayem, AbuMuhammad Moinuddeen, Ahmed A. Hussain, Redha M. Radaydeh

Abstract:

This research paper introduces an approach for estimating the direction of arrival (DOA) of multiple noncoherent RF sources with a uniform linear array (ULA). The proposed method utilizes a Capon-like estimation algorithm and incorporates LU decomposition to enhance the accuracy of DOA estimation while significantly reducing computational complexity compared to existing methods such as the Capon method. Notably, the proposed method does not require prior knowledge of the number of sources. The proposed method is validated through both software simulations and practical experimentation on a prototype testbed constructed using a software-defined radio (SDR) platform and GNU Radio software. The results obtained from MATLAB simulations and real-time experiments provide compelling evidence of the proposed method's efficacy.
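A minimal numpy sketch of the standard Capon (MVDR) spatial spectrum for a ULA, as a baseline for the Capon-like estimator described above. Array size, source angles and SNR are assumptions; the paper's LU-decomposition variant is not reproduced here.

```python
# Hedged sketch: standard Capon (MVDR) DOA spectrum for a uniform linear array.
# Array geometry, source angles and noise level are illustrative assumptions.
import numpy as np

M, d, snapshots = 8, 0.5, 200                       # sensors, spacing (wavelengths), snapshots
true_angles = np.deg2rad([-20.0, 35.0])
rng = np.random.default_rng(1)

def steering(theta):
    return np.exp(2j * np.pi * d * np.arange(M)[:, None] * np.sin(theta))

# Simulate noncoherent narrowband sources plus white noise.
A = steering(true_angles)
S = (rng.normal(size=(2, snapshots)) + 1j * rng.normal(size=(2, snapshots))) / np.sqrt(2)
X = A @ S + 0.1 * (rng.normal(size=(M, snapshots)) + 1j * rng.normal(size=(M, snapshots)))

R = X @ X.conj().T / snapshots                      # sample covariance
R_inv = np.linalg.inv(R)

grid = np.deg2rad(np.linspace(-90, 90, 361))
a = steering(grid)
P = 1.0 / np.real(np.einsum('mg,mn,ng->g', a.conj(), R_inv, a))   # Capon spectrum

# Crude peak picking: two largest local maxima of the spectrum.
loc_max = np.where((P[1:-1] > P[:-2]) & (P[1:-1] > P[2:]))[0] + 1
top2 = loc_max[np.argsort(P[loc_max])[-2:]]
print("estimated DOAs (deg):", np.sort(np.rad2deg(grid[top2])).round(1))
```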

Keywords: DOA estimation, real-time validation, software defined radio, computational complexity, Capon's method, GNU radio

Procedia PDF Downloads 55
20857 An Estimating Equation for Survival Data with Possibly Time-Varying Covariates under Semiparametric Transformation Models

Authors: Yemane Hailu Fissuh, Zhongzhan Zhang

Abstract:

An estimating equation technique is an alternative to the widely used maximum likelihood methods, and it enables us to ease some of the complexity due to the complex characteristics of time-varying covariates. When both time-varying covariates and left-truncation are considered in the model, the maximum likelihood estimation procedure becomes much more burdensome and complex. To ease this complexity, this study proposes modified estimating equations, which have received high attention and consideration from many researchers, under a semiparametric transformation model. The purpose of this article is to develop the modified estimating equation under a flexible and general class of semiparametric transformation models for left-truncated and right-censored survival data with time-varying covariates. Besides the commonly applied Cox proportional hazards model, such problems can also be analyzed with a general class of semiparametric transformation models to estimate the effect of treatment, given possibly time-varying covariates, on the survival time. The consistency and asymptotic properties of the estimators were derived via the expectation-maximization (EM) algorithm. The finite-sample performance of the estimators for the proposed model was illustrated via simulation studies and the Stanford heart transplant real-data example. To sum up the study, the bias for covariates was adjusted by estimating the density function of the truncation time variable. Then the effect of possibly time-varying covariates was evaluated in some special semiparametric transformation models.

Keywords: EM algorithm, estimating equation, semiparametric transformation models, time-to-event outcomes, time varying covariate

Procedia PDF Downloads 141
20856 Measuring the Effectiveness of Response Inhibition with Regard to Motor Complexity: Evidence from the Stroop Effect

Authors: Germán Gálvez-García, Marta Lavin, Javiera Peña, Javier Albayay, Claudio Bascour, Jesus Fernandez-Gomez, Alicia Pérez-Gálvez

Abstract:

We studied the effectiveness of response inhibition in movements with different degrees of motor complexity when they were executed in isolation and alternately. Sixteen participants performed the Stroop task, which was used as a measure of response inhibition. Participants responded by lifting the index finger and by reaching the screen with the same finger. Both actions were performed separately and alternately in different experimental blocks. Repeated-measures ANOVAs were used to compare reaction time, movement time, kinematic errors and movement errors across conditions (experimental block, movement, and congruency). Delta plots were constructed to perform distributional analyses of response inhibition and accuracy rate. The effectiveness of response inhibition did not differ when the movements were performed in separate blocks. Nevertheless, it did differ when they were performed alternately in the same experimental block, being more effective for the lifting action. This could be due to competition for the available resources in a more complex scenario, which also demands adopting some strategy to avoid errors.

Keywords: response inhibition, motor complexity, Stroop task, delta plots

Procedia PDF Downloads 377
20855 Contractual Complexity and Contract Parties' Opportunistic Behavior in Construction Projects: In a Contractual Function View

Authors: Mengxia Jin, Yongqiang Chen, Wenqian Wang, Yu Wang

Abstract:

The complexity and specificity of construction projects have made opportunism a common phenomenon, and contractual governance for opportunism has been a topic of considerable ongoing research. Based on transaction cost economics (TCE), the research distinguishes control and coordination as different functions of the contract in order to investigate their complexity separately. In a nuanced way, the dimensionality of contractual control is also examined. Through the analysis of the motivation and capability of strong- or weak-form opportunism, the framework focuses on the relationship between the complexity of the above contractual dimensions and different types of opportunistic behavior, and attempts to verify the possible explanatory mechanism. The explanatory power of the research model is evaluated in the light of empirical evidence from questionnaires. We collect data from Chinese companies in the construction industry, and the data collection is still in progress. The findings will speak to the debate surrounding the effects of contract complexity on opportunistic behavior. This nuanced research will derive implications for research on the role of contractual mechanisms in dealing with inter-organizational opportunism and offer suggestions for curbing contract parties’ opportunistic behavior in construction projects.

Keywords: contractual complexity, contractual control, contractual coordination, opportunistic behavior

Procedia PDF Downloads 370
20854 Understanding Complexity at Pre-Construction Stage in Project Planning of Construction Projects

Authors: Mehran Barani Shikhrobat, Roger Flanagan

Abstract:

Construction planning and scheduling based on current tools and techniques is either deterministic in nature (Gantt chart, CPM) or applies only a very small probability of completion (PERT) to each task. However, every project embodies assumptions and influences and should start with a complete set of clearly defined goals and constraints that remain constant throughout the duration of the project. Construction planners continue to apply the traditional methods and tools of “hard” project management that were developed for “ideal projects,” neglecting the potential influence of complexity on the design and construction process. The aim of this research is to investigate the emergence and growth of complexity in project planning and to provide a model that considers the influence of complexity on the total project duration at the post-contract award pre-construction stage of a project. The literature review showed that complexity originates from different sources: environmental, technical, and workflow interactions. These can be divided into two categories of complexity factors: first, project tasks, and second, project organisation and management. Task-related complexity may originate from performance, lack of resources, or environmental changes for a specific task. Complexity factors that relate to organisation and management refer to workflow and the interdependence of different parts. The literature review highlighted the ineffectiveness of traditional tools and techniques in planning for complexity. In this research, the fundamental causes of the complexity of construction projects were therefore investigated through a questionnaire with industry experts. The results were used to develop a model that considers the core complexity factors and their interactions. System dynamics was used to investigate the model and the influence of complexity on project planning. Feedback from experts revealed 20 major complexity factors that impact project planning. The factors are divided into five categories known as core complexity factors. To understand the relative weight of each factor, the Analytical Hierarchy Process (AHP) method is used. The comparison showed that externalities are ranked as the biggest influence across the complexity factors. The research underlines that there are many internal and external factors that impact project activities and the project overall. It shows the importance of considering the influence of complexity on the project master plan undertaken at the post-contract award pre-construction phase of a project.
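A minimal sketch of the AHP weighting step mentioned above: priority weights of complexity categories computed from a pairwise comparison matrix, with a consistency check. The 5x5 judgment matrix and the category labels are invented for illustration, not the experts' actual responses.

```python
# Hedged sketch: AHP priority weights from a pairwise comparison matrix.
# The judgment matrix and category labels are illustrative assumptions.
import numpy as np

categories = ["externalities", "tasks", "organisation", "technology", "workflow"]  # assumed labels
# A[i, j] = how much more important category i is than category j (Saaty 1-9 scale).
A = np.array([[1,   3,   3,   5,   5],
              [1/3, 1,   2,   3,   3],
              [1/3, 1/2, 1,   2,   3],
              [1/5, 1/3, 1/2, 1,   2],
              [1/5, 1/3, 1/3, 1/2, 1]], dtype=float)

# Priority vector: principal eigenvector of A, normalised to sum to 1.
eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)
w = np.abs(eigvecs[:, k].real)
w /= w.sum()

# Consistency check: CR = CI / RI, usually required to be below 0.10.
n = A.shape[0]
CI = (eigvals[k].real - n) / (n - 1)
RI = {3: 0.58, 4: 0.90, 5: 1.12, 6: 1.24, 7: 1.32}[n]   # Saaty's random indices
print({c: round(float(x), 3) for c, x in zip(categories, w)})
print("consistency ratio:", round(CI / RI, 3))
```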

Keywords: project planning, project complexity measurement, planning uncertainty management, project risk management, strategic project scheduling

Procedia PDF Downloads 91
20853 Heuristic to Generate Random X-Monotone Polygons

Authors: Kamaljit Pati, Manas Kumar Mohanty, Sanjib Sadhu

Abstract:

A heuristic has been designed to generate a random simple monotone polygon from a given set of ‘n’ points lying on a two-dimensional plane. Our heuristic generates a random monotone polygon in O(n) time after O(n log n) preprocessing time, which improves over the previous work, where a random monotone polygon is produced in the same O(n) time but the preprocessing time is O(k) for n < k < n². However, our heuristic does not generate all possible random polygons with uniform probability. The space complexity of our proposed heuristic is O(n).

Keywords: sorting, monotone polygon, visibility, chain

Procedia PDF Downloads 411
20852 Analysis of Supply Chain Complexity Sub-Dimensions for Garment Industry

Authors: Niyanta Mehra, Aakriti Khurania, Kshitij Rastogi, S. K. Garg

Abstract:

There is plenty of literature available that accounts for complexity management in a supply chain. A major fraction of this literature considers a large number of parameters in order to devise management techniques. However, multiple such parameters do not directly affect the result, and incorporating these can make the analyses overly complicated. Most of the causes of supply chain inefficiencies are due to the interconnectedness and interdependencies in the structure, processes, and environment of the supply chains. The level of complexity varies across industries in terms of intensity and ease of management. After a review of the literature related to complexities in supply chains, the paper attempts to build a framework to study the relative significance of these complexities. This paper aims to identify critical complexities for the garment industry. Understanding and controlling these complexities open avenues for better supply chain management and also assist decision-makers in the garment industry in formulating risk mitigation strategies.

Keywords: complexity dimensions, garment industry, supply chain complexity, supply chain management

Procedia PDF Downloads 129
20851 Performance Complexity Measurement of Tightening Equipment Based on Kolmogorov Entropy

Authors: Guoliang Fan, Aiping Li, Xuemei Liu, Liyun Xu

Abstract:

The performance of tightening equipment declines over the course of the working process in a manufacturing system. The main manifestations are increases in the randomness and in the degree of discretization of the tightening performance. To evaluate the degradation tendency of the tightening performance accurately, a complexity measurement approach based on Kolmogorov entropy is presented. First, the states of the performance index are divided in order to calibrate the degree of discretization. Then the complexity measurement model based on Kolmogorov entropy is built. The model describes the performance degradation tendency of tightening equipment quantitatively. Finally, a case study is used to verify the efficiency and validity of the approach. The results show that the presented complexity measurement can effectively evaluate the degradation tendency of the tightening equipment. It can provide a theoretical basis for preventive maintenance and life prediction of equipment.

Keywords: complexity measurement, Kolmogorov entropy, manufacturing system, performance evaluation, tightening equipment

Procedia PDF Downloads 248
20850 Complex Decision Rules in the Form of Decision Trees

Authors: Avinash S. Jagtap, Sharad D. Gore, Rajendra G. Gurao

Abstract:

Decision rules become more and more complex as the number of conditions increases. As a consequence, the complexity of the decision rule also influences the time complexity of a computer implementation of such a rule. Consider, for example, a decision that depends on four conditions A, B, C and D. For simplicity, suppose each of these four conditions is binary. Even then the decision rule will consist of 16 lines, where each line will be of the form: if A and B and C and D, then action 1; if A and B and C but not D, then action 2; and so on. While executing this decision rule, each of the four conditions will be checked every time until all the four conditions in a line are satisfied. The minimum number of logical comparisons is 4, whereas the maximum number is 64. This paper proposes to present a complex decision rule in the form of a decision tree. A decision tree divides the cases into branches every time a condition is checked. In the form of a decision tree, every branching eliminates the half of the cases that does not satisfy the related condition. As a result, every branch of the decision tree involves only four logical comparisons and hence is significantly simpler than the corresponding complex decision rule. The conclusion of this paper is that every complex decision rule can be represented as a decision tree, and the decision tree is mathematically equivalent but computationally much simpler than the original complex decision rule.
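A small sketch of the contrast described above: the flat 16-line rule versus the equivalent nested decision tree, in which every evaluation path performs exactly four comparisons. The action numbers are placeholders.

```python
# Sketch: the same four-condition decision as a flat rule list vs. a nested tree.
# Action numbers are placeholders for illustration.

def action_flat(a, b, c, d):
    """Flat rule list: up to 16 lines x 4 conditions = 64 comparisons in the worst case."""
    if a and b and c and d:         return 1
    if a and b and c and not d:     return 2
    if a and b and not c and d:     return 3
    # ... the remaining 13 lines follow the same pattern ...
    return 16

def action_tree(a, b, c, d):
    """Equivalent decision tree: every path checks exactly four conditions."""
    if a:
        if b:
            return (1 if d else 2) if c else (3 if d else 4)
        else:
            return (5 if d else 6) if c else (7 if d else 8)
    else:
        if b:
            return (9 if d else 10) if c else (11 if d else 12)
        else:
            return (13 if d else 14) if c else (15 if d else 16)

assert action_tree(True, True, True, True) == action_flat(True, True, True, True) == 1
assert action_tree(True, True, True, False) == action_flat(True, True, True, False) == 2
```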

Keywords: strategic, tactical, operational, adaptive, innovative

Procedia PDF Downloads 262
20849 Towards a Simulation Model to Ensure the Availability of Machines in Maintenance Activities

Authors: Maryam Gallab, Hafida Bouloiz, Youness Chater, Mohamed Tkiouat

Abstract:

The aim of this paper is to present a model based on multi-agent systems in order to manage maintenance activities and to ensure the reliability and availability of machines with just the required resources (operators, tools). The interest of the simulation is to handle the complexity of the system and to obtain results without cost or wasted time. An implementation of the model is carried out on the AnyLogic platform to display the defined performance indicators.

Keywords: maintenance, complexity, simulation, multi-agent systems, AnyLogic platform

Procedia PDF Downloads 287
20848 Environmental Radioactivity Analysis by a Sequential Approach

Authors: G. Medkour Ishak-Boushaki, A. Taibi, M. Allab

Abstract:

Quantitative environmental radioactivity measurements are needed to determine the level of exposure of a population to ionizing radiation and for the assessment of the associated risks. Gamma spectrometry remains a very powerful tool for the analysis of radionuclides present in an environmental sample, but the basic problem in such measurements is the low rate of detected events. Using large environmental samples could help to get around this difficulty but, unfortunately, new issues are raised by gamma-ray attenuation and self-absorption. Recently, a new method has been suggested to detect and identify, without quantification and in a short time, the gamma rays of a low-count source. This method does not require a pulse height spectrum acquisition, as usually adopted in gamma spectrometry measurements. It is based on a chronological record of each detected photon by simultaneous measurement of its energy ε and its arrival time τ at the detector, the pair of parameters [ε,τ] defining an event mode sequence (EMS). The EMS series are analyzed sequentially by a Bayesian approach to detect the presence of a given radioactive source. The main object of the present work is to test the applicability of this sequential approach to the detection of radioactive environmental materials. Moreover, for an appropriate health oversight of the public and of the concerned workers, the analysis has been extended to obtain a reliable quantification of the radionuclides present in environmental samples. For illustration, we consider as an example the problem of detection and quantification of 238U. A Monte Carlo simulated experiment is carried out, consisting of the detection, by a Ge(HP) semiconductor junction, of the 63 keV gamma rays emitted by 234Th (progeny of 238U). The generated EMS series are analyzed by Bayesian inference. The application of the sequential Bayesian approach in environmental radioactivity analysis offers the possibility of reducing the measurement time without requiring large environmental samples, and consequently avoids the associated inconveniences. The work is still in progress.
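A minimal sketch of an event-by-event Bayesian update of the probability that a source is present, using only the recorded energies ε of the event mode sequence. The flat background, the Gaussian 63 keV peak model, the mixing fraction and the simulated events are all assumptions, not the authors' detector model, and the quantification step is not reproduced.

```python
# Hedged sketch: sequential Bayesian update of P(source present) from an event
# mode sequence, using energies only (arrival-time/rate information ignored).
# All distributions and parameters below are illustrative assumptions.
import numpy as np

E_MIN, E_MAX = 20.0, 200.0          # analysed energy window (keV), assumed
PEAK, SIGMA = 63.0, 1.5             # 234Th line and assumed detector resolution
F_SRC = 0.15                        # assumed fraction of events coming from the source

def pdf_background(e):
    return np.full_like(e, 1.0 / (E_MAX - E_MIN))            # flat continuum (assumption)

def pdf_with_source(e):
    peak = np.exp(-0.5 * ((e - PEAK) / SIGMA) ** 2) / (SIGMA * np.sqrt(2 * np.pi))
    return (1 - F_SRC) * pdf_background(e) + F_SRC * peak     # mixture under "source present"

rng = np.random.default_rng(3)
n_events = 400
is_peak = rng.random(n_events) < F_SRC
energies = np.where(is_peak, rng.normal(PEAK, SIGMA, n_events),
                    rng.uniform(E_MIN, E_MAX, n_events))      # simulated EMS energies

log_odds = 0.0                                                # prior odds 1:1
for e in energies:                                            # sequential, event by event
    e = np.array([e])
    log_odds += np.log(pdf_with_source(e)[0] / pdf_background(e)[0])

p_source = 1.0 / (1.0 + np.exp(-log_odds))
print("posterior P(source present) after %d events: %.4f" % (n_events, p_source))
```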

Keywords: Bayesian approach, event mode sequence, gamma spectrometry, Monte Carlo method

Procedia PDF Downloads 482
20847 Holographic Visualisation of 3D Point Clouds in Real-time Measurements: A Proof of Concept Study

Authors: Henrique Fernandes, Sofia Catalucci, Richard Leach, Kapil Sugand

Abstract:

Background: Holograms are 3D images formed by the interference of light beams from a laser or other coherent light source. Pepper’s ghost is a form of hologram popularised in the 19th century. Combining holographic visualisation with metrology measuring techniques, by displaying measurements taken in real time in holographic form, can assist in research and education. New structural designs such as the Plexiglass Stand and the Hologram Box can optimise the holographic experience. Method: The equipment used included: (i) Zeiss’s ATOS Core 300 optical coordinate measuring instrument, which scanned real-world objects; (ii) CloudCompare, open-source software used for point cloud processing; and (iii) the Hologram Box, designed and manufactured during this research to provide the blackout environment needed to display 3D point clouds from real-time measurements in holographic format, in addition to adding portability to the holograms. The equipment was tailored to realise the goal of displaying measurements in an innovative technique and to improve on conventional methods. Three test scans were completed before doing a holographic conversion. Results: The outcome was a precise recreation of the original object in holographic form, presented with dense point clouds and surface density features in a colour map. Conclusion: This work establishes a way to visualise data in a point cloud system. To our knowledge, this is work that has not been attempted before. This achievement provides an advancement in holographic visualisation. The Hologram Box could be used as a feedback tool for measurement quality control and verification in future smart factories.

Keywords: holography, 3D scans, hologram box, metrology, point cloud

Procedia PDF Downloads 73
20846 Identifying Chaotic Architecture: Origins of Nonlinear Design Theory

Authors: Mohammadsadegh Zanganehfar

Abstract:

Since the modernism movement and the appearance of modern architecture, an aggressive desire for a general design theory has emerged in the theoretical works of architects, in the form of books and essays. Since Robert Venturi and Denise Scott Brown published Complexity and Contradiction in Architecture in 1966, the discourse of complexity and volumetric composition has been an important and controversial issue in the discipline. Ever since, various theories and essays have contributed to this discourse. This paper attempts to identify chaos theory as a scientific model of complexity and its relation to architectural design theory by conducting a qualitative analysis and a multidisciplinary critical approach through architecture and basic science resources. As a result, we identify chaotic architecture as the correlation of chaos theory and architecture, an independent nonlinear design theory with specific characteristics and properties.

Keywords: architecture complexity, chaos theory, fractals, nonlinear dynamic systems, nonlinear ontology

Procedia PDF Downloads 356
20845 A Subband BSS Structure with Reduced Complexity and Fast Convergence

Authors: Salah Al-Din I. Badran, Samad Ahmadi, Ismail Shahin

Abstract:

A blind source separation method is proposed; in this method, we use a non-uniform filter bank and a novel normalisation. This method provides reduced computational complexity and increased convergence speed compared to the full-band algorithm. Recently, adaptive sub-band schemes have been recommended to solve two problems: reducing the computational complexity and increasing the convergence speed of the adaptive algorithm for correlated input signals. In this work, the reduction in computational complexity is achieved with the use of adaptive filters of orders less than those of the full-band adaptive filters, which operate at a sampling rate lower than the sampling rate of the input signal. The signals decomposed by the analysis filter bank are less correlated in each subband than the input signal at full bandwidth, and this can promote better rates of convergence.
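A back-of-envelope sketch of the complexity argument above: splitting a full-band adaptive filter of order N across M decimated subbands shrinks both the filter lengths and the update rate. The filter order, number of subbands and sampling rate are assumptions, and the paper's non-uniform filter bank and normalisation are not modelled.

```python
# Hedged sketch: rough complexity comparison between a full-band adaptive filter
# and a uniform M-subband scheme with decimated, shorter filters (assumed numbers).
fs = 16_000          # input sampling rate (Hz), assumed
N = 1024             # full-band adaptive filter order, assumed
M = 8                # number of (uniform) subbands, assumed

# LMS-style cost: roughly 2*N multiply-accumulates per processed sample.
fullband_macs_per_sec = 2 * N * fs

# Each subband filter has order ~N/M and runs at the decimated rate fs/M.
subband_macs_per_sec = M * (2 * (N // M) * (fs // M))

print("full-band:", fullband_macs_per_sec, "MAC/s")
print("subband  :", subband_macs_per_sec, "MAC/s")
print("reduction factor ~ %.1fx (ignoring the analysis/synthesis filter bank cost)"
      % (fullband_macs_per_sec / subband_macs_per_sec))
```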

Keywords: blind source separation, computational complexity, subband, convergence speed, mixture

Procedia PDF Downloads 560
20844 Time Synchronization between the eNBs in E-UTRAN under the Asymmetric IP Network

Authors: M. Kollar, A. Zieba

Abstract:

In this paper, we present a method for time synchronization between two eNodeBs (eNBs) in an E-UTRAN (Evolved Universal Terrestrial Radio Access Network). The two eNBs cooperate in the so-called inter-eNB CA (Carrier Aggregation) case and are connected via an asymmetric IP network. We solve the problem by using broadcast signals generated in E-UTRAN as synchronization signals. The results show that time synchronization with the proposed method is possible with an error significantly less than 1 ms, which is sufficient considering that the transmission time interval in E-UTRAN is 1 ms. This makes this low-complexity method more suitable than the Network Time Protocol (NTP) in mobile applications with generated broadcast signals, where time synchronization over an asymmetric network is required.
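A minimal sketch of the core idea: estimating the time offset between two nodes by correlating each node's received samples against a common broadcast reference sequence. The waveform, sample rate and injected offset are assumptions, not the actual E-UTRAN broadcast signal structure.

```python
# Hedged sketch: offset estimation by correlating received samples against a
# known broadcast-like reference. Waveform, rate and offset are assumptions.
import numpy as np

rng = np.random.default_rng(7)
fs = 1_000_000                                   # assumed sample rate (1 Msps)
ref = rng.choice([-1.0, 1.0], size=512)          # known broadcast-like reference sequence

def received(offset_samples, noise=0.3):
    x = np.zeros(4096)
    x[offset_samples:offset_samples + ref.size] = ref
    return x + noise * rng.normal(size=x.size)

rx_a = received(1000)                            # node A observes the broadcast at sample 1000
rx_b = received(1738)                            # node B observes it 738 samples later

def arrival_index(rx):
    corr = np.correlate(rx, ref, mode='valid')   # sliding correlation with the known sequence
    return int(np.argmax(corr))

offset = arrival_index(rx_b) - arrival_index(rx_a)
print("estimated offset: %d samples = %.1f us" % (offset, 1e6 * offset / fs))
```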

Keywords: IP scheduled throughput, E-UTRAN, Evolved Universal Terrestrial Radio Access Network, NTP, Network Time Protocol, asymmetric network, delay

Procedia PDF Downloads 346
20843 Monitoring Synthesis of Biodiesel through Online Density Measurements

Authors: Arnaldo G. de Oliveira, Jr, Matthieu Tubino

Abstract:

The transesterification process of triglycerides with alcohols that occurs during biodiesel synthesis causes continuous changes in several physical properties of the reaction mixture, such as refractive index, viscosity and density. Among them, density can be a useful parameter to monitor the reaction, in order to predict the composition of the reacting mixture and to verify the conversion of the oil into biodiesel. In this context, a system was constructed in order to continuously determine changes in the density of the reacting mixture containing soybean oil, methanol and sodium methoxide (30 % w/w solution in methanol), stirred at 620 rpm at room temperature (about 27 °C). A polyethylene pipe network connected to a peristaltic pump was used to collect the mixture and pump it through a coil fixed on the plate of an analytical balance. The collected mass values were used to trace a curve correlating the mass of the system with the reaction time. The density variation profile versus time clearly shows three different steps: 1) the dispersion of methanol in oil causes a decrease in the system mass due to the lower density of the alcohol, followed by stabilization; 2) the addition of the catalyst (sodium methoxide) causes a larger decrease in mass compared to the first step (dispersion of methanol in oil) because of the conversion of the oil into biodiesel; 3) the final stabilization, denoting the end of the reaction. This density variation profile provides information that was used to predict the composition of the mixture over time and the reaction rate. Precise knowledge of the duration of the synthesis means saving time and resources in a production-scale system. This kind of monitoring provides several interesting features, such as continuous measurements without collecting aliquots.

Keywords: biodiesel, density measurements, online continuous monitoring, synthesis

Procedia PDF Downloads 564
20842 Predicting Stack Overflow Accepted Answers Using Features and Models with Varying Degrees of Complexity

Authors: Osayande Pascal Omondiagbe, Sherlock A. Licorish

Abstract:

Stack Overflow is a popular community question and answer portal which is used by practitioners to solve technology-related challenges during software development. Previous studies have shown that this forum is becoming a substitute for official programming language documentation. While tools have sought to aid developers by presenting interfaces to explore Stack Overflow, developers often face challenges searching through many possible answers to their questions, and this extends the development time. To this end, researchers have provided ways of predicting acceptable Stack Overflow answers by using various modeling techniques. However, less interest has been dedicated to examining the performance and quality of typically used modeling methods, especially in relation to model and feature complexity. Such insights could be of practical significance to the many practitioners that use Stack Overflow. This study examines the performance and quality of various modeling methods that are used for predicting acceptable answers on Stack Overflow, drawn from 2014, 2015 and 2016. Our findings reveal significant differences in models’ performance and quality given the type of features and the complexity of the models used. Researchers examining classifiers’ performance and quality and features’ complexity may leverage these findings in selecting suitable techniques when developing prediction models.
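A minimal sketch of one low-complexity model/feature configuration of the kind such a comparison covers: a random forest over a few shallow answer features. The feature names, label rule and synthetic data are assumptions, not the study's Stack Overflow dataset.

```python
# Hedged sketch: predicting accepted answers with a random forest over shallow
# features. Features, label rule and data are illustrative assumptions only.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(42)
n = 2000
X = np.column_stack([
    rng.integers(0, 5000, n),        # answerer reputation (assumed feature)
    rng.integers(0, 30, n),          # lines of code in the answer (assumed feature)
    rng.integers(0, 86400, n),       # seconds between question and answer (assumed feature)
    rng.integers(0, 20, n),          # number of competing answers (assumed feature)
])
# Synthetic label loosely favouring fast, high-reputation answers (illustration only).
y = ((X[:, 0] > 1000) & (X[:, 2] < 21600) | (rng.random(n) < 0.1)).astype(int)

clf = RandomForestClassifier(n_estimators=200, max_depth=6, random_state=0)
scores = cross_val_score(clf, X, y, cv=5, scoring='f1')
print("5-fold F1: %.3f +/- %.3f" % (scores.mean(), scores.std()))
```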

Keywords: feature selection, modeling and prediction, neural network, random forest, stack overflow

Procedia PDF Downloads 117
20841 Team Cognitive Heterogeneity and Strategic Decision-Making Flexibility: The Role of Transactive Memory System and Task Complexity

Authors: Rui Xing, Baolin Ye, Nan Zhou, Guohong Wang

Abstract:

Drawing upon a perspective of cognitive interaction, this study explores the relationship between team cognitive heterogeneity and team strategic decision-making flexibility, treating the transactive memory system as a mediator and task complexity as a moderator. The hypotheses were tested in linear regression models by using data gathered from 67 strategic decision-making teams in the new-energy vehicle industry. It is found that team cognitive heterogeneity has a positive impact on strategic decision-making flexibility through the mediation of specialization and coordination of the transactive memory system, which is positively moderated by task complexity.

Keywords: strategic decision-making flexibility, team cognitive heterogeneity, transactive memory system, task complexity

Procedia PDF Downloads 53
20840 A Holistic Workflow Modeling Method for Business Process Redesign

Authors: Heejung Lee

Abstract:

In a highly competitive environment, it becomes more important to shorten the whole business process while delivering or even enhancing the business value to the customers and suppliers. Although workflow management systems receive much attention for their capacity to practically support business process enactment, effective workflow modeling methods remain challenging, and the high degree of process complexity makes it more difficult to achieve a short lead time. This paper presents a holistic workflow structuring method that can reduce process complexity using activity needs and formal concept analysis, which eventually enhances key performance measures such as quality, delivery, and cost in the business process.

Keywords: workflow management, re-engineering, formal concept analysis, business process

Procedia PDF Downloads 389
20839 Electroencephalography (EEG) Analysis of Alcoholic and Control Subjects Using Multiscale Permutation Entropy

Authors: Lal Hussain, Wajid Aziz, Sajjad Ahmed Nadeem, Saeed Arif Shah, Abdul Majid

Abstract:

Brain electrical activity, as reflected in the Electroencephalogram (EEG), has been analyzed and diagnosed using various techniques. Among them, complexity measures, nonlinearity, disorder, and unpredictability play a vital role, due to the nonlinear interconnection between the functional and anatomical subsystems that emerges in the brain in the healthy state and during various diseases. Alcohol abuse has many social and economic consequences and is associated with impairments of memory, decision-making, and concentration. Alcoholism not only harms the brain but is also associated with emotional, behavioral, and cognitive impairments, damaging the white and gray brain matter. A recently developed signal analysis method, Multiscale Permutation Entropy (MPE), is proposed to estimate the complexity of the long-range temporally correlated EEG time series of alcoholic and control subjects acquired from the University of California Machine Learning Repository, and the results are compared with Multiscale Sample Entropy (MSE). Using MPE, the coarse-grained series are first generated, and the PE is computed for each coarse-grained time series for the electrodes O1, O2, C3, C4, F2, F3, F4, F7, F8, Fp1, Fp2, P3, P4, T7, and T8. The results computed for each electrode using MPE give more highly significant values, as well as larger mean rank differences, than MSE. Likewise, the ROC and the area under the ROC curve also give a higher separation for each electrode using MPE in comparison to MSE.
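A minimal sketch of MPE as commonly defined: coarse-grain the series at each scale, then compute the normalised permutation entropy of the ordinal patterns. The embedding order, scales and the synthetic test signal are assumptions, not the EEG recordings used in the study.

```python
# Hedged sketch: multiscale permutation entropy (MPE). Embedding order, scales
# and the white-noise test signal are illustrative assumptions.
import math
import numpy as np
from itertools import permutations

def coarse_grain(x, scale):
    n = len(x) // scale
    return x[:n * scale].reshape(n, scale).mean(axis=1)

def permutation_entropy(x, order=3, normalise=True):
    patterns = {p: 0 for p in permutations(range(order))}
    for i in range(len(x) - order + 1):
        patterns[tuple(np.argsort(x[i:i + order]).tolist())] += 1
    counts = np.array([c for c in patterns.values() if c > 0], dtype=float)
    p = counts / counts.sum()
    h = -np.sum(p * np.log(p))
    return h / np.log(math.factorial(order)) if normalise else h

def multiscale_permutation_entropy(x, order=3, scales=range(1, 11)):
    return [permutation_entropy(coarse_grain(x, s), order) for s in scales]

# Illustrative check: white noise should stay near 1 across scales.
rng = np.random.default_rng(0)
signal = rng.normal(size=5000)
print([round(v, 3) for v in multiscale_permutation_entropy(signal)])
```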

Keywords: electroencephalogram (EEG), multiscale permutation entropy (MPE), multiscale sample entropy (MSE), permutation entropy (PE), mann whitney test (MMT), receiver operator curve (ROC), complexity measure

Procedia PDF Downloads 473
20838 Variable Tree Structure QR Decomposition-M Algorithm (QRD-M) in Multiple Input Multiple Output-Orthogonal Frequency Division Multiplexing (MIMO-OFDM) Systems

Authors: Jae-Hyun Ro, Jong-Kwang Kim, Chang-Hee Kang, Hyoung-Kyu Song

Abstract:

In multiple input multiple output-orthogonal frequency division multiplexing (MIMO-OFDM) systems, the QR decomposition-M algorithm (QRD-M) has suboptimal error performance. However, the QRD-M still has high complexity due to the many calculations at each layer of the tree structure. To reduce the complexity of the QRD-M, the proposed QRD-M modifies the existing tree structure by eliminating unnecessary candidates at almost all layers. The elimination method discards the candidates whose accumulated squared Euclidean distances are larger than a calculated threshold. The simulation results show that the proposed QRD-M has the same bit error rate (BER) performance as the conventional QRD-M, with lower complexity.
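A minimal numpy sketch of QRD-M detection with the kind of threshold pruning described above: partial candidates whose accumulated squared Euclidean distance exceeds a threshold are dropped before keeping the M best. System size, the real-valued constellation and the threshold rule are assumptions.

```python
# Hedged sketch: QRD-M detection for a small real-valued MIMO system with
# threshold-based pruning of candidates (all parameters are assumptions).
import numpy as np

rng = np.random.default_rng(5)
Nt = Nr = 4
constellation = np.array([-3.0, -1.0, 1.0, 3.0])        # real 4-PAM stand-in
M_keep = 4                                              # survivors kept per layer

s_true = rng.choice(constellation, Nt)
H = rng.normal(size=(Nr, Nt))
y = H @ s_true + 0.1 * rng.normal(size=Nr)

Q, R = np.linalg.qr(H)
z = Q.T @ y

# Tree search from the last layer (row Nt-1) up to row 0.
candidates = [([], 0.0)]                                # (symbols chosen bottom-up, metric)
for layer in range(Nt - 1, -1, -1):
    expanded = []
    for partial, metric in candidates:
        fixed = np.array(partial[::-1])                 # symbols for layers layer+1 .. Nt-1
        for sym in constellation:
            interference = R[layer, layer + 1:] @ fixed if fixed.size else 0.0
            inc = (z[layer] - R[layer, layer] * sym - interference) ** 2
            expanded.append((partial + [sym], metric + inc))
    expanded.sort(key=lambda c: c[1])
    threshold = expanded[0][1] + 2.0                    # assumed pruning rule
    candidates = [c for c in expanded if c[1] <= threshold][:M_keep]

best = np.array(candidates[0][0][::-1])
print("true symbols     :", s_true)
print("detected symbols :", best)
```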

Keywords: complexity, MIMO-OFDM, QRD-M, squared Euclidean distance

Procedia PDF Downloads 317
20837 A Comprehensive Evaluation of Supervised Machine Learning for the Phase Identification Problem

Authors: Brandon Foggo, Nanpeng Yu

Abstract:

Power distribution circuits undergo frequent network topology changes that are often left undocumented. As a result, the documentation of a circuit’s connectivity becomes inaccurate with time. The lack of reliable circuit connectivity information is one of the biggest obstacles to modeling, monitoring, and controlling modern distribution systems. To enhance the reliability and efficiency of electric power distribution systems, the circuit’s connectivity information must be updated periodically. This paper focuses on one critical component of a distribution circuit’s topology - the secondary transformer to phase association. This topology component describes the set of phase lines that feed power to a given secondary transformer (and therefore a given group of power consumers). Finding the documentation of this component is called Phase Identification, and it is typically performed with physical measurements. These measurements can take time lengths on the order of several months, but with supervised learning, the time length can be reduced significantly. This paper compares several such methods applied to Phase Identification for a large range of real distribution circuits, describes a method of training data selection, describes preprocessing steps unique to the Phase Identification problem, and ultimately describes a method which obtains high accuracy (> 96% in most cases, > 92% in the worst case) using only 5% of the measurements typically used for Phase Identification.

Keywords: distribution network, machine learning, network topology, phase identification, smart grid

Procedia PDF Downloads 280
20836 The Effects of Anthropomorphism on Complex Technological Innovations

Authors: Chyi Jaw

Abstract:

Many companies have suffered as a result of consumers’ rejection of complex new products and have experienced huge losses in the market. Marketers have to understand what barriers to new technology adoption or to a positive product attitude may exist in the market. This research examines the effects of techno-complexity and anthropomorphism on consumer psychology and product attitude when new technologies are introduced to the market. This study conducted a pretest and a 2 x 2 between-subjects experiment. Four simulated experimental web pages were constructed to collect data. The empirical analysis tested the moderation-mediation relationships among techno-complexity, technology anxiety, ability, and product attitude. The empirical results indicate that (1) the techno-complexity of an innovation is negatively related to consumers’ product attitude, and it increases consumers’ technology anxiety and reduces their self-perceived ability. (2) Consumers’ technology anxiety and ability perception towards an innovation completely mediate the relationship between techno-complexity and product attitude. (3) Product anthropomorphism is positively related to consumers’ attitude toward the new technology, and it also significantly moderates the effect of techno-complexity in the hypothesized model. This work presents the moderation-mediation model and the effects of an anthropomorphizing strategy, which describe how managers can better predict and influence the diffusion of complex technological innovations.

Keywords: ability, anthropomorphic effect, innovation, techno-complexity, technology anxiety

Procedia PDF Downloads 176
20835 Robust Numerical Scheme for Pricing American Options under Jump Diffusion Models

Authors: Salah Alrabeei, Mohammad Yousuf

Abstract:

The goal of option pricing theory is to help investors manage their money, enhance returns, and control their financial future by theoretically valuing their options. However, most option pricing models have no analytical solution. Furthermore, not all numerical methods are efficient for solving these models, because of nonsmooth payoffs or discontinuous derivatives at the exercise price. In this paper, we solve the American option under jump-diffusion models by using efficient time-dependent numerical methods. Several techniques are integrated to overcome the computational complexity. The Fast Fourier Transform (FFT) algorithm is used as a matrix-vector multiplication solver, which reduces the complexity from O(M²) to O(M log M). The partial fraction decomposition technique is applied to rational approximation schemes to overcome the complexity of inverting polynomials of matrices. The proposed method is easy to implement in serial or parallel versions. Numerical results are presented to prove the accuracy and efficiency of the proposed method.
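A minimal sketch of the FFT trick mentioned above: a Toeplitz matrix-vector product (the structure that arises when discretising the jump integral term) is embedded in a circulant matrix and applied in O(M log M) with the FFT. The Toeplitz entries and the test vector are arbitrary illustrations, not the paper's discretisation.

```python
# Hedged sketch: O(M log M) Toeplitz matrix-vector product via circulant
# embedding and FFT. Entries and vector are arbitrary illustrative values.
import numpy as np

def toeplitz_matvec_fft(first_col, first_row, x):
    """Compute T @ x in O(M log M), where T is Toeplitz with the given first column/row."""
    M = len(x)
    # Embed T in a 2M circulant whose first column is [first_col, 0, reversed(first_row[1:])].
    c = np.concatenate([first_col, [0.0], first_row[:0:-1]])
    y = np.fft.ifft(np.fft.fft(c) * np.fft.fft(np.concatenate([x, np.zeros(M)])))
    return y[:M].real

M = 8
first_col = np.exp(-0.5 * np.arange(M))        # arbitrary decaying kernel values
first_row = np.exp(-0.8 * np.arange(M))
first_row[0] = first_col[0]
x = np.arange(1.0, M + 1.0)

# Reference: dense Toeplitz matrix built explicitly (O(M^2)) to check the result.
T = np.array([[first_col[i - j] if i >= j else first_row[j - i] for j in range(M)]
              for i in range(M)])
print(np.allclose(T @ x, toeplitz_matvec_fft(first_col, first_row, x)))   # True
```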

Keywords: integral differential equations, jump–diffusion model, American options, rational approximation

Procedia PDF Downloads 100
20834 Validation of the Formula for Air Attenuation Coefficient for Acoustic Scale Models

Authors: Katarzyna Baruch, Agata Szelag, Aleksandra Majchrzak, Tadeusz Kamisinski

Abstract:

The methodology for measuring the sound absorption coefficient in scale models is based on the ISO 354 standard. The measurement is realised indirectly - the coefficient is calculated from the reverberation time of an empty chamber as well as of a chamber with an inserted sample. It is crucial to maintain stable atmospheric conditions during both measurements. Possible differences may be corrected based on the formulas for the atmospheric attenuation coefficient α given in ISO 9613-1. Model studies require scaling particular factors in compliance with specified characteristic numbers. For absorption coefficient measurement, these are, for example, the frequency range or the value of the attenuation coefficient m. Thanks to the capabilities of modern electroacoustic transducers, it is no longer a problem to scale the frequencies, which have to be proportionally higher. However, it may be problematic to reduce the values of the attenuation coefficient. In practice, this is obtained by drying the air down to a defined relative humidity. Despite the change of frequency range and relative humidity of the air, the ISO 9613-1 standard still allows the calculation of a correction for small differences in the atmospheric conditions in the chamber during measurements. The paper discusses a number of theoretical analyses and experimental measurements performed in order to obtain consistency between the values of the attenuation coefficient calculated from the formulas given in the standard and those obtained by measurement. The authors performed measurements of reverberation time in a chamber made at 1/8 scale, in the corresponding frequency range, i.e. 800 Hz - 40 kHz, and at different values of the relative air humidity (40%–5%). Based on the measurements, empirical values of the attenuation coefficient were calculated and compared with theoretical ones. In general, the values correspond with each other, but for high frequencies and low values of relative air humidity the differences are significant. Those discrepancies may directly influence the values of the measured sound absorption coefficient and cause errors. Therefore, the authors made an effort to determine a correction minimizing the described inaccuracy.

Keywords: air absorption correction, attenuation coefficient, dimensional analysis, model study, scaled modelling

Procedia PDF Downloads 404
20833 Analysis of Diabetes Patients Using Pearson, Cost Optimization, Control Chart Methods

Authors: Devatha Kalyan Kumar, R. Poovarasan

Abstract:

In this paper, we have taken certain important factors and health parameters of diabetes patients, especially children with diabetes from birth (pediatric congenital), and using the above three methods we assess the importance of each attribute in the dataset, thereby determining the attributes most responsible for, and most correlated with, diabetes among young patients. We use the cost optimization, control chart and Spearman methodologies for the real-time application of finding the data efficiency in this diabetes dataset. The Spearman methodology is a correlation methodology used in the software development process to identify the complexity between the various modules of the software. Identifying the complexity is important because the higher the complexity, the higher the chance of risk occurring in the software. With the use of the control chart, the mean, variance and standard deviation of the data are calculated. With the use of the cost optimization model, we optimize the variables. Hence we choose the Spearman, control chart and cost optimization methods to assess the data efficiency in diabetes datasets.
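A minimal sketch of two of the three assessments described: Spearman rank correlation between an attribute and an outcome, and control-chart statistics (mean, variance, 3-sigma limits). The synthetic values are assumptions, not the pediatric diabetes dataset, and the cost optimization step is not reproduced.

```python
# Hedged sketch: Spearman correlation and control-chart statistics on synthetic,
# illustrative values (not the study's dataset).
import numpy as np
from scipy import stats

rng = np.random.default_rng(11)
hba1c = rng.normal(8.0, 1.5, 60)                       # assumed attribute
glucose = 18 * hba1c + rng.normal(0, 20, 60)           # assumed correlated outcome

rho, p_value = stats.spearmanr(hba1c, glucose)
print("Spearman rho = %.2f (p = %.3g)" % (rho, p_value))

# Control-chart statistics for the attribute.
mean, var, sd = hba1c.mean(), hba1c.var(ddof=1), hba1c.std(ddof=1)
ucl, lcl = mean + 3 * sd, mean - 3 * sd
print("mean = %.2f, variance = %.2f, UCL = %.2f, LCL = %.2f" % (mean, var, ucl, lcl))
out_of_control = np.flatnonzero((hba1c > ucl) | (hba1c < lcl))
print("out-of-control samples:", out_of_control)
```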

Keywords: correlation, congenital diabetics, linear relationship, monotonic function, ranking samples, pediatric

Procedia PDF Downloads 244