Search results for: data interpolating empirical orthogonal function
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 29220

29160 The Impact of Board Director Characteristics on the Quality of Information Disclosure

Authors: Guo Jinhong

Abstract:

The purpose of this study is to explore the association between board member functions and information disclosure levels. Drawing on board-characteristic variables from the prior literature, a single composite indicator is constructed as a proxy for board functions, and the information disclosure evaluation results published by the Securities and Foundation are used to measure each company's disclosure level. The study covers companies listed on the Taiwan Stock Exchange from 2006 to 2010 and employs descriptive statistics, univariate analysis, correlation analysis, and ordered probit regression for the empirical analysis. The empirical results show a significant positive correlation between the function of board members and the level of information disclosure. A sensitivity test draws similar conclusions, showing that boards with better member functions have higher levels of information disclosure. In addition, the study finds that higher board independence, a lower director shareholding pledge ratio, a higher director shareholding ratio, and directors with rich professional knowledge and practical experience all help improve the level of information disclosure. These results provide strong support for the regulations formulated by the competent authorities in recent years to improve the level of information disclosure.
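
A minimal sketch of the ordered probit step described above, using statsmodels' OrderedModel. The variable names, cut points, and synthetic data are illustrative assumptions, not the study's actual dataset or specification.

```python
# Hypothetical sketch: ordered probit regression of a disclosure grade
# on board-function variables, in the spirit of the study above.
import numpy as np
import pandas as pd
from statsmodels.miscmodels.ordinal_model import OrderedModel

rng = np.random.default_rng(0)
n = 500
df = pd.DataFrame({
    "board_independence": rng.uniform(0, 1, n),   # fraction of independent directors
    "pledge_ratio": rng.uniform(0, 0.5, n),       # director shareholding pledge ratio
    "director_holding": rng.uniform(0, 0.4, n),   # director shareholding ratio
})
# Latent disclosure quality: rises with independence/holding, falls with pledging
latent = (2.0 * df["board_independence"] - 3.0 * df["pledge_ratio"]
          + 1.5 * df["director_holding"] + rng.normal(0, 1, n))
# Discretize into ordered disclosure grades (e.g., C < B < A)
df["grade"] = pd.cut(latent, bins=[-np.inf, 0.0, 1.5, np.inf],
                     labels=["C", "B", "A"], ordered=True)

model = OrderedModel(df["grade"],
                     df[["board_independence", "pledge_ratio", "director_holding"]],
                     distr="probit")
res = model.fit(method="bfgs", disp=False)
print(res.summary())
```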

Keywords: function of board members, information disclosure, securities, foundation

Procedia PDF Downloads 79
29159 Social Innovation Rediscovered: An Analysis of Empirical Research

Authors: Imen Douzi, Karim Ben Kahla

Abstract:

In spite of the growing attention paid to social innovation, the field is still considered to be in its infancy, with minimal progress in theory development. Over the past two decades, academic research has focused primarily on establishing a conceptual foundation, resulting in a stream of conceptual papers that outnumbers empirical articles. Moreover, despite its growing popularity, scholars and practitioners are far from reaching a consensus on what social innovation actually means, which has resulted in competing definitions and approaches and a lack of a unifying conceptual framework. This paper reviews empirical research studies on social innovation, classifies them along three dimensions, and summarizes the research findings for each dimension. Prior to the analysis of the empirical studies, an overview of the different perspectives on social innovation is presented.

Keywords: analysis of empirical research, definition, empirical research, social innovation perspectives

Procedia PDF Downloads 359
29158 Coverage Probability Analysis of WiMAX Network under Additive White Gaussian Noise and Predicted Empirical Path Loss Model

Authors: Chaudhuri Manoj Kumar Swain, Susmita Das

Abstract:

This paper presents a detailed procedure for predicting a path loss (PL) model and its application to estimating the coverage probability in a WiMAX network. A hybrid approach is followed in predicting an empirical PL model of a 2.65 GHz WiMAX network deployed in a suburban environment. Data collection, statistical analysis, and regression analysis are the phases of this approach, and the importance of each phase is discussed. The procedure for collecting data such as the received signal strength indicator (RSSI) through an experimental setup is demonstrated. From the collected data set, empirical PL and RSSI models are predicted with regression techniques. Furthermore, with the aid of the predicted PL model, essential parameters such as the PL exponent and the coverage probability of the network are evaluated. This work may significantly assist in the deployment and optimisation of cellular networks.
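
An illustrative sketch of the regression phase: fitting a log-distance path loss model PL(d) = PL(d0) + 10·n·log10(d/d0) to RSSI measurements and deriving a shadowing-based coverage probability. The transmit power, distances, RSSI values, and receiver threshold below are made-up placeholders, not the paper's measurements.

```python
import numpy as np
from scipy.stats import norm

tx_power_dbm = 30.0                          # assumed base-station EIRP
d = np.array([100, 200, 400, 800, 1600])     # distances in metres
rssi = np.array([-60, -68, -77, -88, -97])   # measured RSSI in dBm

pl = tx_power_dbm - rssi                     # empirical path loss in dB
d0 = d[0]
x = 10 * np.log10(d / d0)

# Least-squares fit: pl = PL(d0) + n * x  (np.polyfit returns slope first)
n, pl_d0 = np.polyfit(x, pl, 1)
print(f"path loss exponent n = {n:.2f}, PL(d0) = {pl_d0:.1f} dB")

# Coverage probability at a distance, assuming log-normal shadowing
sigma = np.std(pl - (pl_d0 + n * x))         # shadowing std dev from residuals
rx_threshold = -95.0                         # assumed receiver sensitivity, dBm
pl_budget = tx_power_dbm - rx_threshold
pl_pred = pl_d0 + n * 10 * np.log10(1200 / d0)
p_cov = norm.cdf((pl_budget - pl_pred) / sigma)
print(f"coverage probability at 1200 m ~ {p_cov:.2f}")
```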

Keywords: WiMAX, RSSI, path loss, coverage probability, regression analysis

Procedia PDF Downloads 148
29157 Throughput of Point Coordination Function (PCF)

Authors: Faisel Eltuhami Alzaalik, Omar Imhemed Alramli, Ahmed Mohamed Elaieb

Abstract:

The IEEE 802.11 standard defines two MAC modes: the distributed coordination function (DCF) and the point coordination function (PCF). The DCF is the first sub-layer of the MAC; a contention algorithm is used via DCF to provide access for all traffic. The PCF is the second sub-layer, used to provide contention-free service. PCF sits above DCF and uses DCF's features to guarantee access for its users. This paper reviews published papers and research on this technology and briefly discusses the DCF. The PCF was simulated using the network simulator NS2, and the throughput of a transmitting system under this function was obtained.
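
A hedged sketch of the post-processing step: computing throughput from an NS2 trace file. It assumes the classic NS2 wired-trace column layout (event, time, src node, dst node, packet type, size, ...); the actual field layout depends on the simulation scenario, and the file name and node id here are placeholders.

```python
# Throughput = received bits / observation window, counting "r" (receive) events.
def throughput_bps(trace_path, dst_node="1", t_start=0.0, t_end=None):
    rx_bytes = 0
    last_t = t_start
    with open(trace_path) as f:
        for line in f:
            fields = line.split()
            if len(fields) < 6 or fields[0] != "r":
                continue                      # count only packet receptions
            t = float(fields[1])
            if t < t_start or (t_end is not None and t > t_end):
                continue
            if fields[3] == dst_node:         # received at the node of interest
                rx_bytes += int(fields[5])    # packet size field (bytes)
                last_t = max(last_t, t)
    duration = (t_end or last_t) - t_start
    return 8 * rx_bytes / duration if duration > 0 else 0.0

# Example (hypothetical trace file): print(throughput_bps("pcf_out.tr"), "bit/s")
```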

Keywords: DCF, PCF, throughput, NS2

Procedia PDF Downloads 558
29156 A New IFO Estimation Scheme for Orthogonal Frequency Division Multiplexing Systems

Authors: Keunhong Chae, Seokho Yoon

Abstract:

We address a new integer frequency offset (IFO) estimation scheme with the aid of pilots for orthogonal frequency division multiplexing systems. After correlating each continual pilot with a predetermined scattered pilot, the correlation value is correlated again to alleviate the influence of the timing offset. Numerical results demonstrate that the influence of the timing offset on the IFO estimation is significantly decreased.
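
A minimal numpy sketch of the underlying idea, estimating an IFO by correlating a known pilot sequence against cyclically shifted subcarrier positions. The FFT size, pilot pattern, and noise level are illustrative assumptions; the paper's double-correlation refinement against timing offset is not reproduced.

```python
import numpy as np

rng = np.random.default_rng(1)
N = 64                                    # FFT size (assumed)
pilot_idx = np.arange(0, N, 8)            # assumed continual-pilot positions
pilot = rng.choice([-1.0, 1.0], size=pilot_idx.size)   # known BPSK pilots

X = rng.choice([-1.0, 1.0], size=N) + 0j  # one OFDM symbol, frequency domain
X[pilot_idx] = pilot

true_ifo = 3                              # integer carrier offset in subcarriers
Y = np.roll(X, true_ifo)                  # an IFO cyclically shifts subcarriers
Y += 0.1 * (rng.normal(size=N) + 1j * rng.normal(size=N))  # channel noise

candidates = np.arange(-8, 9)
metric = [np.abs(np.vdot(pilot, Y[(pilot_idx + d) % N])) for d in candidates]
est = candidates[int(np.argmax(metric))]
print("estimated IFO:", est)              # should recover 3
```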

Keywords: estimation, integer frequency offset, OFDM, timing offset

Procedia PDF Downloads 542
29155 Numerical Study on Vortex-Driven Pressure Oscillation and Roll Torque Characteristics in a SRM with Two Inhibitors

Authors: Ji-Seok Hong, Hee-Jang Moon, Hong-Gye Sung

Abstract:

The details of the flow structures and the coupling mechanism between vortex shedding and acoustic excitation in a solid rocket motor with two inhibitors have been investigated using 3D large eddy simulation (LES) and proper orthogonal decomposition (POD) analysis. The oscillation frequencies and vortex shedding periods from the two inhibitors compare reasonably well with experimental data and numerical results. A total of four different locations of the rear inhibitor have been numerically tested to characterize the coupling between the vortex shedding frequency and the acoustic mode. The major source triggering pressure oscillation in the combustor is resonance with the longitudinal half acoustic mode. It was also observed that the counter-rotating vortices in the nozzle flow produce roll torque.
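
A generic snapshot-POD sketch (via thin SVD) of the kind used to extract coherent structures from LES fields. The snapshot matrix here is random placeholder data; in the study it would hold the simulated pressure or velocity field sampled in time.

```python
import numpy as np

rng = np.random.default_rng(2)
n_points, n_snaps = 2000, 200             # spatial points x time snapshots
Q = rng.normal(size=(n_points, n_snaps))  # stand-in for LES field snapshots

Q_mean = Q.mean(axis=1, keepdims=True)
Qf = Q - Q_mean                           # fluctuating part

# Thin SVD: columns of U are POD modes, S**2 gives relative modal energies
U, S, Vt = np.linalg.svd(Qf, full_matrices=False)
energy = S**2 / np.sum(S**2)
print("energy captured by first 5 modes:", energy[:5].sum())

# Temporal coefficient of mode k: a_k(t) = U[:, k] @ Qf
a0 = U[:, 0] @ Qf
```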

Keywords: large eddy simulation, proper orthogonal decomposition, SRM instability, flow-acoustic coupling

Procedia PDF Downloads 541
29154 A Hybrid Data-Handler Module Based Approach for Prioritization in Quality Function Deployment

Authors: P. Venu, Joeju M. Issac

Abstract:

Quality function deployment (QFD) is a systematic technique that creates a platform where customer responses can be positively converted into design attributes. The accuracy of a QFD process depends heavily on the data it handles, which are captured from customers or QFD team members. Customized computer programs that perform QFD within a stipulated time have been used by various companies across the globe. These programs rely heavily on the storage and retrieval of data in a common database, which must act as a reliable source with minimal missing or erroneous values in order to perform the actual prioritization. This paper introduces a missing/error data handler module that uses a genetic algorithm and fuzzy numbers. The prioritization of customer requirements for sesame oil is illustrated, and a comparison is made between the proposed data-handler-module-based deployment and manual deployment.

Keywords: hybrid data handler, QFD, prioritization, module-based deployment

Procedia PDF Downloads 276
29153 Estimation of the Upper Tail Dependence Coefficient for Insurance Loss Data Using an Empirical Copula-Based Approach

Authors: Adrian O'Hagan, Robert McLoughlin

Abstract:

Considerable focus in the world of insurance risk quantification is placed on modeling loss values from lines of business (LOBs) that possess upper tail dependence. Copulas such as the Joe, Gumbel and Student-t copula may be used for this purpose. The copula structure imparts a desired level of tail dependence on the joint distribution of claims from the different LOBs. Alternatively, practitioners may possess historical or simulated data that already exhibit upper tail dependence, through the impact of catastrophe events such as hurricanes or earthquakes. In these circumstances, it is not desirable to induce additional upper tail dependence when modeling the joint distribution of the loss values from the individual LOBs. Instead, it is of interest to accurately assess the degree of tail dependence already present in the data. The empirical copula and its associated upper tail dependence coefficient are presented in this paper as robust, efficient means of achieving this goal.
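
A sketch of the estimator described above: the empirical copula evaluated on the diagonal gives a nonparametric estimate of the upper tail dependence coefficient, λ_U = lim_{t→1} (1 − 2t + C(t,t))/(1 − t), evaluated at a threshold t near 1. The loss data below are synthetic, generated with a shared shock so that genuine tail dependence is present.

```python
import numpy as np
from scipy.stats import rankdata

def upper_tail_dependence(x, y, t=0.95):
    n = len(x)
    # Pseudo-observations via empirical marginal CDFs
    u = rankdata(x) / (n + 1)
    v = rankdata(y) / (n + 1)
    # Empirical copula evaluated on the diagonal at threshold t
    c_tt = np.mean((u <= t) & (v <= t))
    return (1 - 2 * t + c_tt) / (1 - t)

rng = np.random.default_rng(3)
# Two lines of business hit by a common heavy-tailed shock (e.g., a CAT event)
shock = rng.pareto(2.5, 10_000)
lob1 = shock + rng.exponential(1.0, 10_000)
lob2 = shock + rng.exponential(1.0, 10_000)
print("lambda_U estimate:", upper_tail_dependence(lob1, lob2, t=0.95))
```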

Keywords: empirical copula, extreme events, insurance loss reserving, upper tail dependence coefficient

Procedia PDF Downloads 267
29152 Theoretical Comparisons and Empirical Illustration of Malmquist, Hicks–Moorsteen, and Luenberger Productivity Indices

Authors: Fatemeh Abbasi, Sahand Daneshvar

Abstract:

Productivity is one of the essential goals of companies seeking to improve performance; as a strategy-oriented measure, it forms the basis of a company's economic growth. The history of productivity goes back centuries, but in the early twentieth century most researchers defined productivity as the relationship between a product and the factors used in its production. Productivity as the optimal use of available resources, i.e., "more output using less input", can increase a company's capacity for economic growth and prosperity. A quality life based on economic progress likewise depends on productivity growth in society, so productivity is a national priority for any developed country. There are several methods for measuring productivity growth, which can be divided into parametric and non-parametric methods. Parametric methods rely on the existence of a functional form in their hypotheses, while non-parametric methods do not require a functional form and are based on empirical evidence. One of the most popular non-parametric methods is data envelopment analysis (DEA), which measures changes in productivity over time. DEA evaluates the productivity of decision-making units (DMUs) based on mathematical models, using multiple inputs and outputs to compare similar DMUs such as banks, government agencies, companies, airports, etc. Non-parametric methods are themselves divided into frontier and non-frontier approaches. The Malmquist productivity index (MPI) proposed by Caves, Christensen, and Diewert (1982), the Hicks–Moorsteen productivity index (HMPI) proposed by Bjurek (1996), and the Luenberger productivity indicator (LPI) proposed by Chambers (2002) are powerful tools for measuring productivity changes over time. This study compares the Malmquist, Hicks–Moorsteen, and Luenberger indices theoretically and empirically based on DEA models and reviews their strengths and weaknesses.
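
A hedged sketch of one building block: input-oriented CCR (constant returns to scale) DEA efficiency solved as a linear program, and a Malmquist index assembled from four such distance-function scores. The data, DMU counts, and the uniform 5% input/output change are illustrative assumptions.

```python
import numpy as np
from scipy.optimize import linprog

def ccr_efficiency(X_ref, Y_ref, x0, y0):
    """min theta s.t. X_ref@lam <= theta*x0, Y_ref@lam >= y0, lam >= 0.
    X_ref: (m inputs, n DMUs); Y_ref: (s outputs, n DMUs)."""
    m, n = X_ref.shape
    s = Y_ref.shape[0]
    c = np.r_[1.0, np.zeros(n)]                     # variables: theta, lambda
    A_ub = np.block([[-x0.reshape(-1, 1), X_ref],   # X lam - theta x0 <= 0
                     [np.zeros((s, 1)), -Y_ref]])   # -Y lam <= -y0
    b_ub = np.r_[np.zeros(m), -y0]
    res = linprog(c, A_ub=A_ub, b_ub=b_ub,
                  bounds=[(None, None)] + [(0, None)] * n)
    return res.x[0]

rng = np.random.default_rng(4)
X_t, Y_t = rng.uniform(1, 5, (2, 10)), rng.uniform(1, 5, (1, 10))  # period t
X_s, Y_s = X_t * 0.95, Y_t * 1.05                                  # period t+1

k = 0                                       # DMU under evaluation
d_tt = ccr_efficiency(X_t, Y_t, X_t[:, k], Y_t[:, k])   # D^t(x^t, y^t)
d_ts = ccr_efficiency(X_t, Y_t, X_s[:, k], Y_s[:, k])   # D^t(x^{t+1}, y^{t+1})
d_st = ccr_efficiency(X_s, Y_s, X_t[:, k], Y_t[:, k])   # D^{t+1}(x^t, y^t)
d_ss = ccr_efficiency(X_s, Y_s, X_s[:, k], Y_s[:, k])   # D^{t+1}(x^{t+1}, y^{t+1})
mpi = np.sqrt((d_ts / d_tt) * (d_ss / d_st))            # Malmquist index
print("MPI for DMU 0:", mpi)
```

The Hicks–Moorsteen and Luenberger measures would replace the ratio-of-distance-functions construction with output/input index ratios and directional distance functions, respectively.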

Keywords: data envelopment analysis, Hicks–Moorsteen productivity index, Luenberger productivity indicator, Malmquist productivity index

Procedia PDF Downloads 168
29151 5G Future Hyper-Dense Networks: An Empirical Study and Standardization Challenges

Authors: W. Hashim, H. Burok, N. Ghazaly, H. Ahmad Nasir, N. Mohamad Anas, A. F. Ismail, K. L. Yau

Abstract:

Future communication networks require devices that are able to work on a single platform but support heterogeneous operations, which leads to service diversity and functional flexibility. This paper proposes two cognitive mechanisms, termed the cognitive hybrid function, applied in multiple broadband user terminals in order to maintain reliable connectivity and prevent unnecessary interference. By employing such mechanisms, especially in a future hyper-dense network, we can observe their performance in terms of optimized speed and power-saving efficiency. Results were obtained from several empirical laboratory studies. It was found that selecting a reliable network yielded up to a 37% improvement in optimized speed compared with operation without such a function. In terms of power adjustment, the mechanism can reduce transmit power by 5 dB while maintaining the same level of throughput as at higher power. We also discuss the issues impacting future telecommunication standards whenever such devices are put in place.

Keywords: dense network, intelligent network selection, multiple networks, transmit power adjustment

Procedia PDF Downloads 354
29150 U.S. Trade and Trade Balance with China: Testing for Marshall-Lerner Condition and the J-Curve Hypothesis

Authors: Anisul Islam

Abstract:

The U.S. has a very strong trade relationship with China but runs a large and persistent trade deficit. Some have argued that the undervalued Chinese yuan is to blame for this persistent deficit, but the empirical results are mixed at best. This paper empirically estimates the U.S. export function along with the U.S. import function for its trade with China, with the purpose of testing for the existence of the Marshall-Lerner (ML) condition as well as the possible existence of the J-curve hypothesis. Annual export and import data are utilized for as long a period as the time series data exist. The export and import functions are estimated using advanced econometric techniques, with appropriate diagnostic tests performed to examine the validity and reliability of the estimated results. The annual time series covers 1975 to 2022, a sample of 48 years, the longest period utilized in any study to date. The data are collected from several sources, such as the World Bank's World Development Indicators, IMF Financial Statistics, IMF Direction of Trade Statistics, and several others. The paper is expected to shed important light on the ongoing debate regarding the persistent U.S. trade deficit with China and the policies that may be useful to reduce such deficits over time. As such, it will be of great interest to academics, researchers, think tanks, global organizations, and policy makers in both China and the U.S.
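
An illustrative sketch of the ML test logic on synthetic data: estimate log-linear export and import demand functions by OLS and check whether the absolute exchange-rate elasticities sum to more than one. The paper's "advanced econometric techniques" would typically involve cointegration/error-correction modelling; this simplified static regression, and all series below, are assumptions for illustration only.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(5)
T = 48                                          # annual observations, 1975-2022
ln_rer = np.cumsum(rng.normal(0, 0.05, T))      # log real exchange rate
ln_y_us = np.linspace(0, 1.5, T)                # log U.S. income
ln_y_cn = np.linspace(0, 3.0, T)                # log Chinese income

# Synthetic "true" exchange-rate elasticities of 0.7 (exports) and -0.6 (imports)
ln_exports = 0.7 * ln_rer + 1.2 * ln_y_cn + rng.normal(0, 0.1, T)
ln_imports = -0.6 * ln_rer + 1.4 * ln_y_us + rng.normal(0, 0.1, T)

ex = sm.OLS(ln_exports, sm.add_constant(np.column_stack([ln_rer, ln_y_cn]))).fit()
im = sm.OLS(ln_imports, sm.add_constant(np.column_stack([ln_rer, ln_y_us]))).fit()
eps_x, eps_m = ex.params[1], im.params[1]       # exchange-rate elasticities
ml = abs(eps_x) + abs(eps_m)
print(f"|eps_x| + |eps_m| = {ml:.2f}",
      "(ML condition holds)" if ml > 1 else "(ML condition fails)")
```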

Keywords: exports, imports, Marshall-Lerner condition, J-curve hypothesis, United States, China

Procedia PDF Downloads 43
29149 Optimizing Approach for Sifting Process to Solve a Common Type of Empirical Mode Decomposition Mode Mixing

Authors: Saad Al-Baddai, Karema Al-Subari, Elmar Lang, Bernd Ludwig

Abstract:

Empirical mode decomposition (EMD), a data-driven method of time-series decomposition, has the advantage of not assuming that a time series is linear or stationary, as is implicitly assumed in Fourier decomposition. However, the EMD suffers from a mode mixing problem in some cases. The aim of this paper is to present a solution for a common type of signal that causes the EMD mode mixing problem, namely a signal containing intermittency. On an artificial example, the solution shows superior performance in coping with the EMD mode mixing problem compared with conventional EMD and ensemble empirical mode decomposition (EEMD). Furthermore, the over-sifting problem is completely avoided, and the computational load is reduced roughly six-fold compared with EEMD with an ensemble size of 50.
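
A hedged demo of the intermittency-driven mode mixing problem, assuming the PyEMD package (installable as EMD-signal): on a slow sinusoid with a short high-frequency burst, plain EMD tends to mix the burst into the slow mode, while the noise-assisted EEMD separates them. The signal parameters and trial count are illustrative.

```python
import numpy as np
from PyEMD import EMD, EEMD

t = np.linspace(0, 1, 1000)
slow = np.sin(2 * np.pi * 5 * t)                 # 5 Hz carrier mode
burst = 0.2 * np.sin(2 * np.pi * 80 * t)         # intermittent 80 Hz tone
burst[(t < 0.3) | (t > 0.4)] = 0.0               # present only on [0.3, 0.4]
signal = slow + burst

imfs_emd = EMD()(signal)                         # prone to mode mixing here
imfs_eemd = EEMD(trials=50)(signal)              # noise-assisted version

print("EMD produced", imfs_emd.shape[0], "IMFs;",
      "EEMD produced", imfs_eemd.shape[0], "IMFs")
```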

Keywords: empirical mode decomposition (EMD), mode mixing, sifting process, over-sifting

Procedia PDF Downloads 367
29148 Indoor Robot Positioning with Precise Correlation Computations over Walsh-Coded Lightwave Signal Sequences

Authors: Jen-Fa Huang, Yu-Wei Chiu, Jhe-Ren Cheng

Abstract:

Visible light communication (VLC) has become a useful technique based on LED light blinking. Several issues in indoor mobile robot positioning with LED blinking are examined in this paper. On the transmitter side, we control the transceiver's blinking message. Orthogonal Walsh codes are adopted for this purpose, and the auto-correlation function (ACF) is used to detect the signal sequences. In the robot receiver, we set the time frame to 1 ns for the signal passing from the transceiver to the mobile robot. After many periods, the peak value of the ACF is detected at the mobile robot, and the transceiver immediately transmits the signal again. By capturing the peak value three times, we can obtain the time difference of arrival (TDOA) between two peak-value intervals and finally analyze the accuracy of the robot position.
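
A sketch of the correlation-detection idea: Walsh codes are rows of a Hadamard matrix, and correlating a received blink stream against the assigned code yields a sharp peak whose timing can feed a TDOA estimate. Code length, noise level, and delay are made-up parameters.

```python
import numpy as np
from scipy.linalg import hadamard

H = hadamard(64)                          # 64 mutually orthogonal Walsh codes
code = H[5].astype(float)                 # code assigned to one transceiver

# Received stream: the code embedded at an unknown delay, plus noise
rng = np.random.default_rng(6)
rx = rng.normal(0, 0.5, 1024)
delay = 300
rx[delay:delay + 64] += code

corr = np.correlate(rx, code, mode="valid")
peak = int(np.argmax(np.abs(corr)))
print("detected code start:", peak)       # should recover 300
# With a known sample period (e.g., 1 ns frames), differences between
# successive peak times give the TDOA values used for positioning.
```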

Keywords: visible light communication, auto-correlation function (ACF), peak value of ACF, time difference of arrival (TDOA)

Procedia PDF Downloads 294
29147 Data-Driven Infrastructure Planning for Offshore Wind Farms

Authors: Isha Saxena, Behzad Kazemtabrizi, Matthias C. M. Troffaes, Christopher Crabtree

Abstract:

The reliability calculations done at the beginning of a wind farm's life are rarely dependable, which makes it important to study the failure and repair rates of wind turbines under various conditions. The miscalculation arises because current models make the simplifying assumption that the failure/repair rate remains constant over time, which means the reliability function is exponential in nature. This research aims to create a more accurate model using sensor data and a data-driven approach. Data cleaning and processing are done by comparing the power curve data of the wind turbines with SCADA data, which are then converted into time-to-repair and time-to-failure time series. Several mathematical functions are fitted to the times to failure and times to repair of the wind turbine components using maximum likelihood estimation and the posterior expectation method for Bayesian parameter estimation. Initial results indicate that the two-parameter Weibull function and the exponential function produce almost identical results. Further analysis is being done using complex-system analysis, considering the failures of each electrical and mechanical component of the wind turbine. The aim of this project is to perform a more accurate reliability analysis that can help engineers schedule maintenance and repairs to decrease turbine downtime.
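
A sketch of the distribution-fitting step: maximum likelihood fits of a two-parameter Weibull model and an exponential model to times-to-failure, compared by log-likelihood. The synthetic failure times stand in for the SCADA-derived data.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(7)
ttf = stats.weibull_min.rvs(c=1.05, scale=1200.0, size=300,
                            random_state=rng)    # hours between failures

# Two-parameter Weibull (location fixed at 0) vs exponential
shape, _, scale = stats.weibull_min.fit(ttf, floc=0)
_, exp_scale = stats.expon.fit(ttf, floc=0)

ll_weib = np.sum(stats.weibull_min.logpdf(ttf, shape, 0, scale))
ll_exp = np.sum(stats.expon.logpdf(ttf, 0, exp_scale))
print(f"Weibull shape={shape:.2f} scale={scale:.0f}, logL={ll_weib:.1f}")
print(f"Exponential scale={exp_scale:.0f}, logL={ll_exp:.1f}")
# A Weibull shape near 1 collapses to the exponential model, consistent
# with the near-identical fits reported above.
```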

Keywords: reliability, Bayesian parameter inference, maximum likelihood estimation, Weibull function, SCADA data

Procedia PDF Downloads 53
29146 Optimal Harmonic Filters Design of Taiwan High Speed Rail Traction System

Authors: Ying-Pin Chang

Abstract:

This paper presents a method combining particle swarm optimization with nonlinear time-varying evolution and orthogonal arrays (PSO-NTVEOA) for planning harmonic filters for a high-speed railway traction system with specially connected transformers in unbalanced three-phase power systems. The objective is to simultaneously minimize the filter cost, the filter loss, and the total harmonic distortion of the currents and voltages at each bus. An orthogonal array is first used to obtain the initial solution set, which is then treated as the initial training sample. Next, the PSO-NTVEOA method parameters are determined using matrix experiments with an orthogonal array, in which a minimal number of experiments approximates the effect of full factorial experiments. The PSO-NTVEOA method is then applied to design optimal harmonic filters in the Taiwan High Speed Rail (THSR) traction system, where both rectifiers and inverters with IGBTs are used. The illustrative examples verify the feasibility of the PSO-NTVEOA for designing an optimal passive harmonic filter for the THSR system, and the design approach greatly reduces the harmonic distortion. Three design schemes are compared: the V-V connection suppresses the third-order harmonic, while the Scott and Le Blanc connections achieve better harmonic improvement than the V-V connection.
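
A generic PSO sketch with linearly time-varying inertia and acceleration coefficients, minimizing a stand-in objective. The actual PSO-NTVEOA couples this with orthogonal-array initialization and a harmonic-filter cost (filter cost, loss, THD); those, and all parameter schedules below, are not the authors' settings.

```python
import numpy as np

def cost(x):                                   # placeholder objective
    return np.sum(x**2, axis=-1)

rng = np.random.default_rng(8)
n_particles, dim, iters = 30, 4, 200
pos = rng.uniform(-5, 5, (n_particles, dim))
vel = np.zeros_like(pos)
pbest, pbest_val = pos.copy(), cost(pos)
gbest = pbest[np.argmin(pbest_val)].copy()

for k in range(iters):
    w = 0.9 - 0.5 * k / iters                  # inertia decays over time
    c1 = 2.5 - 1.5 * k / iters                 # cognitive term decays
    c2 = 0.5 + 1.5 * k / iters                 # social term grows
    r1, r2 = rng.random((2, n_particles, dim))
    vel = w * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (gbest - pos)
    pos = pos + vel
    val = cost(pos)
    improved = val < pbest_val
    pbest[improved], pbest_val[improved] = pos[improved], val[improved]
    gbest = pbest[np.argmin(pbest_val)].copy()

print("best cost:", pbest_val.min())
```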

Keywords: harmonic filters, particle swarm optimization, nonlinear time-varying evolution, orthogonal arrays, specially connected transformers

Procedia PDF Downloads 372
29145 Use of In-line Data Analytics and Empirical Model for Early Fault Detection

Authors: Hyun-Woo Cho

Abstract:

Automatic process monitoring schemes are designed to give early warnings of unusual process events or abnormalities as soon as possible. To this end, various techniques have been developed and utilized in industrial processes, including multivariate statistical methods, representations in reduced spaces, kernel-based nonlinear techniques, etc. This work presents a nonlinear empirical monitoring scheme for batch-type production processes with incomplete process measurement data. While normal operation data are easy to obtain, fault data occur infrequently and are thus difficult to collect. In this work, noise filtering steps are added to enhance monitoring performance by eliminating irrelevant information from the data. The performance of the monitoring scheme was demonstrated using batch process data, and the results showed that monitoring performance, in terms of the detection success rate for process faults, was improved significantly.
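
A hedged sketch of one common kernel-based monitoring pattern (not necessarily the author's exact scheme): fit kernel PCA on normal-operation data, then flag new samples whose Hotelling-type T² statistic on the retained scores exceeds an empirical control limit. The data, kernel parameters, and fault shift are invented.

```python
import numpy as np
from sklearn.decomposition import KernelPCA

rng = np.random.default_rng(9)
normal = rng.normal(0, 1, (300, 10))          # normal-operation measurements
faulty = rng.normal(0, 1, (50, 10)) + 2.5     # shifted "fault" batch

kpca = KernelPCA(n_components=4, kernel="rbf", gamma=0.05).fit(normal)
scores = kpca.transform(normal)
var = scores.var(axis=0)

def t2(batch):
    s = kpca.transform(batch)
    return np.sum(s**2 / var, axis=1)          # Hotelling-type statistic

limit = np.percentile(t2(normal), 99)          # empirical 99% control limit
alarms = t2(faulty) > limit
print(f"detection rate on fault batch: {alarms.mean():.2%}")
```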

Keywords: batch process, monitoring, measurement, kernel method

Procedia PDF Downloads 303
29144 Extracting the Coupled Dynamics in Thin-Walled Beams from Numerical Data Bases

Authors: Mohammad A. Bani-Khaled

Abstract:

In this work we use the discrete proper orthogonal decomposition (POD) transform to characterize the properties of coupled dynamics in thin-walled beams by exploiting numerical simulations obtained from finite element analysis. The outcomes of this work will improve our understanding of the linear and nonlinear coupled behavior of thin-walled beam structures. Thin-walled beams have widespread usage in modern engineering applications, in both large-scale structures (aeronautical structures) and nano-structures (nano-tubes). Therefore, detailed knowledge of the properties of coupled vibrations and buckling in these structures is of great interest to the research community. Due to the geometric complexity of the overall structure, and in particular of the cross-sections, it is necessary to use computational mechanics to simulate the dynamics numerically. With numerical computational techniques, it is not necessary to oversimplify a model in order to solve the equations of motion. Computational dynamics methods produce databases of controlled resolution in time and space, and these numerical databases contain information on the properties of the coupled dynamics. To extract the system's dynamic properties and the strength of the coupling among the various fields of the motion, processing techniques are required. The time proper orthogonal decomposition transform is a powerful tool for processing such databases, and it will be used here to study the coupled dynamics of basic thin-walled structures. These structures are ideal as a basis for a systematic study of coupled dynamics in structures of complex geometry.

Keywords: coupled dynamics, geometric complexity, proper orthogonal decomposition (POD), thin walled beams

Procedia PDF Downloads 402
29143 Geospatial Curve Fitting Methods for Disease Mapping of Tuberculosis in Eastern Cape Province, South Africa

Authors: Davies Obaromi, Qin Yongsong, James Ndege

Abstract:

Methods for interpolating scattered or regularly distributed data may be exact or inexact; some are suited to interpolating data on a regular grid and others on an irregular grid. In spatial epidemiology, it is important to examine how disease prevalence rates are distributed in space and how they relate to each other within a defined distance and direction. In this study, for the geographic and graphic representation of disease prevalence, linear and biharmonic spline methods were implemented in MATLAB and used to identify, localize, and compare smoothing in the distribution patterns of tuberculosis (TB) in Eastern Cape Province. The aim is to produce a smoother graphical disease map of TB prevalence patterns using 3-D curve fitting techniques, especially biharmonic splines, which can suppress noise easily by seeking a least-squares fit rather than exact interpolation. The datasets are generally represented as 3D or XYZ triplets, where X and Y are the spatial coordinates and Z is the variable of interest, in this case TB counts in the province. The smoothing spline fits a smooth curve to a set of noisy observations using a spline function, and it has become a conventional method thanks to its precision, simplicity, and flexibility. Surface and contour plots are produced for TB prevalence at the provincial level for 2012–2015. The overall fittings show a systematic pattern in the distribution of TB cases in the province, consistent with previous spatial statistical analyses carried out there. This method is rarely used in disease mapping applications, but it has the advantage that it can be evaluated at arbitrarily chosen locations rather than only on the rectangular grids used in most traditional GIS methods of geospatial analysis.
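
The study used MATLAB; a hedged Python analogue uses SciPy's RBFInterpolator with a thin-plate-spline kernel (thin-plate splines are the usual two-dimensional biharmonic splines), where a positive smoothing parameter gives the least-squares fit rather than exact interpolation described above. Coordinates and TB counts below are fabricated placeholders.

```python
import numpy as np
from scipy.interpolate import RBFInterpolator

rng = np.random.default_rng(10)
xy = rng.uniform(0, 100, (40, 2))               # station coordinates (X, Y)
z = 50 + 0.3 * xy[:, 0] + rng.normal(0, 5, 40)  # noisy TB counts (Z)

# smoothing > 0 suppresses noise via a least-squares spline fit
spline = RBFInterpolator(xy, z, kernel="thin_plate_spline", smoothing=10.0)

gx, gy = np.meshgrid(np.linspace(0, 100, 50), np.linspace(0, 100, 50))
grid = np.column_stack([gx.ravel(), gy.ravel()])
surface = spline(grid).reshape(50, 50)          # smooth prevalence surface
print(surface.shape)
# Unlike grid-based GIS interpolation, the spline can also be evaluated
# at arbitrary locations: spline([[12.3, 45.6]])
```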

Keywords: linear, biharmonic splines, tuberculosis, South Africa

Procedia PDF Downloads 219
29142 Weighted Rank Regression with Adaptive Penalty Function

Authors: Kang-Mo Jung

Abstract:

The use of regularization in statistical methods has become popular, and the least absolute shrinkage and selection operator (LASSO) framework has become the standard tool for sparse regression. However, it is well known that the LASSO is sensitive to outliers and leverage points. We consider a new robust estimator composed of a weighted loss function on the pairwise differences of residuals and an adaptive penalty function regulating the tuning parameter for each variable. Rank regression is resistant to regression outliers but not to leverage points; by adopting a weighted loss function, the proposed method is made robust to leverage points in the predictor variables. Furthermore, the adaptive penalty function yields good statistical properties for variable selection, such as the oracle property and consistency. We develop an efficient algorithm to compute the proposed estimator using basic functions in R, with the tuning parameter chosen by the Bayesian information criterion (BIC). Numerical simulation shows that the proposed estimator is effective for analyzing real and contaminated data sets.

Keywords: adaptive penalty function, robust penalized regression, variable selection, weighted rank regression

Procedia PDF Downloads 440
29141 Partially Knowing of Least Support Orthogonal Matching Pursuit (PKLS-OMP) for Recovering Signal

Authors: Israa Sh. Tawfic, Sema Koc Kayhan

Abstract:

Given a large sparse signal, the goal is to reconstruct it precisely and accurately from as few measurements as possible. Although this seems possible in theory, the difficulty lies in building an algorithm that achieves accurate and efficient reconstruction. This paper proposes a new, provably correct method for reconstructing sparse signals that merges the least support orthogonal matching pursuit (LS-OMP) method with the theory of partially known support (PKS), yielding a method called partially knowing of least support orthogonal matching pursuit (PKLS-OMP). The method relies on a greedy algorithm to compute the support, whose cost depends on the number of iterations; to make it faster, PKLS-OMP incorporates partially known support into the algorithm. It recovers the original signal efficiently, simply, and accurately provided the sampling matrix satisfies the restricted isometry property (RIP). Simulation results also show that it outperforms many algorithms, especially for compressible signals.
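
A minimal numpy sketch of OMP with partially known support: the support estimate is seeded with a priori indices and then grown greedily. This illustrates the general idea behind PKLS-OMP, not the authors' exact algorithm; dimensions and sparsity level are arbitrary.

```python
import numpy as np

def omp_partial_support(A, y, k, known=()):
    support = list(known)                      # seed with a priori indices
    x_s = np.zeros(0)
    residual = y.copy()
    if support:
        x_s, *_ = np.linalg.lstsq(A[:, support], y, rcond=None)
        residual = y - A[:, support] @ x_s
    while len(support) < k:
        corr = np.abs(A.T @ residual)
        corr[support] = 0.0                    # never re-select an index
        support.append(int(np.argmax(corr)))   # best-matching new column
        x_s, *_ = np.linalg.lstsq(A[:, support], y, rcond=None)
        residual = y - A[:, support] @ x_s
    x = np.zeros(A.shape[1])
    x[support] = x_s
    return x

rng = np.random.default_rng(11)
m, n, k = 60, 200, 8
A = rng.normal(size=(m, n)) / np.sqrt(m)       # random sensing matrix
x_true = np.zeros(n)
idx = rng.choice(n, k, replace=False)
x_true[idx] = rng.normal(size=k)
y = A @ x_true

x_hat = omp_partial_support(A, y, k, known=tuple(idx[:3]))  # 3 indices known
print("recovery error:", np.linalg.norm(x_hat - x_true))
```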

Keywords: compressed sensing, least support orthogonal matching pursuit, partial knowing support, restricted isometry property, signal reconstruction

Procedia PDF Downloads 222
29140 A Rapid Code Acquisition Scheme in OOC-Based CDMA Systems

Authors: Keunhong Chae, Seokho Yoon

Abstract:

We propose a code acquisition scheme called improved multiple-shift (IMS) for optical code division multiple access systems, where the optical orthogonal code is used instead of the pseudo noise code. Although the IMS algorithm has a similar process to that of the conventional MS algorithm, it has a better code acquisition performance than the conventional MS algorithm. We analyze the code acquisition performance of the IMS algorithm and compare the code acquisition performances of the MS and the IMS algorithms in single-user and multi-user environments.

Keywords: code acquisition, optical CDMA, optical orthogonal code, serial algorithm

Procedia PDF Downloads 514
29139 Influence of Build Orientation on Machinability of Selective Laser Melted Titanium Alloy-Ti-6Al-4V

Authors: Manikandakumar Shunmugavel, Ashwin Polishetty, Moshe Goldberg, Junior Nomani, Guy Littlefair

Abstract:

Selective laser melting (SLM), a promising additive manufacturing (AM) technology, has huge potential for the fabrication of near-net-shape Ti-6Al-4V components. However, the poor surface finish of components fabricated by this technology requires secondary machining to achieve the desired accuracy and tolerance. A systematic understanding of the machinability of SLM-fabricated Ti-6Al-4V components is therefore paramount to improving productivity and product quality. Considering the significance of machining for SLM-fabricated Ti-6Al-4V components, this research aims to study the influence of build orientation on machinability characteristics by performing low-speed orthogonal cutting tests. In addition, the machinability of SLM-fabricated Ti-6Al-4V is compared with that of conventionally produced wrought Ti-6Al-4V to understand the influence of SLM technology on machining. This paper provides evidence for the hypothesis that build orientation influences cutting forces, chip formation, and surface integrity during orthogonal cutting of SLM Ti-6Al-4V samples. The results obtained from the low-speed orthogonal cutting tests highlight the practical importance of microstructure and build orientation for the machinability of SLM Ti-6Al-4V.

Keywords: additive manufacturing, build orientation, machinability, titanium alloys (Ti-6Al-4V)

Procedia PDF Downloads 266
29138 An Analysis of the Influence of Employee Readiness for Change on TQM Implementation

Authors: Mohamed Haffar, Khalil Al-Hyari, Mohammed Khair Abu Zaid, Ramadane Djbarni, Mohammed Hamdan

Abstract:

While employee readiness for change (ERFC) is recognised as critical for total quality management (TQM) implementation, there is a lack of systematic and empirical studies regarding the relationship between ERFC dimensions and TQM. This study proposes to fill this gap by providing empirical evidence that advances the understanding of the influence of ERFC components on TQM implementation. The empirical data for this study were drawn from a survey of 400 middle and senior managers of Jordanian firms. The analysis of the collected data, conducted using structural equation modeling, revealed that three of the ERFC components, namely personally beneficial, change self-efficacy, and management support, are the most supportive ERFC dimensions for TQM implementation. This paper therefore makes a novel contribution by providing a refined and deeper comprehension of the relationships between ERFC dimensions and TQM implementation.

Keywords: total quality management, employee readiness for change, manufacturing organisations, Jordan

Procedia PDF Downloads 533
29137 Performance Analysis of PAPR Reduction in OFDM Systems based on Partial Transmit Sequence (PTS) Technique

Authors: Alcardo Alex Barakabitze, Tan Xiaoheng

Abstract:

Orthogonal frequency division multiplexing (OFDM) is a special case of the multi-carrier modulation (MCM) technique, which transmits a stream of data over a number of lower-data-rate subcarriers. OFDM splits the total transmission bandwidth into a number of orthogonal, non-overlapping subcarriers and transmits collections of bits called symbols in parallel over these subcarriers. This paper explores peak-to-average power ratio (PAPR) reduction using the partial transmit sequence (PTS) technique. We provide a distribution analysis and the basics of OFDM signals, and then show how the PAPR increases with the number of subcarriers. We provide a performance analysis of the CCDF and the PAPR, expressed in decibels, through MATLAB simulations. The simulation results show that, with the PTS technique, PAPR reduction in OFDM systems improves significantly as the number of sub-blocks increases. However, for the same number of sub-blocks, oversampling factor, and number of OFDM block iterations used to generate the CCDF, OFDM systems with 128 subcarriers achieve better PAPR reduction than systems with 256, 512, or more subcarriers.
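
A numpy sketch of the PTS idea: split the subcarriers into sub-blocks, search phase factors per sub-block, and keep the combination with the lowest PAPR. This toy version uses ±1 phases, four interleaved sub-blocks, and no oversampling; all of these are simplifying assumptions relative to the paper's MATLAB setup.

```python
import numpy as np
from itertools import product

def papr_db(x):
    p = np.abs(x)**2
    return 10 * np.log10(p.max() / p.mean())

rng = np.random.default_rng(12)
N, V = 128, 4                                    # subcarriers, sub-blocks
X = rng.choice([1+1j, 1-1j, -1+1j, -1-1j], N)    # QPSK symbols

# Disjoint (interleaved) sub-blocks in the frequency domain
blocks = np.zeros((V, N), dtype=complex)
for v in range(V):
    blocks[v, v::V] = X[v::V]
time_blocks = np.fft.ifft(blocks, axis=1)        # IFFT of each sub-block

best, best_val = None, np.inf
for phases in product([1, -1], repeat=V):        # exhaustive phase search
    x = (np.array(phases)[:, None] * time_blocks).sum(axis=0)
    val = papr_db(x)
    if val < best_val:
        best, best_val = x, val

print(f"original PAPR: {papr_db(np.fft.ifft(X)):.2f} dB")
print(f"PTS PAPR:      {best_val:.2f} dB")
```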

Keywords: OFDM, peak-to-average power ratio (PAPR), bit error rate (BER), subcarriers, wireless communications

Procedia PDF Downloads 493
29136 Variability of Surface Air Temperature in Sri Lanka and Its Relation to El Nino Southern Oscillation and Indian Ocean Dipole

Authors: Athdath Waduge Susantha Janaka Kumara, Xiefei Zhi, Zin Mie Mie Sein

Abstract:

Understanding air temperature variability is crucially important for disaster risk reduction and management. In this study, we used 15 synoptic meteorological stations to assess the spatiotemporal variability of air temperature over Sri Lanka during 1972–2021. Empirical orthogonal function (EOF) analysis, principal component analysis (PCA), the Mann-Kendall test, power spectrum analysis, and correlation analysis were used to investigate the long-term trends of air temperature and their possible relation to sea surface temperature (SST) over the region. The results indicate an increasing trend in air temperature, with an abrupt climate shift noted in 1994. The spatial distribution of EOF1 (63.5%) shows positive and negative loading dipole patterns from the south to the northeast, while EOF2 (23.4%) reflects warmer (colder) conditions in parts of the central (southern and eastern) areas. The power spectrum of PC1 (PC2) indicates a significant period of 3-4 years (quasi-2 years). Moreover, the Indian Ocean Dipole (IOD) shows a strong positive correlation with the air temperature of Sri Lanka, while the El Niño Southern Oscillation (ENSO) presents a weak negative correlation; IOD events therefore lead to higher temperatures in the region. These findings can help disaster risk reduction and management in the country.
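
A generic EOF sketch: EOFs are the singular vectors of the station-by-time anomaly matrix, and explained variances come from the singular values. Synthetic anomalies stand in for the 15-station temperature records; correlating the PCs with an ENSO or IOD SST index would then quantify the teleconnections discussed above.

```python
import numpy as np

rng = np.random.default_rng(13)
n_stations, n_months = 15, 600
T = rng.normal(25, 1.5, (n_stations, n_months))   # monthly mean temperatures

anom = T - T.mean(axis=1, keepdims=True)          # remove station climatology
U, S, Vt = np.linalg.svd(anom, full_matrices=False)

explained = S**2 / np.sum(S**2)
print(f"EOF1 explains {explained[0]:.1%}, EOF2 {explained[1]:.1%}")

eof1 = U[:, 0]          # spatial loading pattern (one value per station)
pc1 = S[0] * Vt[0]      # principal component time series
```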

Keywords: air temperature, interannaul variability, ENSO, IOD

Procedia PDF Downloads 76
29135 Strategic Citizen Participation in Applied Planning Investigations: How Planners Use Etic and Emic Community Input Perspectives to Fill-in the Gaps in Their Analysis

Authors: John Gaber

Abstract:

Planners regularly use citizen input as empirical data to help them better understand community issues they know very little about. This type of community data is based on the lived experiences of local residents and is known as "emic" data. What is becoming more common practice for planners is their use of data from local experts and stakeholders (known as "etic" data or the outsider perspective) to help them fill in the gaps in their analysis of applied planning research projects. Utilizing international Health Impact Assessment (HIA) data, I look at who planners invite to their citizen input investigations. Research presented in this paper shows that planners access a wide range of emic and etic community perspectives in their search for the “community’s view.” The paper concludes with how planners can chart out a new empirical path in their execution of emic/etic citizen participation strategies in their applied planning research projects.

Keywords: citizen participation, emic data, etic data, Health Impact Assessment (HIA)

Procedia PDF Downloads 469
29134 Efficient Estimation for the Cox Proportional Hazards Cure Model

Authors: Khandoker Akib Mohammad

Abstract:

When analyzing time-to-event data, it is possible that a certain fraction of subjects will never experience the event of interest; they are said to be cured. When this feature of survival models is taken into account, the models are commonly referred to as cure models. In the presence of covariates, the conditional survival function of the population can be modelled using the cure model, which depends on the probability of being uncured (incidence) and the conditional survival function of the uncured subjects (latency); a combination of logistic regression and Cox proportional hazards (PH) regression is used to model the incidence and latency, respectively. In this paper, we show the asymptotic normality of the profile likelihood estimator via an asymptotic expansion of the profile likelihood and obtain the explicit form of the variance estimator, with an implicit function in the profile likelihood. We also show that the efficient score function based on projection theory and the profile likelihood score function are equal. Our contribution is to express the efficient information matrix as the variance of the profile likelihood score function. A simulation study suggests that the estimated standard errors from bootstrap samples (the SMCURE package) and from the profile likelihood score function (our approach) are similar and comparable. The numerical results of the proposed method are also illustrated using the melanoma data from the SMCURE R package, and we compare the results with the output obtained from the SMCURE package.

Keywords: Cox PH model, cure model, efficient score function, EM algorithm, implicit function, profile likelihood

Procedia PDF Downloads 118
29133 Wavelet Based Signal Processing for Fault Location in Airplane Cable

Authors: Reza Rezaeipour Honarmandzad

Abstract:

Wavelet analysis is an exciting method for solving difficult problems in mathematics, physics, and engineering, with modern applications as diverse as wave propagation, data compression, signal processing, image processing, and pattern recognition. Wavelets allow complex information such as signals, images, and patterns to be decomposed into elementary forms at different positions and scales and subsequently reconstructed with high precision. In this paper, a wavelet-based signal processing algorithm for airplane cable fault location is proposed. An orthogonal discrete wavelet decomposition and reconstruction algorithm is used to eliminate the noise in the aircraft cable fault signal. The experimental results show that the characteristics of the emission pulse and the reflected pulse used to locate the aircraft cable fault point are preserved, while the high-frequency noise is eliminated by the proposed algorithm.
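
A sketch of orthogonal discrete wavelet denoising with PyWavelets: decompose, soft-threshold the detail coefficients, reconstruct. The simulated trace (a Gaussian emission pulse and an inverted reflection), the db4 wavelet, and the universal threshold are illustrative choices, not necessarily the paper's.

```python
import numpy as np
import pywt

rng = np.random.default_rng(14)
n = 1024
t = np.arange(n)
signal = np.exp(-((t - 100) / 8.0)**2) - 0.5 * np.exp(-((t - 600) / 8.0)**2)
noisy = signal + 0.1 * rng.normal(size=n)       # emission + reflected pulse

coeffs = pywt.wavedec(noisy, "db4", level=5)    # orthogonal DWT (Daubechies-4)
sigma = np.median(np.abs(coeffs[-1])) / 0.6745  # robust noise estimate
thresh = sigma * np.sqrt(2 * np.log(n))         # universal threshold
coeffs[1:] = [pywt.threshold(c, thresh, "soft") for c in coeffs[1:]]
denoised = pywt.waverec(coeffs, "db4")

# Pulse separation (fault distance ~ separation * propagation speed / 2)
sep = (np.argmin(denoised[300:]) + 300) - np.argmax(denoised[:300])
print("pulse separation in samples:", sep)
```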

Keywords: wavelet analysis, signal processing, orthogonal discrete wavelet, noise, aircraft cable fault signal

Procedia PDF Downloads 495
29132 Hot Forging Process Simulation of Outer Tie Rod to Reduce Forming Load

Authors: Kyo Jin An, Bukyo Seo, Young-Chul Park

Abstract:

The current trend in the car market is an increase in the number of automobile parts and in vehicle weight, driven by improvements in vehicle performance. The outer tie rod is a component of the steering system and is lighter than the others, but further weight reduction is still required to improve fuel economy. We have previously presented a model of an aluminum outer tie rod; however, the fabrication process must be verified before the product can be manufactured. Therefore, in this study, we predict the forming load, die stress, and wear in the hot forging process of the outer tie rod using forging simulation software. We also implemented a design of experiments based on orthogonal array tables to reduce the forming load.
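
A sketch of a Taguchi-style screening with the standard L9(3^4) orthogonal array: nine runs cover four three-level factors, and main effects on forming load are read off as per-level means. The factor names and the forming loads are hypothetical stand-ins for the simulation outputs.

```python
import numpy as np

# Standard L9 orthogonal array (levels coded 0, 1, 2)
L9 = np.array([
    [0, 0, 0, 0], [0, 1, 1, 1], [0, 2, 2, 2],
    [1, 0, 1, 2], [1, 1, 2, 0], [1, 2, 0, 1],
    [2, 0, 2, 1], [2, 1, 0, 2], [2, 2, 1, 0],
])
factors = ["billet temp", "die temp", "friction", "press speed"]

# Hypothetical forming loads (kN) from the nine simulation runs
load = np.array([920, 870, 850, 900, 840, 880, 860, 910, 830])

for f, name in enumerate(factors):
    means = [load[L9[:, f] == lv].mean() for lv in range(3)]
    print(f"{name:12s} level means: {np.round(means, 1)}; "
          f"best level: {int(np.argmin(means))}")
```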

Keywords: forming load, hot forging, orthogonal array, outer tie rod (OTR), multi-step forging

Procedia PDF Downloads 411
29131 The Twin Terminal of Pedestrian Trajectory Based on City Intelligent Model (CIM) 4.0

Authors: Chen Xi, Lao Xuerui, Li Junjie, Jiang Yike, Wang Hanwei, Zeng Zihao

Abstract:

To further promote the development of smart cities, the microscopic "nerve endings" of the City Intelligent Model (CIM) are extended to be more sensitive. In this paper, we develop a pedestrian trajectory twin terminal based on CIM and CNN technology. It combines 5G networks, architectural and geoinformatics technologies, and convolutional neural networks within deep-learning models for human behaviour recognition. The terminal provides empirical data such as pedestrian flow and human behavioural characteristics, and ultimately forms spatial performance evaluation criteria and spatial performance warning systems, making the empirical data accurate and intelligent for prediction and decision-making.

Keywords: urban planning, urban governance, CIM, artificial intelligence, convolutional neural network

Procedia PDF Downloads 92