Search results for: λ-levelwise statistical convergence
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 4562


3872 SNR Classification Using Multiple CNNs

Authors: Thinh Ngo, Paul Rad, Brian Kelley

Abstract:

Noise estimation is essential in today's wireless systems for power control, adaptive modulation, interference suppression, and quality of service. Deep learning (DL) has already been applied in the physical layer for modulation and signal classification. However, an unacceptably low accuracy of less than 50% undermines the traditional application of DL classification to SNR prediction. In this paper, we use a divide-and-conquer algorithm and a classifier fusion method to simplify SNR classification and thereby enhance DL learning and prediction. Specifically, multiple CNNs are used for classification rather than a single CNN. Each CNN performs a binary classification against a single SNR threshold with two labels: less than, or greater than or equal to. Together, the multiple CNNs are combined to classify effectively over a range of SNR values from −20 dB to 32 dB. We use pre-trained CNNs to predict SNR over a wide range of joint channel parameters, including multiple Doppler shifts (0, 60, 120 Hz), power-delay profiles, and signal-modulation types (QPSK, 16-QAM, 64-QAM). The approach achieves an individual SNR prediction accuracy of 92%, a composite accuracy of 70%, and prediction convergence one order of magnitude faster than that of traditional estimation.
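
As a rough illustration of this divide-and-conquer scheme, the sketch below pairs one binary CNN per SNR threshold with a simple vote-counting fusion step. The architecture, threshold grid, and input shape are assumptions made for illustration; they are not the authors' trained models.

```python
import torch
import torch.nn as nn

THRESHOLDS_DB = list(range(-20, 34, 2))  # assumed 2 dB grid over -20..32 dB

class BinarySnrCnn(nn.Module):
    """Answers one question: is SNR >= its threshold? (illustrative architecture)"""
    def __init__(self, n_samples=1024):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv1d(2, 16, kernel_size=7, padding=3), nn.ReLU(), nn.MaxPool1d(4),
            nn.Conv1d(16, 32, kernel_size=5, padding=2), nn.ReLU(), nn.MaxPool1d(4),
        )
        self.classifier = nn.Sequential(
            nn.Flatten(), nn.Linear(32 * (n_samples // 16), 2),  # labels: <, >=
        )

    def forward(self, x):               # x: (batch, 2, n_samples) I/Q samples
        return self.classifier(self.features(x))

def fuse_snr(models, x):
    """Classifier fusion: count how many per-threshold CNNs vote 'SNR >= threshold'."""
    with torch.no_grad():
        votes = torch.stack([m(x).argmax(dim=1) for m in models], dim=1)
    idx = votes.sum(dim=1).clamp(max=len(THRESHOLDS_DB) - 1)
    return torch.tensor([THRESHOLDS_DB[int(i)] for i in idx])   # fused SNR (dB)

# usage sketch: one (untrained) model per threshold, applied to a batch of waveforms
models = [BinarySnrCnn() for _ in THRESHOLDS_DB]
x = torch.randn(8, 2, 1024)
print(fuse_snr(models, x))
```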

Keywords: classification, CNN, deep learning, prediction, SNR

Procedia PDF Downloads 134
3871 Predictive Analytics for Theory Building

Authors: Ho-Won Jung, Donghun Lee, Hyung-Jin Kim

Abstract:

Predictive analytics (data analysis) uses a subset of measurements (the features, predictors, or independent variables) to predict another measurement (the outcome, target, or dependent variable) for a single person or unit. It applies empirical methods from statistics, operations research, and machine learning to predict future or otherwise unknown events or outcomes for a single person or unit, based on patterns in data. Most analyses of metabolic syndrome are not predictive analytics but statistical explanatory studies that build a proposed model (theory building) and then validate the hypothesized metabolic syndrome predictors (theory testing). A proposed theoretical model is formed with causal hypotheses that specify how and why certain empirical phenomena occur. Predictive analytics and explanatory modeling have their own territories in analysis. However, predictive analytics can perform vital roles in explanatory studies, i.e., scientific activities such as theory building, theory testing, and relevance assessment. In this context, this study demonstrates how to use our predictive analytics to support theory building (i.e., hypothesis generation). For this purpose, the study utilized a big-data predictive analytics platform based on a co-occurrence graph. The co-occurrence graph is depicted with nodes (e.g., items in a basket) and arcs (direct connections between two nodes), where items in a basket are fully connected. A cluster is a collection of fully connected items, where the specific group of items has co-occurred in several rows of a data set. Clusters can be ranked using importance metrics such as node size (number of items), frequency, and surprise (observed frequency vs. expected), among others. The size of a graph can be represented by the numbers of nodes and arcs. Since the size of a co-occurrence graph does not depend directly on the number of observations (transactions), huge amounts of transactions can be represented and processed efficiently. For demonstration, a total of 13,254 metabolic syndrome training observations were fed into the analytics platform to generate rules (potential hypotheses). Each observation includes 31 predictors, associated, for example, with sociodemographics, habits, and activities. Some are intentionally included to gain predictive analytics insights on variable selection, such as cancer examination, house type, and vaccination. The platform automatically generates plausible hypotheses (rules) without statistical modeling. The rules are then validated with an external testing dataset of 4,090 observations. The results, as a kind of inductive reasoning, show potential hypotheses extracted as a set of association rules. Most statistical models generate just one estimated equation. On the other hand, a set of rules (many estimated equations from a statistical perspective), as in this study, may imply heterogeneity in a population (i.e., different subpopulations with unique features are aggregated). The next step of theory development, i.e., theory testing, statistically tests whether a proposed theoretical model is a plausible explanation of the phenomenon of interest. If the generated hypotheses are tested statistically with several thousand observations, most of the variables will become significant as the p-values approach zero. Thus, theory validation needs statistical methods that utilize a part of the observations, such as bootstrap resampling with an appropriate sample size.
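
To make the co-occurrence idea concrete, the toy sketch below builds the graph from transaction rows and ranks item pairs (arcs) by frequency and by surprise (observed vs. expected co-occurrence). The transactions and item names are invented; the platform's actual internals are not described in the abstract.

```python
from collections import Counter
from itertools import combinations

# toy transactions: each row is the set of binary predictors present for one person
rows = [
    {"smoker", "low_activity", "high_bmi"},
    {"smoker", "high_bmi"},
    {"vaccinated", "house_owner"},
    {"smoker", "low_activity", "high_bmi", "house_owner"},
]

n = len(rows)
item_freq = Counter(item for row in rows for item in row)
pair_freq = Counter(frozenset(p) for row in rows for p in combinations(sorted(row), 2))

# rank arcs (item pairs) by "surprise" = observed / expected co-occurrence count
ranked = []
for pair, obs in pair_freq.items():
    a, b = tuple(pair)
    expected = item_freq[a] * item_freq[b] / n   # expected count under independence
    ranked.append((obs / expected, obs, a, b))

for surprise, obs, a, b in sorted(ranked, reverse=True):
    print(f"{a} -- {b}: observed={obs}, surprise={surprise:.2f}")
```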

Keywords: explanatory modeling, metabolic syndrome, predictive analytics, theory building

Procedia PDF Downloads 276
3870 Attitudinal Change: A Major Therapy for Non–Technical Losses in the Nigerian Power Sector

Authors: Fina O. Faithpraise, Effiong O. Obisung, Azele E. Peter, Chris R. Chatwin

Abstract:

This study investigates and identifies consumer attitude as a major influence resulting in non-technical losses in the Nigerian electricity supply sector. This finding is revealed by combining quantitative and qualitative research in a survey. The dataset employed is a simple random sample of households using electricity (public power supply), and the number of units chosen is based on statistical power analysis. The units were subdivided into two categories (households with and without electrical meters). The hypothesis formulated was tested and analyzed using a chi-square statistical method. The results obtained show that the test statistic for households with an electrical prepared meter (EPM) exceeded the critical value (9.488 < 427.4), as did that for households without one (EPMn) (9.488 < 436.1), with a p-value of 0.01%. The analysis establishes the current position: the wrong attitude towards handling the electricity supplied (not turning off light bulbs and electrical appliances when not in use, indoors and outdoors, within 12 hours of the day) characterizes the non-technical losses in the power sector. Therefore, the adoption of efficient lighting attitudes in individual households, as recommended by the researcher, is greatly encouraged. The results from this study should serve as a model for energy efficiency and use, for the improvement of electricity consumption as well as a stable economy.
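
The chi-square comparison quoted above (the critical value 9.488 corresponds to 4 degrees of freedom at the 5% level) follows the standard observed-versus-expected test; a minimal sketch with made-up survey counts is shown below.

```python
from scipy import stats

# hypothetical observed response counts for metered households (5 categories -> 4 df)
observed = [120, 85, 60, 30, 25]
expected = [64, 64, 64, 64, 64]          # e.g., equal-preference null hypothesis

chi2, p_value = stats.chisquare(observed, f_exp=expected)
critical = stats.chi2.ppf(0.95, df=len(observed) - 1)   # 9.488 for 4 df

print(f"chi2 = {chi2:.1f}, critical value = {critical:.3f}, p = {p_value:.4f}")
print("reject H0" if chi2 > critical else "fail to reject H0")
```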

Keywords: attitudinal change, household, non-technical losses, prepared meter

Procedia PDF Downloads 179
3869 Statistical Analysis of Parameters Effects on Maximum Strain and Torsion Angle of FRP Honeycomb Sandwich Panels Subjected to Torsion

Authors: Mehdi Modabberifar, Milad Roodi, Ehsan Souri

Abstract:

In recent years, honeycomb fiber-reinforced plastic (FRP) sandwich panels have been increasingly used in various industries. Low weight, low price, and high mechanical strength are the benefits of these structures. However, their mechanical properties and behavior have not been fully explored. The objective of this study is to conduct a combined numerical-statistical investigation of honeycomb FRP sandwich beams subjected to torsion load. In this paper, the effect of the geometric parameters of the sandwich panel on the maximum shear strain in both the face and the core and on the angle of torsion of honeycomb FRP sandwich structures under torsion is investigated. The effects of parameters including core thickness, face-skin thickness, cell shape, cell size, and cell thickness on the mechanical behavior of the structure were numerically investigated. The main effects of the factors were considered in this paper, and regression equations were derived. The Taguchi method was employed for the experimental design, and an optimum parameter combination for the maximum structure stiffness was obtained. The results showed that cell size and face-skin thickness have the most significant impacts on the torsion angle and the maximum shear strain in the face and core.
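
A Taguchi-style main-effects calculation of the kind described, averaging the response at each level of each factor over the runs of an orthogonal array, can be sketched as below; the factor levels and torsion-angle responses are hypothetical, not the paper's finite-element results.

```python
import pandas as pd

# hypothetical two-level design over two of the studied factors, plus responses
runs = pd.DataFrame({
    "cell_size_mm":      [5, 5, 10, 10],
    "face_thickness_mm": [1, 2, 1, 2],
    "torsion_angle_deg": [2.9, 2.4, 4.1, 3.3],   # invented FE responses
})

# Taguchi-style main effect: difference of mean responses between factor levels
for factor in ["cell_size_mm", "face_thickness_mm"]:
    means = runs.groupby(factor)["torsion_angle_deg"].mean()
    print(f"{factor}: level means = {means.to_dict()}, "
          f"effect = {means.max() - means.min():.2f} deg")
```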

Keywords: finite element, honeycomb FRP sandwich panel, torsion, civil engineering

Procedia PDF Downloads 418
3868 Approximation of Convex Set by Compactly Semidefinite Representable Set

Authors: Anusuya Ghosh, Vishnu Narayanan

Abstract:

The approximation of a convex set by a semidefinite representable set plays an important role in semidefinite programming, especially in modern convex optimization. Optimizing a linear function over a convex set is a hard problem, but optimizing the linear function over the semidefinite representable set that approximates the convex set is easy to solve, as there exist numerous efficient algorithms for semidefinite programming problems. So, our approximation technique is significant in optimization. We develop a technique to approximate any closed convex set, say K, by a compactly semidefinite representable set. Further, we prove that there exists a sequence of compactly semidefinite representable sets that gives progressively tighter approximations of the closed convex set K. We discuss the convergence of the sequence of compactly semidefinite representable sets to the closed convex set K. The recession cone of K and the recession cone of the compactly semidefinite representable set are equal, so we say that the sequence of compactly semidefinite representable sets converges strongly to the closed convex set. Thus, this approximation technique is a very useful development in semidefinite programming.
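
For reference, the standard textbook notion underlying the abstract (not the authors' specific compact construction) can be written as follows: a set is semidefinite representable when it is the projection of a spectrahedron.

```latex
% S \subseteq \mathbb{R}^n is semidefinite representable if, for some symmetric
% matrices A_0, A_i, B_j, it is the projection onto the x-variables of a
% linear-matrix-inequality feasible set:
S=\Bigl\{x\in\mathbb{R}^n \;:\; \exists\,u\in\mathbb{R}^m,\;
   A_0+\sum_{i=1}^{n}x_iA_i+\sum_{j=1}^{m}u_jB_j \succeq 0\Bigr\}.
```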

Keywords: semidefinite programming, semidefinite representable set, compactly semidefinite representable set, approximation

Procedia PDF Downloads 386
3867 Music Reading Expertise Facilitates Implicit Statistical Learning of Sentence Structures in a Novel Language: Evidence from Eye Movement Behavior

Authors: Sara T. K. Li, Belinda H. J. Chung, Jeffery C. N. Yip, Janet H. Hsiao

Abstract:

Music notation and text reading both involve statistical learning of musical or linguistic structures. However, it remains unclear how music reading expertise influences text reading behavior. The present study examined this issue through an eye-tracking study. Chinese-English bilingual musicians and non-musicians read English sentences, Chinese sentences, musical phrases, and sentences in Tibetan, a language novel to the participants, with their eye movements recorded. Each set of stimuli consisted of two conditions in terms of structural regularity: syntactically correct and syntactically incorrect musical phrases/sentences. They then completed a sentence comprehension task (for syntactically correct sentences) or a musical segment/word recognition task to test their comprehension/recognition abilities. The results showed that in reading musical phrases, as compared with non-musicians, musicians had a higher accuracy in the recognition task, and had shorter reading time, fewer fixations, and shorter fixation duration when reading syntactically correct (i.e., in diatonic key) than incorrect (i.e., in non-diatonic key/atonal) musical phrases. This result reflects their expertise in music reading. Interestingly, in reading Tibetan sentences, which were novel to both participant groups, while non-musicians did not show any behavioral differences between reading syntactically correct or incorrect Tibetan sentences, musicians showed a shorter reading time and had marginally fewer fixations when reading syntactically correct sentences than syntactically incorrect ones. However, none of the musicians reported discovering any structural regularities in the Tibetan stimuli when asked explicitly after the experiment, suggesting that they may have implicitly acquired the structural regularities in Tibetan sentences. This group difference was not observed when they read English or Chinese sentences. This result suggests that music reading expertise facilitates reading texts in a novel language (i.e., Tibetan), but not in languages that the readers are already familiar with (i.e., English and Chinese). This phenomenon may be due to the similarities between reading music notation and reading texts in a novel language, as in both cases the stimuli follow particular statistical structures but do not involve semantic or lexical processing. Thus, musicians may transfer the statistical learning skills stemming from their music notation reading experience to implicitly discover the structures of sentences in a novel language. This speculation is consistent with a recent finding showing that music reading expertise modulates the processing of English nonwords (i.e., words that do not follow morphological or orthographic rules) but not pseudo- or real words. These results suggest that the modulation of language processing by music reading expertise depends on the similarities in the cognitive processes involved. It also has important implications for the benefits of music education on language and cognitive development.

Keywords: eye movement behavior, eye-tracking, music reading expertise, sentence reading, structural regularity, visual processing

Procedia PDF Downloads 380
3866 The Impact of Public Open Space System on Housing Price in Chicago

Authors: Si Chen, Le Zhang, Xian He

Abstract:

The research explored the influence of the public open space system on housing prices through hedonic models, in order to support better open space plans and economic policies. We had three initial hypotheses: 1) the public open space system has an overall positive influence on surrounding housing prices; 2) different public open space types have different levels of influence on surrounding housing prices; 3) walking and driving accessibilities from a property to public open spaces have different statistical relations with housing prices. Cook County, Illinois, was chosen as the study area because of its data availability, sufficient open space types, and long-term open space preservation strategies. We considered housing attributes, driving and walking accessibility scores from houses to nearby public open spaces, and driving accessibility scores to hospitals as influential features, and used the real housing sale prices in 2010 as the dependent variable in the hedonic model. Through ordinary least squares (OLS) regression analysis, global Moran's I analysis, and geographically weighted regression analysis, we observed the statistical relations between public open spaces and housing sale prices in the three hedonic models and confirmed all three hypotheses.
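
The OLS stage of such a hedonic analysis might look like the sketch below, with made-up variable names and values standing in for the Cook County data; the global Moran's I and geographically weighted regression steps are omitted.

```python
import pandas as pd
import statsmodels.api as sm

# hypothetical sample: one row per 2010 sale, with attributes and accessibility scores
df = pd.DataFrame({
    "sale_price":       [310000, 255000, 420000, 198000, 365000, 289000],
    "living_area_sqft": [1800, 1450, 2400, 1200, 2100, 1650],
    "bedrooms":         [3, 2, 4, 2, 3, 3],
    "walk_access_park": [72, 55, 88, 40, 65, 58],   # walking accessibility to open space
    "drive_access_park":[80, 70, 90, 60, 75, 68],   # driving accessibility to open space
    "drive_access_hosp":[65, 50, 70, 45, 60, 55],
})

X = sm.add_constant(df.drop(columns="sale_price"))
model = sm.OLS(df["sale_price"], X).fit()
print(model.summary())   # coefficients on the accessibility scores test hypotheses 1 and 3
```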

Keywords: hedonic model, public open space, housing sale price, regression analysis, accessibility score

Procedia PDF Downloads 133
3865 Comparative Study to Evaluate Chronological Age and Dental Age in North Indian Population Using Cameriere Method

Authors: Ranjitkumar Patil

Abstract:

Age estimation has its importance in forensic dentistry. Dental age estimation has emerged as an alternative to skeletal age determination. The methods based on stages of tooth formation, as appreciated on radiographs, seem to be more appropriate in the assessment of age than those based on skeletal development. The study was done to evaluate dental age in the north Indian population using Cameriere's method. Aims/Objectives: The study was conducted to assess the dental age of North Indian children using Cameriere's method and to compare the chronological age and dental age for validation of Cameriere's method in the north Indian population. A comparative study of two years' duration on the OPG (using PLANMECA Promax 3D) data of 497 individuals with ages ranging from 5 to 15 years was done based on a simple random sampling technique, with ethical approval obtained from the institutional ethics committee. The data, obtained based on inclusion and exclusion criteria, were analyzed by software for dental age estimation. Statistical analysis: Student's t-test was used to compare the morphological variables of males with those of females and to compare the observed age with the estimated age. The regression formula was also calculated. Results: The present study was a comparative study of 497 subjects distributed between males and females, with their dental age assessed using panoramic radiographs, following the method described by Cameriere, which is widely accepted. Statistical analysis in our study indicated that gender does not have a significant influence on age estimation (R² = 0.787). Conclusion: This infers that Cameriere's method can be effectively applied in the north Indian population.

Keywords: forensic, chronological age, dental age, skeletal age

Procedia PDF Downloads 90
3864 Statistical Analysis of Extreme Flow (Regions of Chlef)

Authors: Bouthiba Amina

Abstract:

The estimation of statistics related to precipitation is a vast domain that poses numerous challenges to meteorologists and hydrologists. Sometimes it is necessary to estimate the values of extreme events, and their return periods, for sites where there is little or no data. The search for a frequency model of daily rainfall depths is of great importance in operational hydrology: it establishes a basis for predicting the frequency and intensity of floods by estimating the amount of precipitation in past years. The best-known and most common approach is the statistical one: it consists of looking for the probability law that best fits the observed values of the random variable "daily maximum rainfall", after comparing various probability laws and estimation methods by means of goodness-of-fit tests. Therefore, a frequency analysis of the annual series of daily maximum rainfall was carried out on the data of 54 rain-gauge stations of the high and middle basin. This analysis considered five laws usually applied to the study of maximum daily rainfall frequencies. The chosen period is from 1970 to 2013, and it was used to forecast quantiles. The laws used are the three-parameter generalized extreme value law, the two-parameter extreme value laws (Gumbel and log-normal), the Pearson type III law, and the three-parameter log-Pearson III law. In Algeria, Gumbel's law has long been used to estimate the quantiles of maximum flows; here, we check the candidate laws and choose the most reliable one.
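
Fitting candidate laws to an annual-maximum series and reading off return-period quantiles can be done along the lines of the sketch below. The data are synthetic, and only two of the five laws (Gumbel and the generalized extreme value law) are shown.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
annual_max_mm = rng.gumbel(loc=45.0, scale=12.0, size=44)   # synthetic 1970-2013 maxima

# fit Gumbel (EV1) and the three-parameter GEV to the annual maxima
gum_loc, gum_scale = stats.gumbel_r.fit(annual_max_mm)
gev_shape, gev_loc, gev_scale = stats.genextreme.fit(annual_max_mm)

for T in (10, 50, 100):                        # return periods in years
    p = 1.0 - 1.0 / T                          # non-exceedance probability
    q_gum = stats.gumbel_r.ppf(p, gum_loc, gum_scale)
    q_gev = stats.genextreme.ppf(p, gev_shape, gev_loc, gev_scale)
    print(f"T={T:3d} yr: Gumbel {q_gum:6.1f} mm, GEV {q_gev:6.1f} mm")
```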

Keywords: return period, extreme flow, statistics laws, Gumbel, estimation

Procedia PDF Downloads 78
3863 Experimental Design for Formulation Optimization of Nanoparticle of Cilnidipine

Authors: Arti Bagada, Kantilal Vadalia, Mihir Raval

Abstract:

Cilnidipine is practically insoluble in water, which results in insufficient oral bioavailability. The purpose of the present investigation was to formulate cilnidipine nanoparticles by the nanoprecipitation method to increase the aqueous solubility and dissolution rate, and hence the bioavailability, utilizing statistical experimental design. Experimental designs were used to investigate the specific effects of the independent variables during the preparation of cilnidipine nanoparticles and the corresponding responses in optimizing the formulation. A Plackett-Burman design for the independent variables was successfully employed for the optimization of cilnidipine nanoparticles. The independent variables studied were drug concentration, solvent-to-antisolvent ratio, polymer concentration, stabilizer concentration, and stirring speed. The dependent variables were the average particle size, polydispersity index, zeta potential value, and saturation solubility of the formulated cilnidipine nanoparticles. The experiments were carried out according to 13 runs involving 5 independent variables (at higher and lower levels) employing the Plackett-Burman design. The cilnidipine nanoparticles were characterized by average particle size, polydispersity index, zeta potential value, and saturation solubility, and the results were 149 nm, 0.314, 43.24, and 0.0379 mg/ml, respectively. The experimental results correlated well with the data predicted by the Plackett-Burman statistical method.
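
A screening analysis of this kind, in which a two-level Plackett-Burman design is generated, the formulations are run, and each factor's main effect is estimated, can be sketched as below. The pyDOE2 package and the response values are assumptions made purely for illustration; they are not the paper's design or data.

```python
import numpy as np
from pyDOE2 import pbdesign   # assumed design-of-experiments package

factors = ["drug_conc", "solvent_ratio", "polymer_conc", "stabilizer_conc", "stir_speed"]
design = pbdesign(len(factors))          # coded -1/+1 matrix, one row per run

rng = np.random.default_rng(1)
particle_size = rng.normal(170, 20, size=design.shape[0])   # hypothetical responses (nm)

# main effect of a factor = mean response at +1 level minus mean at -1 level
for j, name in enumerate(factors):
    effect = (particle_size[design[:, j] == 1].mean()
              - particle_size[design[:, j] == -1].mean())
    print(f"{name:16s} effect on particle size: {effect:+.1f} nm")
```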

Keywords: dissolution enhancement, nanoparticles, Plackett-Burman design, nanoprecipitation

Procedia PDF Downloads 159
3862 Electrical Cardiac Remodeling in Elite Athletes: A Comparative Study between Triathletes and Cyclists

Authors: Lingxia Li, Frédéric Schnell, Thibault Lachard, Anne-Charlotte Dupont, Shuzhe Ding, Solène Le Douairon Lahaye

Abstract:

Background: Repetitive participation in triathlon training results in significant myocardial changes. However, whether the cardiac remodeling in triathletes is related to the specificities of the sport (which consists of three disciplines) raises questions. Methods: Elite triathletes and cyclists registered on the French ministerial lists of high-level athletes were included. Basic information and routine electrocardiogram records were obtained. Electrocardiograms were evaluated according to clinical criteria. Results: Of the 105 athletes included in the study, 42 were from short-distance triathlon (40%) and 63 were from road cycling (60%). The average age was 22.1 ± 4.2 years. The P wave amplitude was significantly lower in triathletes than in cyclists (p = 0.005), and no statistically significant difference was found in heart rate, RR interval, PR or PQ interval, QRS complex, QRS axis, QT interval, or QTc (p > 0.05). All the measured parameters were within normal ranges. The most common electrical manifestations were early repolarization (60.95%) and incomplete right bundle branch block (43.81%); there was no statistical difference between the groups (p > 0.05). Conclusions: Prolonged intensive endurance exercise training induces physiological cardiac remodeling in both triathletes and cyclists. The most common electrocardiogram manifestations were early repolarization and incomplete right bundle branch block.

Keywords: cardiac screening, electrocardiogram, triathlon, cycling, elite athletes

Procedia PDF Downloads 6
3861 A Study on Performance Prediction in Early Design Stage of Apartment Housing Using Machine Learning

Authors: Seongjun Kim, Sanghoon Shim, Jinwooung Kim, Jaehwan Jung, Sung-Ah Kim

Abstract:

With the development of information and communication technology (ICT), the convergence of machine learning and design is being attempted. In this way, it is possible to grasp correlations between various design elements that were previously difficult to grasp and to reflect them in the design result. In architecture, attempts are being made to predict performance, which was difficult to grasp in the past, by finding correlations among multiple factors, mainly through machine learning; several such attempts to predict performance affected by various factors have been made in the architectural design field. With machine learning, it is possible to predict performance quickly. The aim of this study is to propose a model that uses machine learning to predict performance according to the block arrangement of apartment housing, and to find the design alternative that satisfies performance requirements, such as daylight hours, in the form most similar to the alternative proposed by the designer. Through this study, a designer can proceed with the design from the early design stage while quickly considering various design alternatives and accurate performance estimates.
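
The prediction step described, learning a mapping from block-arrangement features to a performance measure such as daylight hours and then screening many candidate arrangements, could be prototyped with a standard regressor as in the sketch below. The features, data, and model choice are placeholders, not the study's actual pipeline.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(7)
# placeholder block-arrangement features: spacing, orientation, height, site coverage
X = rng.uniform([10, 0, 20, 0.2], [60, 180, 120, 0.6], size=(500, 4))
y = 4 + 0.03 * X[:, 0] - 0.01 * X[:, 2] + rng.normal(0, 0.3, 500)   # synthetic daylight hours

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
model = RandomForestRegressor(n_estimators=200, random_state=0).fit(X_train, y_train)
print("R^2 on held-out layouts:", round(model.score(X_test, y_test), 3))

# screen many candidate block arrangements quickly in the early design stage
candidates = rng.uniform([10, 0, 20, 0.2], [60, 180, 120, 0.6], size=(1000, 4))
best = candidates[np.argmax(model.predict(candidates))]
print("best candidate features:", best.round(2))
```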

Keywords: apartment housing, machine learning, multi-objective optimization, performance prediction

Procedia PDF Downloads 481
3860 Antibacterial Evaluation, in Silico ADME and QSAR Studies of Some Benzimidazole Derivatives

Authors: Strahinja Kovačević, Lidija Jevrić, Miloš Kuzmanović, Sanja Podunavac-Kuzmanović

Abstract:

In this paper, various derivatives of benzimidazole have been evaluated against the Gram-negative bacterium Escherichia coli. For all investigated compounds, the minimum inhibitory concentration (MIC) was determined. Quantitative structure-activity relationship (QSAR) modeling attempts to find consistent relationships between variations in the values of molecular properties and the biological activity for a series of compounds, so that these rules can be used to evaluate new chemical entities. The correlation between MIC and some absorption, distribution, metabolism, and excretion (ADME) parameters was investigated, and mathematical models for predicting the antibacterial activity of this class of compounds were developed. The quality of the multiple linear regression (MLR) models was validated by the leave-one-out (LOO) technique, as well as by the calculation of statistical parameters for the developed models, and the results are discussed on the basis of the statistical data. The results of this study indicate that ADME parameters have a significant effect on the antibacterial activity of this class of compounds. Principal component analysis (PCA) and agglomerative hierarchical clustering (HCA) confirmed that the investigated molecules can be classified into groups on the basis of the ADME parameters: Madin-Darby canine kidney cell permeability (MDCK), plasma protein binding (PPB%), human intestinal absorption (HIA%), and human colon carcinoma cell permeability (Caco-2).
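
An MLR model with leave-one-out validation of the kind described reads, in outline, like the following sketch; the descriptor values and activities are placeholders rather than the paper's data.

```python
import pandas as pd
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import LeaveOneOut, cross_val_predict
from sklearn.metrics import r2_score

# placeholder data: ADME descriptors vs. activity for a few benzimidazoles
data = pd.DataFrame({
    "MDCK":  [120, 300, 250, 90, 410, 180, 220, 350],
    "PPB":   [88, 92, 75, 60, 95, 80, 85, 90],
    "HIA":   [95, 99, 85, 70, 98, 90, 93, 97],
    "Caco2": [15, 30, 22, 8, 35, 18, 25, 28],
    "pMIC":  [4.1, 4.8, 4.4, 3.6, 5.0, 4.2, 4.5, 4.7],
})
X, y = data.drop(columns="pMIC"), data["pMIC"]

model = LinearRegression().fit(X, y)                        # MLR model
y_loo = cross_val_predict(LinearRegression(), X, y, cv=LeaveOneOut())
print("R^2 (fit) =", round(model.score(X, y), 3))
print("Q^2 (LOO) =", round(r2_score(y, y_loo), 3))          # leave-one-out validation
```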

Keywords: benzimidazoles, QSAR, ADME, in silico

Procedia PDF Downloads 375
3859 Analysis of Differences between Public and Experts’ Views Regarding Sustainable Development of Developing Cities: A Case Study in the Iraqi Capital Baghdad

Authors: Marwah Mohsin, Thomas Beach, Alan Kwan, Mahdi Ismail

Abstract:

This paper describes the differences in views on sustainable development between the general public and experts in a developing country, Iraq. The paper answers the question: how do the views of the public differ from the generally accepted view of experts in the context of sustainable urban development in Iraq? To answer this question, the views of both the public and the experts, taken from a public survey and a Delphi questionnaire, are analysed using statistical methods in order to identify the significant differences. This enables investigation of the differences between the public's perceptions and the experts' views of urban sustainable development factors. This is important because differing viewpoints between policy-makers and the public will affect public acceptance of any future sustainable development work that is undertaken. In brief, the statistical analysis shows that the views of the public and the experts differ on most of the variables; only six variables show no differences. Those variables are 'The importance of establishing sustainable cities in Iraq', 'Mitigate traffic congestion', 'Waste recycling and separating', 'Use wastewater recycling', 'Parks and green spaces', and 'Promote investment'.

Keywords: urban sustainability, experts views, public views, principle component analysis, PCA

Procedia PDF Downloads 127
3858 Automatic Tuning for a Systemic Model of Banking Originated Losses (SYMBOL) Tool on Multicore

Authors: Ronal Muresano, Andrea Pagano

Abstract:

Nowadays, mathematical/statistical applications are developed with greater complexity and accuracy. However, this precision and complexity mean that applications need more computational power in order to execute faster. In this sense, multicore environments play an important role in improving and optimizing the execution time of these applications, since they allow the inclusion of more parallelism inside the node. However, taking advantage of this parallelism is not an easy task, because we have to deal with problems such as core communication, data locality, memory sizes (cache and RAM), synchronization, data dependencies in the model, etc. These issues become more important when we wish to improve the application's performance and scalability. Hence, this paper describes an optimization method developed for the Systemic Model of Banking Originated Losses (SYMBOL) tool of the European Commission, which is based on analyzing the application's weaknesses in order to exploit the advantages of the multicore environment. All these improvements are done in an automatic and transparent manner with the aim of improving the performance metrics of our tool. Finally, experimental evaluations show the effectiveness of our new optimized version, in which we have achieved a considerable improvement in the execution time. The time has been reduced by around 96% for the best case tested, between the original serial version and the automatic parallel version.

Keywords: algorithm optimization, bank failures, OpenMP, parallel techniques, statistical tool

Procedia PDF Downloads 369
3857 Elephant Herding Optimization for Service Selection in QoS-Aware Web Service Composition

Authors: Samia Sadouki Chibani, Abdelkamel Tari

Abstract:

Web service composition combines available services to provide new functionality. Given the number of available services with similar functionalities and different non-functional aspects (QoS), the problem of finding a QoS-optimal web service composition is an optimization problem belonging to the NP-hard class. Thus, an optimal solution cannot be found by exact algorithms within a reasonable time. In this paper, a bio-inspired meta-heuristic is presented to address QoS-aware web service composition; it is based on the Elephant Herding Optimization (EHO) algorithm, which is inspired by the herding behavior of elephant groups. EHO is characterized by a process of dividing and combining the population into sub-populations (clans); this process allows the exchange of information between local searches to move toward a global optimum. In contrast, with other evolutionary algorithms, the problem of early stagnation in a local optimum cannot be avoided. The experimental evaluation shows that, compared with PSO, our proposal significantly outperforms the existing algorithm, with a better fitness value and fast convergence.
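
In outline, the clan-updating and separating operators of EHO look roughly like the sketch below, applied here to a generic fitness function. The parameter values and the fitness itself are placeholders; the paper's service-selection encoding and QoS aggregation are not reproduced.

```python
import numpy as np

rng = np.random.default_rng(3)

def fitness(x):                      # placeholder QoS aggregate to MINIMIZE
    return np.sum((x - 0.3) ** 2)

DIM, N_CLANS, CLAN_SIZE, GENERATIONS = 10, 5, 8, 100
ALPHA, BETA = 0.5, 0.1

clans = [rng.random((CLAN_SIZE, DIM)) for _ in range(N_CLANS)]
for _ in range(GENERATIONS):
    for c, clan in enumerate(clans):
        order = np.argsort([fitness(x) for x in clan])
        matriarch, center = clan[order[0]], clan.mean(axis=0)
        new = clan.copy()
        for i in order[1:]:          # clan updating: move members toward the matriarch
            new[i] = clan[i] + ALPHA * (matriarch - clan[i]) * rng.random(DIM)
        new[order[0]] = BETA * center        # matriarch follows the clan center
        new[order[-1]] = rng.random(DIM)     # separating: worst member leaves the clan
        clans[c] = np.clip(new, 0.0, 1.0)

best = min((x for clan in clans for x in clan), key=fitness)
print("best fitness:", round(fitness(best), 4))
```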

Keywords: bio-inspired algorithms, elephant herding optimization, QoS optimization, web service composition

Procedia PDF Downloads 327
3856 Analysis of Financial Time Series by Using Ornstein-Uhlenbeck Type Models

Authors: Md Al Masum Bhuiyan, Maria C. Mariani, Osei K. Tweneboah

Abstract:

In the present work, we develop a technique for estimating the volatility of financial time series by using stochastic differential equations. Taking the daily closing prices from developed and emerging stock markets as the basis, we argue that the incorporation of stochastic volatility into the time-varying parameter estimation significantly improves the forecasting performance via maximum likelihood estimation. Using the technique, we observe the long-memory behavior of the data sets and the one-step-ahead predicted log-volatility with ±2 standard errors, even though the observed noise varies according to a normal mixture distribution, because the financial data studied are not fully Gaussian. Also, the Ornstein-Uhlenbeck process followed in this work simulates the financial time series well, which makes our estimation algorithm suitable for large data sets, since the algorithm has good convergence properties.
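
One simple way to estimate Ornstein-Uhlenbeck parameters from a series, exploiting the exact AR(1) form of its discretization, is sketched below on simulated data; it illustrates the general estimation idea rather than the authors' full stochastic-volatility algorithm.

```python
import numpy as np

rng = np.random.default_rng(42)

# simulate an OU process dX = theta*(mu - X) dt + sigma dW (true values below)
theta, mu, sigma, dt, n = 2.0, 0.1, 0.4, 1 / 252, 5000
x = np.empty(n); x[0] = mu
for t in range(1, n):
    x[t] = x[t-1] + theta * (mu - x[t-1]) * dt + sigma * np.sqrt(dt) * rng.standard_normal()

# exact discretization is AR(1): X_{t+1} = a + b*X_t + eps, with b = exp(-theta*dt)
b, a = np.polyfit(x[:-1], x[1:], 1)
resid = x[1:] - (a + b * x[:-1])
theta_hat = -np.log(b) / dt
mu_hat = a / (1 - b)
sigma_hat = np.std(resid) * np.sqrt(2 * theta_hat / (1 - b ** 2))

print(f"theta ~ {theta_hat:.2f}, mu ~ {mu_hat:.3f}, sigma ~ {sigma_hat:.2f}")
```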

Keywords: financial time series, maximum likelihood estimation, Ornstein-Uhlenbeck type models, stochastic volatility model

Procedia PDF Downloads 242
3855 Low Plastic Deformation Energy to Induce High Superficial Strain on AZ31 Magnesium Alloy Sheet

Authors: Emigdio Mendoza, Patricia Fernandez, Cristian Gomez

Abstract:

Magnesium alloys have generated great interest for several industrial applications because their high specific strength and low density make them a very attractive alternative for the manufacture of various components. However, their hexagonal crystal structure limits the deformation mechanisms available at room temperature and, likewise, the alternatives for forming components. For this reason, severe plastic deformation processes have recently gained great relevance, because they allow high deformation rates to be applied that induce microstructural changes in which the deficiency of slip systems is compensated by crystallographic grain reorientation or twinning. The present study reports a statistical analysis of the process temperature, number of passes, and shear angle with respect to the shear stress in the severe plastic deformation process called Equal Channel Angular Sheet Drawing (ECASD), applied to the magnesium alloy AZ31B, using the Python Statsmodels library; additionally, a post-hoc range test is performed using the Tukey statistical test. The statistical results show that each variable has a p-value lower than 0.05, which allows the average shear stresses obtained to be compared; these range from 7.37 MPa to 12.23 MPa, lower values than in other severe plastic deformation processes reported in the literature, considering a value of 157.53 MPa as the average creep stress for the AZ31B alloy. However, a higher stress level is required when the sheets are processed using a shear angle of 150°, due to the higher level of adjustment applied in the 150° shear die. Temperature and the number of shear passes are important variables as well, but they have no significant impact on the level of stress applied during the ECASD process. In the processing of AZ31B magnesium alloy sheets, the ECASD technique is shown to be a viable alternative for modifying the elasto-plastic properties of this alloy, promoting the weakening of the basal texture, which means a better response to deformation; thereby, during the manufacture of parts by drawing or stamping processes, the formation of cracks on the surface can be reduced while maintaining adequate mechanical performance.
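
A Statsmodels analysis of the kind described, a factorial ANOVA on the shear stress followed by a Tukey post-hoc range test, typically follows the pattern sketched below; the run data here are invented placeholders, not the measured ECASD results.

```python
import pandas as pd
from statsmodels.formula.api import ols
from statsmodels.stats.anova import anova_lm
from statsmodels.stats.multicomp import pairwise_tukeyhsd

# placeholder ECASD runs: shear stress (MPa) by shear angle, temperature, passes
df = pd.DataFrame({
    "shear_stress": [7.4, 8.1, 9.0, 9.6, 11.8, 12.2, 10.9, 11.5, 8.3, 8.9, 12.0, 11.2],
    "angle_deg":    [120, 120, 135, 135, 150, 150, 150, 150, 120, 135, 150, 135],
    "temp_C":       [25, 100, 25, 100, 25, 100, 25, 100, 100, 25, 25, 100],
    "passes":       [1, 1, 2, 2, 1, 1, 2, 2, 2, 1, 2, 1],
})

model = ols("shear_stress ~ C(angle_deg) + C(temp_C) + C(passes)", data=df).fit()
print(anova_lm(model, typ=2))                                   # p-values per factor

tukey = pairwise_tukeyhsd(df["shear_stress"], df["angle_deg"])  # post-hoc range test
print(tukey.summary())
```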

Keywords: plastic deformation, strain, sheet drawing, magnesium

Procedia PDF Downloads 109
3854 Adaptive Decision Feedback Equalizer Utilizing Fixed-Step Error Signal for Multi-Gbps Serial Links

Authors: Alaa Abdullah Altaee

Abstract:

This paper presents an adaptive decision feedback equalizer (ADFE) for multi-Gbps serial links utilizing a fixed-step error signal extracted from the cross-points of received data symbols. The extracted signal is generated based on violations of the minimum detection requirements by the received data symbols at the clock and data recovery (CDR) stage. The iterations of the adaptation process search for the optimum feedback tap coefficients to maximize the data eye opening and minimize the adaptation convergence time. The effectiveness of the proposed architecture is validated using simulation results for a serial link designed in an IBM 130 nm 1.2 V CMOS technology. The data link with variable channel lengths is analyzed using Spectre from Cadence Design Systems with BSIM4 device models.
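
A behavioural sketch of fixed-step DFE tap adaptation, in which each iteration nudges the feedback taps by a constant step in the direction given by the signs of the error and of the past decisions, is given below. The tap count, step size, and toy channel are assumptions for illustration, not the 130 nm circuit implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

N_TAPS, MU, N_BITS = 2, 0.005, 20000
channel_isi = np.array([0.25, 0.1])          # toy post-cursor ISI from the channel

bits = rng.choice([-1.0, 1.0], size=N_BITS)  # transmitted symbols
taps = np.zeros(N_TAPS)                      # feedback tap coefficients to adapt
history = np.zeros(N_TAPS)                   # past decided symbols

for k in range(N_BITS):
    rx = bits[k] + channel_isi @ history     # received sample with post-cursor ISI
    eq = rx - taps @ history                 # DFE subtracts the estimated ISI
    decision = 1.0 if eq >= 0 else -1.0
    error = eq - decision                    # fixed-step (sign-sign) tap update
    taps += MU * np.sign(error) * np.sign(history)
    history = np.concatenate(([decision], history[:-1]))

print("adapted taps:", taps.round(3), "(target ~", channel_isi, ")")
```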

Keywords: adaptive DFE, CMOS equalizer, error detection, serial links, timing jitter, wire-line communication

Procedia PDF Downloads 120
3853 Stability and Performance Improvement of a Two-Degree-of-Freedom Robot under Interaction Using the Impedance Control

Authors: Seyed Reza Mirdehghan, Mohammad Reza Haeri Yazdi

Abstract:

In this paper, the stability and performance of a two-degree-of-freedom robot under interaction with an unknown environment are investigated. The time for the robot to return to its initial position after an interaction and the primary resistance of the robot against the impact must be reduced; thus, the torque applied to the motor will be reduced. Impedance control is an appropriate method for robot control under these conditions. The stability of the robot at the moment of interaction was reformulated as a robust stability problem. The dynamics of the unknown environment were modeled as a weighting function, and the stability of the robot under interaction with the environment was investigated using robust control concepts. To improve the performance of the system, a force controller was designed with which the normalized impedance after interaction is reduced. The resistance of the robot was considered as a normalized cost function, and its value was 0.593. The results showed a reduction of the robot's resistance against impact and a convergence time of less than one second.
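
The core of an impedance controller, imposing a target mass-spring-damper relation between the contact force and the position deviation, can be written compactly. The sketch below uses a single degree of freedom and illustrative gains, not the paper's two-degree-of-freedom model or its robust-control design.

```python
import numpy as np

# target impedance M*x_dd + B*x_d + K*x = F_ext  (illustrative gains)
M, B, K = 1.0, 8.0, 40.0
dt, steps = 0.001, 3000

x, x_d = 0.0, 0.0                       # deviation from the reference position
trajectory = []
for k in range(steps):
    f_ext = 5.0 if k < 500 else 0.0     # short impact, then free motion
    x_dd = (f_ext - B * x_d - K * x) / M
    x_d += x_dd * dt
    x += x_d * dt
    trajectory.append(x)

above = np.abs(trajectory) > 1e-3
settle = (np.nonzero(above)[0].max() + 1) * dt if above.any() else 0.0
print(f"peak deviation {max(trajectory):.3f} m, settles within {settle:.2f} s")
```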

Keywords: impedance control, control system, robots, interaction

Procedia PDF Downloads 430
3852 Numerical Solutions of an Option Pricing Rainfall Derivatives Model

Authors: Clarinda Vitorino Nhangumbe, Ercília Sousa

Abstract:

Weather derivatives are financial products used to cover non-catastrophic weather events, with a weather index as the underlying asset. The rainfall derivative pricing model is built on the assumption that the rainfall dynamics follow an Ornstein-Uhlenbeck process, and the partial differential equation approach is used to derive a two-dimensional, time-dependent convection-diffusion partial differential equation, where the spatial variables are the rainfall index and the rainfall depth. To compute approximate solutions of the partial differential equation, appropriate boundary conditions are suggested, and an explicit numerical method is proposed in order to deal efficiently with the different choices of the coefficients involved in the equation. Being explicit, the numerical method is conditionally stable, so the stability region of the method and the order of convergence are discussed. The model is tested on real precipitation data.

Keywords: finite differences method, ornstein-uhlenbeck process, partial differential equations approach, rainfall derivatives

Procedia PDF Downloads 105
3851 Rhythm-Reading Success Using Conversational Solfege

Authors: Kelly Jo Hollingsworth

Abstract:

Conversational Solfege, a research-based, 12-step music literacy instructional method using the sound-before-sight approach, was used to teach rhythm-reading to 128 second-grade students at a public school in the southeastern United States. For each step, multiple scripted techniques are supplied to teach each skill. Unit one, which covers quarter-note and barred eighth-note rhythms, was the focus of this study. During regular weekly music instruction, students completed method steps one through five, which include aural discrimination, decoding familiar and unfamiliar rhythm patterns, and improvising rhythmic phrases using quarter notes and barred eighth notes. Intact classes were randomly assigned to two treatment groups for teaching steps six through eight, which were the visual presentation and identification of quarter notes and barred eighth notes, visually presenting and decoding familiar patterns, and visually presenting and decoding unfamiliar patterns using said notation. For three weeks, students practiced steps six through eight during regular weekly music class. One group spent five minutes of class time on technique work for steps six through eight, while the other group spent ten minutes of class time practicing the same techniques. A pretest and posttest were administered, and ANOVA results reveal that both the five-minute (p < .001) and the ten-minute group (p < .001) reached statistical significance, suggesting Conversational Solfege is an efficient, effective approach to teaching rhythm-reading to second-grade students. After two weeks of no instruction, students were retested to measure retention. Using a repeated-measures ANOVA, both groups reached statistical significance (p < .001) on the second posttest, suggesting both the five-minute and the ten-minute groups retained rhythm-reading skill after two weeks of no instruction. Statistical significance was not reached between groups (p = .252), suggesting five minutes is as effective as ten minutes of rhythm-reading practice using Conversational Solfege techniques. Future research includes replicating the study with other grades and units in the text.

Keywords: conversational solfege, length of instructional time, rhythm-reading, rhythm instruction

Procedia PDF Downloads 157
3850 Minimum Pension Guarantee in Funded Pension Schemes: Theoretical Model and Global Implementation

Authors: Ishay Wolf

Abstract:

In this study, the financial position of pension actors in the market during the pension system transition toward a more funded, capitalized scheme is explored, mainly via an option benefit model. This is enabled by not considering the economy as a single earning cohort. We analytically demonstrate a socio-economic anomaly in the funded pension system, which favors high-earning cohorts at the expense of low-earning cohorts. This anomaly is realized through a lack of insurance and exposure to financial and systemic risks. Furthermore, the anomaly might lead to pension re-reform back to an unfunded scheme, mostly due to political pressure. We find that a minimum pension guarantee is a rebalancing mechanism for this anomaly, which increases the probability of a sustainable pension scheme. Specifically, we argue that implementing the guarantee with an intra-generational, risk-sharing mechanism is the most efficient way to reduce the effect of this abnormality. Moreover, we exhibit the convergence toward implementing a minimum pension guarantee in many countries that have capitalized their pension systems during the last three decades, particularly Latin American and CEE countries.

Keywords: benefits, pension scheme, put option, social security

Procedia PDF Downloads 122
3849 Investigating the Relationship Between Corporate Governance and Financial Performance Considering the Moderating Role of Opinion and Internal Control Weakness

Authors: Fatemeh Norouzi

Abstract:

Today, financial performance has become one of the important issues in accounting and auditing; companies and their managers pay attention to it and, for this reason, to the variables that influence it. One factor that can affect financial performance is corporate governance, which is examined in this research, although factors such as auditing-related issues can also moderate this relationship. Therefore, this research investigates the relationship between corporate governance and financial performance with regard to the moderating role of audit opinion and internal control weakness. The research is applied in terms of purpose and, in terms of method, is descriptive and ex post facto, with the analysis based on stock market data. Data were collected from stock exchange records extracted from the website of the Iraqi Stock Exchange; the statistical population of this research is all companies admitted to the Iraqi Stock Exchange. The statistical sample covers 2014 to 2021 and includes 34 companies. Four different models were considered for the eight research hypotheses, and the analysis was done using Excel and Stata 15 software. Collinearity tests, integration tests, determination of fixed effects, and correlation matrix results were used. The research results showed that the first four hypotheses were rejected and the second four hypotheses were confirmed.

Keywords: size of the board of directors, duality of the CEO, financial performance, internal control weakness

Procedia PDF Downloads 88
3848 A Comparative Study to Evaluate Chronological Age and Dental Age in the North Indian Population Using Cameriere's Method

Authors: Ranjitkumar Patil

Abstract:

Age estimation has importance in forensic dentistry. Dental age estimation has emerged as an alternative to skeletal age determination. The methods based on stages of tooth formation, as appreciated on radiographs, seem to be more appropriate in the assessment of age than those based on skeletal development. The study was done to evaluate dental age in the north Indian population using Cameriere's method. Aims/Objectives: The study was conducted to assess the dental age of North Indian children using Cameriere's method and to compare the chronological age and dental age for validation of Cameriere's method in the north Indian population. A comparative study of two years' duration on the OPG (using PLANMECA Promax 3D) data of 497 individuals with ages ranging from 5 to 15 years was done based on a simple random sampling technique, with ethical approval obtained from the institutional ethics committee. The data were obtained based on inclusion and exclusion criteria and analyzed by software for dental age estimation. Statistical analysis: The Student's t-test was used to compare the morphological variables of males with those of females and to compare observed age with estimated age. The regression formula was also calculated. Results: The present study was a comparative study of 497 subjects distributed between males and females, with their dental age assessed using panoramic radiographs, following the method described by Cameriere, which is widely accepted. Statistical analysis in our study indicated that gender does not have a significant influence on age estimation (R² = 0.787). Conclusion: This infers that Cameriere's method can be effectively applied to the north Indian population.

Keywords: forensic, dental age, skeletal age, chronological age, Cameriere’s method

Procedia PDF Downloads 115
3847 Web-Based Alcohol Prevention among Iranian Medical University Students: A Randomized Control Trail

Authors: Farzad Jalilian, Mehdi Mirzaei Alavijeh

Abstract:

Background: E-interventions are a universal approach to preventing high-risk behaviors such as alcohol drinking. This study was conducted to evaluate the efficiency of a web-based alcohol drinking prevention intervention among medical university students in Iran. Methods: Overall, 150 freshman and sophomore male college students participated in this study, divided into intervention and control groups. This was a longitudinal, randomized, pre- and post-test control-group panel study implementing a behavior-modification-based intervention for alcohol drinking prevention among college students. Cross-tabulation, t-tests, repeated measures, and GEE, using the SPSS statistical package version 21, were used for the statistical analysis. The participants were followed up for 6 months, with data collection scheduled at baseline, 3, and 6 months. The primary outcomes are attitude, self-control, and sensation seeking. The secondary outcome is the comparison of alcohol drinking between the study groups. Results: A significant reduction was found in the average scores for attitude towards alcohol drinking and sensation seeking in the intervention group (P < 0.05). However, after the intervention, no significant difference was found between the intervention and control groups in improved self-control or reduced alcohol drinking (P > 0.05). Conclusion: Our intervention was accompanied by a reduction in the alcohol use rate. These findings indicate that e-interventions may be an effective approach to addressing alcohol prevention among college students.

Keywords: e-interventions, alcohol drinking, students, Iran

Procedia PDF Downloads 414
3846 An Exploratory Study of Vocational High School Students’ Needs in Learning English

Authors: Yi-Hsuan Gloria Lo

Abstract:

The educational objective of vocational high schools (VHSs) is to equip VHS students with practical skills and knowledge that can be applied in the job-related market. However, with the increasing number of technological universities over the past two decades, the majority of VHS students have chosen to receive higher education rather than enter the job market. VHS English education has been confronting a dilemma: Should an English for specific purposes (ESP) approach, which aligns with the educational goal of VHS education, be taken, or should an English for general purposes (EGP) approach, which prepares VHS students for advanced studies in universities, be followed? While ESP theorists proposed that ESP can be taught to secondary learners, little was known about VHS students' perspective on this ESP-versus-EGP dilemma. Scant research has investigated different facets of students' needs (necessities, wants, and lacks) for both ESP and EGP in terms of the four language skills and the factors that contribute to any differences. To address the gap in the literature, 100 VHS students responded to statements related to their necessities, wants, and lacks in learning ESP and EGP on a 6-point Likert scale. Six VHS students were interviewed to tap into the reasons for the different facets of the needs for learning EGP and ESP. The statistical analysis indicates that at this stage of learning English, VHS subjects believed that EGP was more necessary than ESP; EGP was more desirable than ESP. However, they reported that they were more lacking in ESP than in EGP learning. Regarding EGP, the results show that the VHS subjects rated speaking as their most necessary skill, speaking as the most desirable skill, and writing as the most lacking skill. A significant difference was found between perceived learning necessities and lacks and between perceived wants and lacks. No statistical difference was found between necessities and wants. In the aspect of ESP, the results indicate that the VHS subjects marked reading as their most necessary skill, speaking as the most desirable skill, and writing as the most lacking skill. A significant difference exists between their perceived necessities and lacks and between their wants and lacks. However, there is no statistically significant difference between their perceived lacks and wants. Despite the lack of a significant difference between learning necessities and wants, the qualitative interview data reveal that the reasons for their perceived necessities and wants were different. The findings of the study confirm previous research demonstrating that 'needs' is a multiple and conflicting construct. What VHS students felt most lacking was not necessarily what they believed they should learn or would like to learn. Although no statistical difference was found, different reasons were attributed to their perceived necessities and wants. Both theoretical and practical implications have been drawn and discussed for ESP research in general and teaching ESP in VHSs in particular.

Keywords: vocational high schools (VHSs), English for General Purposes (EGP), English for Specific Purposes (ESP), needs analysis

Procedia PDF Downloads 171
3845 Evaluation the Financial and Social Efficiency of Microfinance Institutions Using Data Envelope Analysis - A Sample Study of Active Microfinance Institutions in India

Authors: Hiba Mezaache

Abstract:

The study aims to assess the financial and social efficiency of microfinance institutions in India for the period 2015-2019 by using two economies-of-scale models with an output orientation of the data envelopment analysis (DEA) method, drawing on the MIX Market database. The study concluded that microfinance institutions focus on achieving financial efficiency more than on achieving social efficiency in order to ensure their continuity in the market. There is convergence in the efficiency ratios achieved, but the optimum ratios were achieved under changing economies of scale. Efficiency is affected by the depth of outreach to low-income groups, as serving this group raises costs and risks. The study highlights the importance of lending to women in rural areas and raising their awareness to ensure their financial and social empowerment, and of making improvements in operating expenses, asset management, and loan personnel control in order to maximize outputs.

Keywords: microfinance, financial efficiency, social efficiency, mix market, microfinance institutions

Procedia PDF Downloads 157
3844 The Modification of the Mixed Flow Pump with Respect to Stability of the Head Curve

Authors: Roman Klas, František Pochylý, Pavel Rudolf

Abstract:

This paper is focused on the CFD simulation of the radiaxial pump (i.e., mixed flow pump) with the aim of detecting the reasons for the Y-Q characteristic instability. The main causes of pressure pulsations were detected by analyzing the velocity and pressure fields within the pump, combined with a theoretical approach. Consequently, modifications of the spiral case and the pump suction area were made based on the knowledge of the flow conditions and the shape of the dissipation function. The primary pump geometry was created as the base model serving for the comparison of the influence of the individual modifications. Basic experimental data are available for this geometry. This approach replaced the more complicated calculation for compressible liquid flow, which is more difficult with respect to the convergence of all computational tasks. The modification of the primary pump consisted of inserting three types of fins. Subsequently, the evaluation of pressure pulsations, specific energy curves, and visualization of velocity fields were chosen as the criteria for a successful design.

Keywords: CFD, radiaxial pump, spiral case, stability

Procedia PDF Downloads 397
3843 Effects of Process Parameters on the Yield of Oil from Coconut Fruit

Authors: Ndidi F. Amulu, Godian O. Mbah, Maxwel I. Onyiah, Callistus N. Ude

Abstract:

The properties of coconut (Cocos nucifera) and its oil were evaluated in this work using standard analytical techniques. The analyses carried out included the proximate composition of the fruit, extraction of oil from the fruit using different process parameters, and physicochemical analysis of the extracted oil. The results showed the percentage (%) moisture, crude lipid, crude protein, ash, and carbohydrate content of the coconut as 7.59, 55.15, 5.65, 7.35, and 19.51, respectively. The oil from the coconut fruit was an odourless, yellowish liquid at room temperature (30 °C). The treatment combinations used (leaching time, leaching temperature, and solute:solvent ratio) showed significant differences (P < 0.05) in the yield of oil from coconut flour. The oil yield ranged between 36.25% and 49.83%. Lipid indices of the coconut oil indicated the acid value (AV) as 10.05 NaOH/g of oil, free fatty acid (FFA) as 5.03%, saponification value (SV) as 183.26 mg KOH/g of oil, iodine value (IV) as 81.00 I2/g of oil, peroxide value (PV) as 5.00 ml/g of oil, and viscosity (V) as 0.002. The standard statistical package Minitab version 16.0 was used for the regression analysis and analysis of variance (ANOVA). The same software was also used to generate various plots such as the single-effect plot, interaction-effect plot, and contour plot. The response, or yield of oil from the coconut flour, was used to develop a mathematical model that correlates the yield to the process variables studied. The optimum conditions that gave the highest yield of coconut oil were a leaching time of 2 hrs, a leaching temperature of 50 °C, and a solute/solvent ratio of 0.05 g/ml.

Keywords: coconut, oil-extraction, optimization, physicochemical, proximate

Procedia PDF Downloads 353