Search results for: statistical arbitrage
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 4012

3952 GPS Refinement in Cities Using Statistical Approach

Authors: Ashwani Kumar

Abstract:

GPS plays an important role in everyday life for safe and convenient transportation. While pedestrians use hand-held devices to know their position in a city, vehicles in intelligent transport systems use relatively sophisticated GPS receivers for estimating their current position. However, in urban areas, where GPS satellites are occluded by tall buildings and trees and GPS signals are reflected from nearby vehicles, position estimation becomes poor. In this work, exhaustive GPS data are collected at a single point in an urban area at different times of day and under dynamic environmental conditions. The data are analyzed, and statistical refinement methods are used to obtain an optimal position estimate among all the measured positions. The results are compared with publicly available datasets, and the obtained position refinement results are promising.
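As a rough illustration of this kind of statistical refinement (not the authors' implementation; the coordinates and the sigma-rule rejection threshold below are assumptions), a static position can be re-estimated from repeated fixes by discarding outlying readings and taking the least-squares (mean) estimate of the rest:

import numpy as np

def refine_position(fixes, z=2.0):
    """fixes: (n, 2) array of (lat, lon) readings taken at a single point.
    Returns a refined estimate after discarding fixes far from the bulk."""
    fixes = np.asarray(fixes, dtype=float)
    center = fixes.mean(axis=0)                    # least-squares estimate = sample mean
    dist = np.linalg.norm(fixes - center, axis=1)  # spread of each individual fix
    keep = dist <= dist.mean() + z * dist.std()    # simple sigma-rule outlier rejection
    return fixes[keep].mean(axis=0)

# hypothetical repeated fixes scattered around one true location
fixes = np.random.normal([28.6139, 77.2090], [1e-4, 1e-4], size=(500, 2))
print(refine_position(fixes))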

Keywords: global positioning system, statistical approach, intelligent transport systems, least squares estimation

Procedia PDF Downloads 288
3951 Application of Electrical Resistivity, Induced Polarization and Statistical Methods in Chichak Iron Deposit Exploration

Authors: Shahrzad Maghsoodi, Hamid Reza Ranazi

Abstract:

This paper is devoted to the exploration of the Chichak (hematite) deposit using electrical resistivity, chargeability and statistical methods. The Chichak hematite deposit is located in the Chichak area of West Azerbaijan, northwest Iran. There are some outcrops of hematite bodies in the area. The goal of this study was to identify the depth, thickness and shape of these bodies and to explore other probable hematite bodies. Therefore, nine profiles were surveyed by the resistivity (RS) and IP methods, utilizing an innovative electrode array, the so-called CRSP (Combined Resistivity Sounding and Profiling) array. IP and RS sections were completed along each profile. In addition, the RS and IP data were analyzed, and the relation between these two variables was determined by statistical tools. Finally, hematite bodies were identified in each of the sections. The results showed that hematite bodies have a resistivity lower than 125 Ωm and very low chargeability, lower than 8 mV/V. After the geophysical study, some points were proposed for drilling; the results obtained from drilling confirm the geophysical results.

Keywords: hematite deposit, iron exploration, electrical resistivity, chargeability, Iran, Chichak, statistical methods, CRSP electrode array

Procedia PDF Downloads 78
3950 Forecasting the Influences of Information and Communication Technology on the Structural Changes of Japanese Industrial Sectors: A Study Using Statistical Analysis

Authors: Ubaidillah Zuhdi, Shunsuke Mori, Kazuhisa Kamegai

Abstract:

The purpose of this study is to forecast the influences of Information and Communication Technology (ICT) on the structural changes of the Japanese economy based on Leontief Input-Output (IO) coefficients. This study establishes a statistical analysis to predict the future interrelationships among industries. We employ the Constrained Multivariate Regression (CMR) model to analyze the historical changes of input-output coefficients. Statistical significance of the model is then tested by the Likelihood Ratio Test (LRT). In our model, ICT is represented by two explanatory variables, i.e., computers (including main parts and accessories) and telecommunications equipment. A previous study, which analyzed the influences of these variables on the structural changes of Japanese industrial sectors from 1985 to 2005, concluded that these variables had significant influences on the changes in the business circumstances of the Japanese commerce, business services and office supplies, and personal services sectors. The projected future Japanese economic structure based on the above forecast generates the differentiated direct and indirect outcomes of ICT penetration.

Keywords: forecast, ICT, industrial structural changes, statistical analysis

Procedia PDF Downloads 375
3949 An Adjusted Network Information Criterion for Model Selection in Statistical Neural Network Models

Authors: Christopher Godwin Udomboso, Angela Unna Chukwu, Isaac Kwame Dontwi

Abstract:

In selecting a statistical neural network model, the Network Information Criterion (NIC) has been observed to be sample-biased, because it does not account for sample size. The selection of a model from a set of fitted candidate models requires objective, data-driven criteria. In this paper, we derive and investigate the Adjusted Network Information Criterion (ANIC), based on Kullback's symmetric divergence, which is designed to be an asymptotically unbiased estimator of the expected Kullback-Leibler information of a fitted model. The analyses show that, in general, the ANIC improves model selection across more sample sizes than the NIC does.
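The ANIC's exact penalty term is derived in the paper; purely as an analogy for how a sample-size adjustment changes model selection (in the spirit of AIC versus its small-sample correction AICc, using invented polynomial candidate models rather than neural networks), a sketch might look like:

import numpy as np

def gaussian_loglik(resid, n):
    s2 = np.mean(resid ** 2)
    return -0.5 * n * (np.log(2 * np.pi * s2) + 1)

rng = np.random.default_rng(0)
n = 25
x = rng.uniform(-1, 1, n)
y = np.sin(2 * x) + rng.normal(scale=0.2, size=n)

for degree in (1, 3, 6, 10):                      # candidate models of growing size
    coef = np.polyfit(x, y, degree)
    resid = y - np.polyval(coef, x)
    k = degree + 2                                 # coefficients plus noise variance
    ll = gaussian_loglik(resid, n)
    ic = -2 * ll + 2 * k                           # unadjusted criterion
    icc = ic + 2 * k * (k + 1) / (n - k - 1)       # small-sample adjusted version
    print(degree, round(ic, 1), round(icc, 1))

With few observations, the adjusted criterion penalizes the larger candidate models more heavily, which is the kind of sample-size sensitivity the unadjusted NIC lacks.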

Keywords: statistical neural network, network information criterion, adjusted network information criterion, transfer function

Procedia PDF Downloads 567
3948 Investigating Visual Statistical Learning during Aging Using the Eye-Tracking Method

Authors: Zahra Kazemi Saleh, Bénédicte Poulin-Charronnat, Annie Vinter

Abstract:

This study examines the effects of aging on visual statistical learning, using eye-tracking techniques to investigate this cognitive phenomenon. Visual statistical learning is a fundamental brain function that enables the automatic and implicit recognition, processing, and internalization of environmental patterns over time. Some previous research has suggested the robustness of this learning mechanism throughout the aging process, underscoring its importance in the context of education and rehabilitation for the elderly. The study included three distinct groups of participants: 21 young adults (Mage: 19.73), 20 young-old adults (Mage: 67.22), and 17 old-old adults (Mage: 79.34). Participants were exposed to a series of 12 arbitrary black shapes organized into 6 pairs, each with different spatial configurations and orientations (horizontal, vertical, and oblique). These pairs were not explicitly revealed to the participants, who were instructed to passively observe 144 grids presented sequentially on the screen for a total duration of 7 min. In the subsequent test phase, participants performed a two-alternative forced-choice task in which they had to identify the most familiar pair from 48 trials, each consisting of a base pair and a non-base pair. Behavioral analysis using t-tests revealed notable findings. The mean score for the first group was significantly above chance, indicating the presence of visual statistical learning. Similarly, the second group also performed significantly above chance, confirming the persistence of visual statistical learning in young-old adults. Conversely, the third group, consisting of old-old adults, showed a mean score that was not significantly above chance. This lack of statistical learning in the old-old adult group suggests a decline in this cognitive ability with age. Preliminary eye-tracking results showed a decrease in the number and duration of fixations during the exposure phase for all groups. The main difference was that older participants fixated empty grid cells more often than younger participants, likely due to a decline in the ability to ignore irrelevant information, resulting in a decrease in statistical learning performance.
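The behavioural test described here reduces to a one-sample t-test of two-alternative forced-choice accuracy against the 50% chance level, run separately per age group. A minimal sketch (with simulated accuracy scores, not the study's data):

import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
groups = {
    "young adults": rng.normal(0.62, 0.10, 21),   # hypothetical 2AFC accuracies
    "young-old":    rng.normal(0.58, 0.10, 20),
    "old-old":      rng.normal(0.51, 0.10, 17),
}
for name, scores in groups.items():
    t, p = stats.ttest_1samp(scores, popmean=0.5)  # H0: accuracy equals chance
    print(f"{name}: mean={scores.mean():.2f}, t={t:.2f}, p={p:.3f}")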

Keywords: aging, eye tracking, implicit learning, visual statistical learning

Procedia PDF Downloads 77
3947 Employer Learning, Statistical Discrimination and University Prestige

Authors: Paola Bordon, Breno Braga

Abstract:

This paper investigates whether firms use university prestige to statistically discriminate among college graduates. The test is based on the employer learning literature which suggests that if firms use a characteristic for statistical discrimination, this variable should become less important for earnings as a worker gains labor market experience. In this framework, we use a regression discontinuity design to estimate a 19% wage premium for recent graduates of two of the most selective universities in Chile. However, we find that this premium decreases by 3 percentage points per year of labor market experience. These results suggest that employers use college selectivity as a signal of workers' quality when they leave school. However, as workers reveal their productivity throughout their careers, they become rewarded based on their true quality rather than the prestige of their college.
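The employer-learning test can be illustrated with a simple wage regression in which the college-selectivity premium interacts with experience; a negative interaction coefficient reproduces the pattern reported here (roughly a 19% premium at graduation shrinking by about 3 percentage points per year). The data below are simulated for illustration, not the Chilean data used in the paper:

import numpy as np

rng = np.random.default_rng(2)
n = 5000
selective = rng.integers(0, 2, n)                  # 1 = selective-university graduate
exper = rng.uniform(0, 10, n)                      # years of labour-market experience
ability = rng.normal(size=n)
log_wage = (0.19 * selective - 0.03 * selective * exper
            + 0.05 * exper + 0.10 * ability + rng.normal(scale=0.3, size=n))

# OLS of log wage on a constant, selectivity, experience and their interaction
X = np.column_stack([np.ones(n), selective, exper, selective * exper])
beta, *_ = np.linalg.lstsq(X, log_wage, rcond=None)
print(dict(zip(["const", "selective", "exper", "selective_x_exper"], beta.round(3))))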

Keywords: employer learning, statistical discrimination, college returns, college selectivity

Procedia PDF Downloads 580
3946 Statistical Manufacturing Cell/Process Qualification Sample Size Optimization

Authors: Angad Arora

Abstract:

In production operations, a cell or line is typically a group of similar machines (computer numerical control (CNC), advanced cutting, 3D printing, or special-purpose machines). To qualify a typical manufacturing line, cell, or new process, we ideally need a sample of parts that can be run through the process, after which we make a judgment on the health of the line or cell. However, at mass-production volumes, such as in the mobile phone industry, the cells or lines can number in the thousands, and qualifying each of them with statistical confidence requires very large samples, which add to product and manufacturing cost and generate substantial waste if the parts are not intended to be shipped to customers. To solve this, we propose a two-step statistical approach. We start with a small sample size and then objectively evaluate whether the process needs additional samples or not. For example, if a process is producing bad parts and we see those bad parts early, there is a high chance that the process will not meet the desired yield, and there is no point in adding more samples. We used this reasoning to develop a two-step binomial testing approach. Further, we also show through results that we can achieve an 18-25% reduction in samples while keeping the same statistical confidence.
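A hedged sketch of a two-step binomial qualification rule in this spirit (the 98% target yield, 5% significance level and sample sizes below are assumptions, not values from the paper): if the small first sample already gives strong evidence that the line cannot meet the target yield, qualification stops early and the second sample is never built.

from scipy.stats import binom

def two_step_qualification(defects1, n1, n2, target_yield=0.98, alpha=0.05):
    """Step 1: reject early if the first sample is inconsistent with the
    target yield; otherwise proceed to step 2 with n2 additional parts."""
    p_fail = 1 - target_yield
    # P(observing >= defects1 failures | line truly meets the target yield)
    p_value = 1 - binom.cdf(defects1 - 1, n1, p_fail)
    if p_value < alpha:
        return "reject after step 1 (no extra samples needed)"
    return f"inconclusive: run step 2 with {n2} more parts"

print(two_step_qualification(defects1=4, n1=50, n2=150))
print(two_step_qualification(defects1=0, n1=50, n2=150))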

Keywords: statistics, data science, manufacturing process qualification, production planning

Procedia PDF Downloads 96
3945 An Approach Based on Statistics and Multi-Resolution Representation to Classify Mammograms

Authors: Nebi Gedik

Abstract:

One of the significant and continual public health problems in the world is breast cancer. Early detection is very important to fight the disease, and mammography has been one of the most common and reliable methods to detect the disease in its early stages. However, it is a difficult task, and computer-aided diagnosis (CAD) systems are needed to assist radiologists in providing both accurate and uniform evaluation of masses in mammograms. In this study, a multi-resolution statistical method for classifying digitized mammograms as normal or abnormal is used to construct a CAD system. The mammogram images are represented by the wave atom transform, and this representation is organized into certain groups of coefficients that are treated independently. The CAD system is designed by calculating statistical features from each group of coefficients. The classification is performed using a support vector machine (SVM).
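A sketch of the classification stage only (the wave atom transform itself is not shown, and the coefficient groups below are simulated stand-ins): simple statistics are computed per coefficient group and fed to an SVM.

import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score

def group_features(coeff_groups):
    """coeff_groups: list of 1-D arrays, one per group of transform coefficients."""
    feats = []
    for c in coeff_groups:
        feats += [c.mean(), c.std(), np.abs(c).mean(), (c ** 2).sum()]  # simple stats
    return np.array(feats)

rng = np.random.default_rng(3)
X = np.array([group_features([rng.normal(scale=1 + 0.3 * label, size=64)
                              for _ in range(6)])
              for label in (0, 1) for _ in range(50)])
y = np.repeat([0, 1], 50)                    # 0 = normal, 1 = abnormal (synthetic)
print(cross_val_score(SVC(kernel="rbf"), X, y, cv=5).mean())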

Keywords: wave atom transform, statistical features, multi-resolution representation, mammogram

Procedia PDF Downloads 222
3944 The Metacognition Levels of Students: A Research at the School of Physical Education and Sports at Anadolu University

Authors: Dilek Yalız Solmaz

Abstract:

Meta-cognition is an important factor for educating conscious individuals who are aware of their cognitive processes. With this respect, the purpose of this article is to find out the perceived metacognition level of Physical Education and Sports School students at Anadolu University and to identify whether metacognition levels display significant differences in terms of various variables. 416 Anadolu University Physical Education and Sports School students formed the research population. The Meta-Cognitions Questionnaire, developed by Cartwright-Hatton and Wells and later shortened to the 30-item form (MCQ-30), was used. The MCQ-30, which was adapted into Turkish by Tosun and Irak, is a four-point agreement scale. In the data analysis, the arithmetic mean, standard deviation, t-test and ANOVA were used. There is no statistical difference between the mean scores of female and male students for uncontrollableness and danger, cognitive awareness, cognitive confidence and positive beliefs, but there is a statistical difference between their mean scores for the need to control thinking. There is no statistical difference by department between the mean scores for uncontrollableness and danger, cognitive awareness, cognitive confidence, need to control thinking and positive beliefs. By grade level, there is no statistical difference between the mean scores for positive beliefs, cognitive confidence and need to control thinking, whereas there is a statistical difference between the mean scores for uncontrollableness and danger and cognitive awareness.

Keywords: meta cognition, physical education, sports school students, thinking

Procedia PDF Downloads 383
3943 Exploring the Spatial Characteristics of Mortality Map: A Statistical Area Perspective

Authors: Jung-Hong Hong, Jing-Cen Yang, Cai-Yu Ou

Abstract:

The analysis of geographic inequality heavily relies on the use of location-enabled statistical data and quantitative measures to present the spatial patterns of the selected phenomena and analyze their differences. To protect the privacy of individual instances and to link to administrative units, point-based datasets are spatially aggregated into area-based statistical datasets, where only the overall status for the selected levels of spatial units is used for decision making. The partition of the spatial units thus has a dominant influence on the outcomes of the analyzed results, well known as the Modifiable Areal Unit Problem (MAUP). A new spatial reference framework, the Taiwan Geographical Statistical Classification (TGSC), was recently introduced in Taiwan based on the spatial partition principle of homogeneity in the number of population and households. Compared to the outcomes of traditional township units, TGSC provides additional levels of spatial units with finer granularity for presenting spatial phenomena and enables domain experts to select the appropriate dissemination level for publishing statistical data. This paper compares the results of respectively using TGSC and township units on mortality data and examines the spatial characteristics of their outcomes. For the mortality data of Taitung County between January 1st, 2008 and December 31st, 2010, the all-cause age-standardized death rate (ASDR) ranges from 571 to 1757 per 100,000 persons, whereas the 2nd dissemination area (TGSC) shows greater variation, ranging from 0 to 2222 per 100,000. The finer granularity of the TGSC spatial units clearly provides better outcomes for identifying and evaluating geographic inequality and can be further analyzed with statistical measures from other perspectives (e.g., population, area, environment). The management and analysis of the statistical data referring to the TGSC in this research is strongly supported by the use of Geographic Information System (GIS) technology. An integrated workflow is developed that consists of the tasks of processing death certificates, geocoding street addresses, quality assurance of geocoded results, automatic calculation of statistical measures, standardized encoding of measures, and geo-visualization of statistical outcomes. This paper also introduces a set of auxiliary measures from a geographic distribution perspective to further examine the hidden spatial characteristics of the mortality data and justify the analyzed results. With a common statistical area framework like TGSC, the preliminary results demonstrate promising potential for developing a web-based statistical service that can effectively access domain statistical data and present the analyzed outcomes in meaningful ways to avoid wrong decision making.
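The central quantity mapped here, the age-standardized death rate, is a weighted sum of age-specific rates. A minimal sketch of direct standardization (the age bands, counts and standard population below are illustrative, not the Taitung data):

def asdr(deaths, population, std_pop):
    """All inputs are lists aligned by age band; result is per 100,000 persons."""
    rates = [d / p for d, p in zip(deaths, population)]   # age-specific death rates
    w = [s / sum(std_pop) for s in std_pop]               # standard population weights
    return 100_000 * sum(r * wi for r, wi in zip(rates, w))

deaths = [2, 5, 40, 120]                   # deaths per age band in one statistical area
population = [1200, 1500, 900, 400]        # resident population of the same area
std_pop = [20000, 30000, 30000, 20000]     # WHO-style standard population
print(round(asdr(deaths, population, std_pop)))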

Keywords: mortality map, spatial patterns, statistical area, variation

Procedia PDF Downloads 258
3942 Direct Translation vs. Pivot Language Translation for Persian-Spanish Low-Resourced Statistical Machine Translation System

Authors: Benyamin Ahmadnia, Javier Serrano

Abstract:

In this paper we compare two different approaches for translating from Persian to Spanish, a language pair with a scarce parallel corpus. The first approach involves direct transfer using a statistical machine translation system that is available for this language pair. The second approach involves translation through English, as a pivot language, which has more translation resources and more advanced translation systems available. The results show that it is possible to achieve better translation quality using English as a pivot language: the pivot approach outperforms direct translation from Persian to Spanish. Our best result is the pivot system, which scores higher than direct translation by 1.12 BLEU points.
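The comparison itself comes down to scoring both systems' outputs against the same Spanish references with corpus-level BLEU; a toy sketch using NLTK (tokenized toy sentences, not the paper's test set):

from nltk.translate.bleu_score import corpus_bleu, SmoothingFunction

# one (or several) reference translation(s) per source sentence
references = [[["el", "gato", "está", "en", "la", "mesa"]],
              [["me", "gusta", "el", "café"]]]
direct_hyp = [["el", "gato", "esta", "sobre", "mesa"],
              ["me", "gusta", "café"]]
pivot_hyp = [["el", "gato", "está", "en", "la", "mesa"],
             ["me", "gusta", "el", "café"]]

smooth = SmoothingFunction().method1       # avoids zero scores on short sentences
print("direct BLEU:", corpus_bleu(references, direct_hyp, smoothing_function=smooth))
print("pivot  BLEU:", corpus_bleu(references, pivot_hyp, smoothing_function=smooth))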

Keywords: statistical machine translation, direct translation approach, pivot language translation approach, parallel corpus

Procedia PDF Downloads 487
3941 Analysis of the Engineering Judgement Influence on the Selection of Geotechnical Parameters Characteristic Values

Authors: K. Ivandic, F. Dodigovic, D. Stuhec, S. Strelec

Abstract:

A characteristic value of a geotechnical parameter results from an engineering assessment. Its selection has to be based on technical principles and standards of engineering practice. It has been shown that the results of the engineering assessments of different authors for the same problem and input data are significantly dispersed. A survey was conducted in which participants had to estimate the force that causes a 10 cm displacement at the top of an axially compressed in-situ pile. Fifty experts from all over the world took part in it. The lowest estimated force value was 42% and the highest was 133% of the force measured in the corresponding static pile load test. These extreme values result in significantly different technical solutions to the same engineering task. When selecting a characteristic value of a geotechnical parameter, the influence of the engineering assessment can be reduced by using statistical methods. An informative annex of Eurocode 1 prescribes the method of selecting the characteristic values of material properties. This is followed by Eurocode 7, with certain specificities linked to selecting characteristic values of geotechnical parameters. The paper shows the procedure of selecting characteristic values of a geotechnical parameter by using a statistical method with different initial conditions. The aim of the paper is to quantify the engineering assessment in the example of determining a characteristic value of a specific geotechnical parameter. It is assumed that this assessment is a random variable and that its statistical features will be determined. For this purpose, a survey was conducted among relevant experts from the field of geotechnical engineering. Finally, the results of the survey and of the application of the statistical method were compared.
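A common statistical route to a characteristic value, in the spirit of the Eurocode annexes mentioned here, is a cautious (5% fractile) estimate of the mean from n test results using the Student-t distribution. The sketch below uses invented friction-angle data, not the survey responses from the paper:

import numpy as np
from scipy import stats

def characteristic_value(samples, fractile=0.05):
    x = np.asarray(samples, float)
    n, m, s = len(x), x.mean(), x.std(ddof=1)
    t = stats.t.ppf(1 - fractile, df=n - 1)
    return m - t * s / np.sqrt(n)          # cautious estimate of the mean value

friction_angle = [29.5, 31.0, 30.2, 28.8, 32.1, 30.7]   # degrees, illustrative only
print(round(characteristic_value(friction_angle), 1))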

Keywords: characteristic values, engineering judgement, Eurocode 7, statistical methods

Procedia PDF Downloads 296
3940 Radar Signal Detection Using Neural Networks in Log-Normal Clutter for Multiple Targets Situations

Authors: Boudemagh Naime

Abstract:

Automatic radar detection requires methods of adapting to variations in the background clutter in order to control the false alarm rate. The problem becomes more complicated in a non-Gaussian environment. In fact, the conventional approach in real-time applications requires complex statistical modeling and many computational operations. To overcome these constraints, we propose another approach based on an artificial neural network (ANN-CMLD-CFAR) using a Back Propagation (BP) training algorithm. The considered environment follows a log-normal distribution in the presence of multiple Rayleigh targets. To evaluate the performance of the considered detector, several situations, such as the scale parameter and the number of interfering targets, have been investigated. The simulation results show that the ANN-CMLD-CFAR processor outperforms the conventional statistical one.
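For context, the classical censored-mean-level-detector (CMLD) CFAR baseline that the neural detector is compared against censors the largest reference cells (possible interfering targets), estimates the clutter level from the rest, and thresholds the cell under test. A simplified sketch (the censoring depth and scale factor below are assumptions):

import numpy as np

def cmld_cfar(cell_under_test, reference_cells, censor=2, scale=5.0):
    """Censor the largest reference cells, average the rest to estimate the
    clutter level, and compare the test cell against the scaled estimate."""
    kept = np.sort(reference_cells)[:-censor] if censor else np.sort(reference_cells)
    threshold = scale * kept.mean()
    return cell_under_test > threshold, threshold

rng = np.random.default_rng(4)
clutter = rng.lognormal(mean=0.0, sigma=0.7, size=16)   # log-normal clutter cells
print(cmld_cfar(cell_under_test=12.0, reference_cells=clutter))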

Keywords: radar detection, ANN-CMLD-CFAR, log-normal clutter, statistical modeling

Procedia PDF Downloads 364
3939 Statistical Randomness Testing of Some Second Round Candidate Algorithms of CAESAR Competition

Authors: Fatih Sulak, Betül A. Özdemir, Beyza Bozdemir

Abstract:

In order to improve symmetric key research, several competitions have been arranged by organizations such as the National Institute of Standards and Technology (NIST) and the International Association for Cryptologic Research (IACR). In recent years, the importance of authenticated encryption has rapidly increased because of the necessity of simultaneously providing integrity, confidentiality and authenticity. Therefore, in January 2013, IACR announced the Competition for Authenticated Encryption: Security, Applicability, and Robustness (CAESAR Competition), which will select secure and efficient algorithms for authenticated encryption. Cryptographic algorithms are expected to behave like random mappings; hence, it is important to apply statistical randomness tests to the outputs of the algorithms. In this work, the statistical randomness tests in the NIST Test Suite and other recently designed randomness tests are applied to six second-round algorithms of the CAESAR Competition. It is observed that AEGIS achieves randomness after 3 rounds, the Ascon permutation function achieves randomness after 1 round, the Joltik encryption function achieves randomness after 9 rounds, the Morus state update function achieves randomness after 3 rounds, Pi-Cipher achieves randomness after 1 round, and Tiaoxin achieves randomness after 1 round.
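As a flavour of what such a randomness test does, the simplest NIST-style check is the monobit frequency test: count the balance of ones and zeros in the output bits and convert it to a p-value (the study applies the full NIST suite and additional tests; the sample bits below are arbitrary):

import math

def monobit_p_value(bits):
    """bits: iterable of 0/1 output bits from the cipher under test."""
    s = sum(1 if b else -1 for b in bits)
    s_obs = abs(s) / math.sqrt(len(bits))
    return math.erfc(s_obs / math.sqrt(2))    # p >= 0.01 is usually taken as 'random'

sample = [int(b) for b in bin(0x9E3779B97F4A7C15)[2:].zfill(64)]
print(monobit_p_value(sample))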

Keywords: authenticated encryption, CAESAR competition, NIST test suite, statistical randomness tests

Procedia PDF Downloads 315
3938 Statistical Characteristics of Distribution of Radiation-Induced Defects under Random Generation

Authors: P. Selyshchev

Abstract:

We consider fluctuations of defect density, taking into account defect interaction. A stochastic field of the displacement generation rate gives a random defect distribution. We determine the statistical characteristics (mean and dispersion) of the random field of the point defect distribution as a function of the defect generation parameters, the temperature and the properties of the irradiated crystal.

Keywords: irradiation, primary defects, interaction, fluctuations

Procedia PDF Downloads 343
3937 Advances in Artificial Intelligence Using Speech Recognition

Authors: Khaled M. Alhawiti

Abstract:

This research study aims to present a retrospective study of speech recognition systems and artificial intelligence. Speech recognition has become one of the most widely used technologies, as it offers a great opportunity to interact and communicate with automated machines. Precisely, it can be affirmed that speech recognition facilitates its users and helps them to perform their daily routine tasks in a more convenient and effective manner. This research intends to present an illustration of recent technological advancements associated with artificial intelligence. Recent research has revealed that the decoding of speech is the foremost issue in speech recognition. In order to overcome these issues, different statistical models were developed by researchers. Some of the most prominent statistical models include the acoustic model (AM), language model (LM), lexicon model, and hidden Markov models (HMM). The research will help in understanding all of these statistical models of speech recognition. Researchers have also formulated different decoding methods, which are being utilized for realistic decoding tasks and constrained artificial languages. These decoding methods include pattern recognition, acoustic phonetic, and artificial intelligence approaches. It has been recognized that artificial intelligence is the most efficient and reliable of the methods being used in speech recognition.

Keywords: speech recognition, acoustic phonetic, artificial intelligence, hidden Markov models (HMM), statistical models of speech recognition, human machine performance

Procedia PDF Downloads 478
3936 The Impact of Innovations in Human Resource Practices, Innovation Capabilities and Competitive Advantage on Company Performance

Authors: Bita Kharazi

Abstract:

The purpose of this research was to investigate the impact of innovations in human resource practices, innovation capabilities, and competitive advantage on company performance. This research was applied in terms of purpose and, in terms of method, was descriptive correlational research. The statistical population of this research was all the employees of Zar Industrial and Research Group. Convenience (availability) sampling was used, and Cochran's formula was used to determine the statistical sample size. A standard questionnaire was used to collect information, and SPSS software and simultaneous regression tests were used to analyze the data. Based on the findings of the present research, it was found that innovations in human resource practices, innovation capability, and competitive advantage have a significant impact on the company's performance.

Keywords: human resource management, innovation, competitive advantage, company performance

Procedia PDF Downloads 18
3935 Statistical Feature Extraction Method for Wood Species Recognition System

Authors: Mohd Iz'aan Paiz Bin Zamri, Anis Salwa Mohd Khairuddin, Norrima Mokhtar, Rubiyah Yusof

Abstract:

Effective statistical feature extraction and classification are important in image-based automatic inspection and analysis. An automatic wood species recognition system is designed to perform wood inspection at customs checkpoints to avoid mislabeling of timber, which results in loss of income to the timber industry. The system focuses on analyzing the statistical pore properties of wood images. This paper proposes a fuzzy-based feature extractor which mimics the experts' knowledge of wood texture to extract the properties of pore distribution from the wood surface texture. The proposed feature extractor consists of two steps, namely pore extraction and fuzzy pore management. The total number of statistical features extracted from each wood image is 38. Then, a backpropagation neural network is used to classify the wood species based on the statistical features. A comprehensive set of experiments on a database composed of 5200 macroscopic images from 52 tropical wood species was used to evaluate the performance of the proposed feature extractor. The advantage of the proposed feature extraction technique is that it mimics the experts' interpretation of wood texture, which allows human involvement when analyzing the wood texture. Experimental results show the efficiency of the proposed method.

Keywords: classification, feature extraction, fuzzy, inspection system, image analysis, macroscopic images

Procedia PDF Downloads 426
3934 Research on Transmission Parameters Determination Method Based on Dynamic Characteristic Analysis

Authors: Baoshan Huang, Fanbiao Bao, Bing Li, Lianghua Zeng, Yi Zheng

Abstract:

A parameter control strategy based on statistical characteristics can guide the choice of transmission ratio for an automobile transmission. According to the differences among the transmission gears, the number and spacing of the gears can be determined. The transmission ratio distribution of the transmission needs to satisfy a certain distribution law. Based on the statistical characteristics of the driving parameters, the shift control strategy of the vehicle is analyzed. From the above analysis, a CVT shift schedule adjustment algorithm based on statistical characteristic parameters can be derived: by adjusting its parameters according to a certain algorithm, the target operating point can be moved between the best-efficiency curve and the dynamic curve, thereby altering the vehicle characteristics. Based on the dynamic characteristics and the practical application of the vehicle, this paper presents a scheme for setting the transmission ratio.

Keywords: vehicle dynamics, transmission ratio, transmission parameters, statistical characteristics

Procedia PDF Downloads 404
3933 Reduction in Hot Metal Silicon through Statistical Analysis at G-Blast Furnace, Tata Steel Jamshedpur

Authors: Shoumodip Roy, Ankit Singhania, Santanu Mallick, Abhiram Jha, M. K. Agarwal, R. V. Ramna, Uttam Singh

Abstract:

The quality of hot metal at any blast furnace is judged by its silicon content. Lower hot metal silicon not only enhances process efficiency at steel melting shops but also reduces hot metal costs. The hot metal produced at G Blast Furnace, Tata Steel Jamshedpur, has a significantly higher Si content than that of benchmark blast furnaces. The higher hot metal Si content is mainly due to raw material quality inferior to that used in benchmark blast furnaces. With minimum control over raw material quality, the only option left to control hot metal Si is optimizing the furnace parameters. Therefore, in order to identify the levers to reduce hot metal Si, data mining was carried out and multiple regression models were developed. The statistical analysis revealed that slag B3 {(CaO+MgO)/SiO2}, slag alumina and hot metal temperature are the key controllable parameters affecting hot metal silicon. Contour plots were used to determine the optimum ranges of the levels identified through statistical analysis. A trial plan was formulated to operate the relevant parameters at G Blast Furnace in the identified ranges to reduce hot metal silicon. This paper details the process followed and the subsequent 15% reduction in hot metal silicon at G Blast Furnace.

Keywords: blast furnace, optimization, silicon, statistical tools

Procedia PDF Downloads 223
3932 A Proposed Algorithm for Obtaining the Map of Subscribers’ Density Distribution for a Mobile Wireless Communication Network

Authors: C. Temaneh-Nyah, F. A. Phiri, D. Karegeya

Abstract:

This paper presents an algorithm for obtaining the map of subscribers' density distribution for a mobile wireless communication network based on actual subscriber traffic data obtained from the base stations. This is useful for the statistical characterization of the mobile wireless network.

Keywords: electromagnetic compatibility, statistical analysis, simulation of communication network, subscriber density

Procedia PDF Downloads 309
3931 Optimal Portfolio of Multi-service Provision based on Stochastic Model Predictive Control

Authors: Yifu Ding, Vijay Avinash, Malcolm McCulloch

Abstract:

With the proliferation of decentralized energy systems, the UK power system allows small-scale entities such as microgrids (MGs) to tender multiple energy services, including energy arbitrage and frequency responses (FRs). However, their operation requires balancing uncertain renewable generation and loads in real time and must fulfill the provision requirements of contracted services continuously during the agreed time window; otherwise, the under-delivered provision is penalized. To hedge against risks due to uncertainties and maximize the economic benefits, we propose a stochastic model predictive control (SMPC) framework to optimize operation for multi-service provision. Distinguished from previous works, we include a detailed economic degradation model of the lithium-ion battery to quantify the costs of different service provisions, as well as to accurately describe the changing dynamics of the battery. Considering a set of load and generation scenarios and battery aging, we formulate a risk-averse cost function using conditional value at risk (CVaR), which aims to achieve the maximum expected net revenue while avoiding severe losses. The framework is applied to a case study of a PV-battery grid-tied microgrid in the UK with real-life data. To highlight its performance, the framework is compared with the case without the degradation model and with the deterministic formulation.
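The risk measure used here, CVaR, is simply the expected cost over the worst (1 - beta) fraction of sampled scenarios. A minimal sketch (the beta level and the gamma-distributed scenario costs are illustrative, not the microgrid case-study data):

import numpy as np

def cvar(costs, beta=0.95):
    costs = np.sort(np.asarray(costs, float))
    var = np.quantile(costs, beta)             # value at risk at level beta
    tail = costs[costs >= var]                 # the worst (1 - beta) scenarios
    return var, tail.mean()                    # (VaR, CVaR)

scenario_costs = np.random.default_rng(5).gamma(shape=2.0, scale=50.0, size=1000)
print(cvar(scenario_costs))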

Keywords: model predictive control (MPC), battery degradation, frequency response, microgrids

Procedia PDF Downloads 124
3930 Statistical Analysis of Cables in Long-Span Cable-Stayed Bridges

Authors: Ceshi Sun, Yueyu Zhao, Yaobing Zhao, Zhiqiang Wang, Jian Peng, Pengxin Guo

Abstract:

With the rapid development of transportation, there are more than 100 cable-stayed bridges with main spans larger than 300 m in China. In order to ascertain the statistical relationships among the design parameters of stay cables and their distribution characteristics, 1500 cables were selected from 25 practical long-span cable-stayed bridges. A new relationship between the first-order frequency and the length of the cable was found by curve fitting. Then, based on this relationship, other interesting relationships were deduced. Several probability density functions (PDFs) were used to investigate the distributions of the first-order frequency, the stress level and the Irvine parameter. It was found that these parameters obey the lognormal distribution, the Weibull distribution and the generalized Pareto distribution, respectively. Scatter diagrams of the three parameters were plotted, and their 95% confidence intervals were also investigated.
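A sketch of the distribution-fitting step: fit the candidate PDFs named here to a sample of a cable parameter and compare the fits with a Kolmogorov-Smirnov test (the lognormal sample below stands in for the real first-order frequency data):

import numpy as np
from scipy import stats

rng = np.random.default_rng(6)
freq = rng.lognormal(mean=0.0, sigma=0.4, size=1500)   # stand-in for 1st-order frequencies

for name, dist in [("lognormal", stats.lognorm),
                   ("weibull", stats.weibull_min),
                   ("gen. pareto", stats.genpareto)]:
    params = dist.fit(freq)                            # maximum-likelihood fit
    ks = stats.kstest(freq, dist.cdf, args=params)     # goodness-of-fit check
    print(f"{name}: KS statistic={ks.statistic:.3f}, p={ks.pvalue:.3f}")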

Keywords: cable, cable-stayed bridge, long-span, statistical analysis

Procedia PDF Downloads 634
3929 Investigation of the Main Trends of Tourist Expenses in Georgia

Authors: Nino Abesadze, Marine Mindorashvili, Nino Paresashvili

Abstract:

The main purpose of the article is to make a complex statistical analysis of the tourist expenses of foreign visitors. We used a mixed selection technique combining rules of random and proportional selection. The computer software SPSS was used to compute the statistical data for the corresponding analysis. The corresponding methodology of tourism statistics was implemented according to international standards. Important information was collected and grouped from the major Georgian airports. Techniques of statistical observation were prepared. A representative population of foreign visitors and a rule for the selection of respondents were determined. There is a growth trend in tourist numbers, and the share of tourists from post-Soviet countries constantly increases. The level of satisfaction with tourist facilities and quality of service has grown, but there is still a problem of disparity between the quality of service and prices. The structure of foreign visitors' tourist expenses is diverse, and the competitiveness of the tourist products of Georgian tourist companies is higher.

Keywords: tourist, expenses, methods, statistics, analysis

Procedia PDF Downloads 337
3928 Immobilization of Lipase Enzyme by Low Cost Material: A Statistical Approach

Authors: Md. Z. Alam, Devi R. Asih, Md. N. Salleh

Abstract:

Immobilization of lipase enzyme produced from palm oil mill effluent (POME) on activated carbon (AC), selected from among low-cost support materials, was optimized. The results indicated that immobilization of 94% was achieved with AC as the most suitable support material. A sequential optimization strategy based on a statistical experimental design, including the one-factor-at-a-time (OFAT) method, was used to determine the equilibrium time. Three components influencing lipase immobilization were optimized by response surface methodology (RSM) based on the face-centered central composite design (FCCCD). Based on the statistical analysis of the results, the optimum enzyme concentration loading, agitation rate and activated carbon dosage were found to be 30 U/ml, 300 rpm and 8 g/L, respectively, with a maximum immobilization activity of 3732.9 U/g-AC after 2 hrs of immobilization. Analysis of variance (ANOVA) showed a high regression coefficient (R2) of 0.999, which indicated a satisfactory fit of the model with the experimental data. The parameters were statistically significant at p<0.05.

Keywords: activated carbon, POME based lipase, immobilization, adsorption

Procedia PDF Downloads 243
3927 In- and Out-Of-Sample Performance of Non-Symmetric Models in International Price Differential Forecasting in a Commodity Country Framework

Authors: Nicola Rubino

Abstract:

This paper presents an analysis of the nominal exchange rate movements of a group of commodity-exporting countries relative to the US dollar. Using a series of Unrestricted Self-Exciting Threshold Autoregressive (SETAR) models, we model and evaluate sixteen national CPI price differentials relative to the US CPI. Out-of-sample forecast accuracy is evaluated through mean absolute error measures computed on 253 months of rolling-window forecasts and extended to three additional models, namely a logistic smooth transition regression (LSTAR), an additive nonlinear autoregressive model (AAR) and a simple linear neural network model (NNET). Our preliminary results confirm the presence of some form of TAR nonlinearity in the majority of the countries analyzed, with relatively higher goodness of fit, with respect to the linear AR(1) benchmark, in five of the sixteen countries considered. Although no model appears to statistically prevail over the others, our final out-of-sample forecast exercise shows that SETAR models tend to have rather poor relative forecasting performance, especially when compared to alternative nonlinear specifications. Finally, by analyzing the implied half-lives of the estimated coefficients, our results confirm the presence, in the spirit of arbitrage-band adjustment, of band convergence with inner unit root behaviour in five of the sixteen countries analyzed.

Keywords: transition regression model, real exchange rate, nonlinearities, price differentials, PPP, commodity points

Procedia PDF Downloads 278
3926 Simulation of Government Management Model to Increase Financial Productivity System Using Govpilot

Authors: Arezou Javadi

Abstract:

The use of algorithmic models dependent on software calculations and the simulation of new government management approaches with the help of specialized software has recently increased the productivity and efficiency of the government management system. This has caused the management approach to change from the old patch-and-fix model, which has low efficiency and limited usefulness, to a more capable, higher-efficiency model called the partnership-with-residents model. Using the GovPilot software, the relationship between people in a system and the government was examined. The method of two-tailed interaction was the outsourcing of a goal in a system, which is formed in the order of goals, qualified executive people, an optimal executive model and, finally, a summary of additional activities at the different statistical levels. The results showed that the participation of people in a financial implementation system with a statistical potential of P≥5% caused a significant increase in investment and initial capital in the government system, with maximum project implementation in a smart government.

Keywords: machine learning, financial income, statistical potential, govpilot

Procedia PDF Downloads 88
3924 Statistical Channel Modeling for Multiple-Input-Multiple-Output Communication System

Authors: M. I. Youssef, A. E. Emam, M. Abd Elghany

Abstract:

The performance of wireless communication systems is affected mainly by the environment of the associated channel, which is characterized by dynamic and unpredictable behavior. In this paper, different statistical earth-satellite channel models are studied, with emphasis on two main models: first, the Rice-lognormal model, due to its representation of the environment, including the shadowing and multipath components that affect the propagated signal along its path, and second, a three-state model that takes into account different fading conditions (clear area, moderate shadowing and heavy shadowing). The provided models are based on AWGN, Rician, Rayleigh, and log-normal distributions, whose probability density functions (PDFs) are presented. The transmission system bit error rate (BER), peak-to-average power ratio (PAPR), and channel capacity versus the fading models are measured and analyzed. These simulations are implemented using the MATLAB tool, and the results show the performance of the transmission system over the different channel models.
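As an illustration of how the envelope of a Rice-lognormal channel of this kind can be sampled for simulation (the Rice factor and shadowing spread below are assumed values, and the sketch is in Python rather than the MATLAB tool used in the paper), a Rician fast-fading component can be combined with slow log-normal shadowing:

import numpy as np

rng = np.random.default_rng(7)
n = 100_000
K = 5.0                                    # Rice factor: direct-to-diffuse power ratio
s = np.sqrt(K / (K + 1))                   # line-of-sight amplitude
sigma = np.sqrt(1 / (2 * (K + 1)))         # per-component diffuse standard deviation
rician = np.abs(s + sigma * (rng.normal(size=n) + 1j * rng.normal(size=n)))
shadow = rng.lognormal(mean=0.0, sigma=0.3, size=n)   # slow log-normal shadowing
envelope = rician * shadow                 # combined Rice-lognormal envelope
print(envelope.mean(), envelope.std())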

Keywords: fading channels, MIMO communication, RNS scheme, statistical modeling

Procedia PDF Downloads 149
3923 Evaluation of Egg Quality Parameters in the Isa Brown Line in Intensive Production Systems in the Ocaña Region, Norte de Santander

Authors: Meza-Quintero Myriam, Lobo Torrado Katty Andrea, Sanchez Picon Yesenia, Hurtado-Lugo Naudin

Abstract:

The objective of the study was to evaluate the internal and external quality of the egg in three production housing systems for laying birds of the Isa Brown line: floor, cage, and grazing, in the laying period between weeks 35 and 41. 135 hens were used, distributed in 3 treatments of 45 birds each (the replicates were the seven weeks of the trial). The feeding treatment supplied in the floor and cage systems contained 114 g/bird/day; for the grazing system, 14 grams less concentrate was provided. Nine eggs were collected to be studied and analyzed in the animal nutrition laboratory (3 eggs per housing system). A random statistical model was implemented; for the statistical analysis of the data, the statistical software IBM® Statistical Products and Services Solution (SPSS) version 2.3 was used. The evaluation and follow-up instruments were a vernier caliper for measurements in millimeters, a YolkFan™ 16 from Roche DSM for the evaluation of egg yolk pigmentation, a digital scale for measurements in grams, a micrometer for measurements in millimeters, and laboratory evaluation of dry matter, ashes, and ethereal extract. The results gave P-values > 0.05 for egg size (0.04 ± 3.55), shell thickness (0.46 ± 3.55), albumen weight (0.18 ± 3.55), albumen height (0.38 ± 3.55), yolk weight (0.64 ± 3.55), yolk height (0.54 ± 3.55), and yolk pigmentation (1.23 ± 3.55). It was concluded that the hens in the three production systems, floor, cage, and grazing, did not show statistically significant differences in the internal and external quality of the egg for the parameters studied.

Keywords: biological, territories, genetic resource, egg

Procedia PDF Downloads 81