Search results for: probability weighted moment estimation

2534 The Relationship of Brand Value and Perceived Brand Quality in the Television Business: A Case Study of Television Viewers in Bangkok

Authors: Natnicha Hasoontree

Abstract:

The purpose of this paper was to study the relationship between brand value and perceived brand quality among television viewers in Bangkok towards the television business in Thailand. The population comprised television viewers in Bangkok, Thailand. A probability sampling technique was used to obtain a sample group of 500 respondents, and the Taro Yamane technique was applied to determine an appropriate sample size. A five-point Likert scale questionnaire was designed specifically to investigate brand value and perceived brand quality from the perspective of television viewers in Bangkok. The findings implied that consumers in Bangkok attached high importance to the brand equity of television companies, comprising brand ability, brand reputation, brand credibility, and business ethics. Perceived brand quality was ranked highly in all aspects.
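
For readers unfamiliar with the sample-size step mentioned above, a minimal sketch of Taro Yamane's formula follows; the population size and margin of error are assumed values, not figures reported in the study.

```python
# Illustrative sketch (not from the paper): Taro Yamane's sample size formula,
# n = N / (1 + N * e^2), with an assumed population N and margin of error e.

def yamane_sample_size(population: int, margin_of_error: float) -> int:
    """Return the minimum sample size under Yamane's formula."""
    return round(population / (1 + population * margin_of_error ** 2))

# Assumed figures for illustration only; the paper does not report N or e.
print(yamane_sample_size(population=5_000_000, margin_of_error=0.045))  # ~494
```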

Keywords: brand value, perceived brand quality, television business, television viewers

Procedia PDF Downloads 422
2533 Development of a Model Based on Wavelets and Matrices for the Treatment of Weakly Singular Partial Integro-Differential Equations

Authors: Somveer Singh, Vineet Kumar Singh

Abstract:

We present a new model based on viscoelasticity for non-Newtonian fluids. We use a matrix-formulated algorithm to approximate solutions of a class of partial integro-differential equations with given initial and boundary conditions. Some numerical results are presented to simplify the application of the operational matrix formulation and to reduce the computational cost. Convergence analysis, error estimation, and numerical stability of the method are also investigated. Finally, some test examples are given to demonstrate the accuracy and efficiency of the proposed method.
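
As background to the operational-matrix approach, a minimal sketch of the Legendre wavelet basis functions it builds on is given below; the standard definition of psi_{n,m}(t) is used, and the paper's actual operational matrices are not reproduced.

```python
# Minimal sketch of the Legendre wavelet basis psi_{n,m}(t) on [0, 1), the building
# block of the operational-matrix approach; parameter names are illustrative.
import numpy as np
from scipy.special import eval_legendre

def legendre_wavelet(n, m, k, t):
    """psi_{n,m}(t) at dilation level k: supported on [(n-1)/2^(k-1), n/2^(k-1))."""
    t = np.asarray(t, dtype=float)
    lower, upper = (n - 1) / 2 ** (k - 1), n / 2 ** (k - 1)
    support = (t >= lower) & (t < upper)
    values = np.sqrt(m + 0.5) * 2 ** (k / 2) * eval_legendre(m, 2 ** k * t - 2 * n + 1)
    return np.where(support, values, 0.0)

t = np.linspace(0, 1, 5, endpoint=False)
print(legendre_wavelet(n=1, m=2, k=2, t=t))
```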

Keywords: Legendre Wavelets, operational matrices, partial integro-differential equation, viscoelasticity

Procedia PDF Downloads 313
2532 Reliability Analysis of Dam under Quicksand Condition

Authors: Manthan Patel, Vinit Ahlawat, Anshh Singh Claire, Pijush Samui

Abstract:

This paper focuses on the analysis of the quicksand condition for a dam foundation. The quicksand condition occurs in cohesionless soil when the effective stress of the soil becomes zero. In a dam, the saturated sediment may appear quite solid until a sudden change in pressure or a shock initiates liquefaction. This causes the sand to form a suspension and lose strength, resulting in failure of the dam. A soil profile shows different properties at different points, and the values obtained are uncertain; thus, reliability analysis is performed. Reliability is defined as the probability of safety of a system in a given environment and loading condition, and it is assessed through the reliability index. The reliability analysis of the dam under the quicksand condition is carried out using Gaussian Process Regression (GPR). The reliability index and factor of safety relating to liquefaction of the soil are analysed using GPR. The results of the GPR-based reliability analysis are compared with those of the conventional method, and it is demonstrated that applying GPR to the probabilistic analysis reduces computational time and effort.
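
A hedged sketch of how GPR can serve as a surrogate for reliability analysis is given below; the training data, kernel choice, and soil-parameter distribution are illustrative assumptions, not values from the dam study.

```python
# Hypothetical sketch of surrogate-based reliability analysis with Gaussian Process
# Regression: fit a GPR to a few (soil parameter, factor of safety) samples, then use
# Monte Carlo on the surrogate to estimate failure probability and reliability index.
# All data below are synthetic; they do not come from the dam study.
import numpy as np
from scipy.stats import norm
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

rng = np.random.default_rng(0)
# Synthetic training set: effective friction angle (deg) -> factor of safety against quicksand.
phi_train = np.array([[24.0], [27.0], [30.0], [33.0], [36.0]])
fos_train = np.array([0.85, 0.95, 1.10, 1.25, 1.45])

gpr = GaussianProcessRegressor(kernel=RBF(length_scale=5.0), normalize_y=True)
gpr.fit(phi_train, fos_train)

# Monte Carlo on the surrogate with an assumed distribution of the soil parameter.
phi_samples = rng.normal(loc=30.0, scale=2.5, size=100_000).reshape(-1, 1)
fos_samples = gpr.predict(phi_samples)
p_failure = np.mean(fos_samples < 1.0)       # P(FOS < 1)
beta = norm.ppf(1.0 - p_failure)             # reliability index
print(f"P_f = {p_failure:.4f}, beta = {beta:.2f}")
```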

Keywords: factor of safety, GPR, reliability index, quicksand

Procedia PDF Downloads 466
2531 Evaluating Forecasts Through Stochastic Loss Order

Authors: Wilmer Osvaldo Martinez, Manuel Dario Hernandez, Juan Manuel Julio

Abstract:

We propose to assess the performance of k forecast procedures by exploring the distributions of forecast errors and error losses. We argue that non-systematic forecast errors are minimized when their distributions are symmetric and unimodal, and that forecast accuracy should be assessed through stochastic loss order rather than expected loss order, which is how it is customarily done in previous work. Moreover, since forecast performance evaluation can be understood as a one-way analysis of variance, we propose to explore loss distributions under two circumstances: when a strict (but unknown) joint stochastic order exists among the losses of all forecast alternatives, and when such an order holds only among subsets of the alternative procedures. Although loss stochastic order is stronger than loss moment order, our proposals are at least as powerful as competing tests and are robust to the correlation, autocorrelation, and heteroskedasticity settings those tests consider. In addition, since our proposals do not require samples of the same size, their scope is wider, and because they test the whole loss distribution instead of just loss moments, they can also be used to study forecast distributions. We illustrate the usefulness of our proposals by evaluating a set of real-world forecasts.
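
As a rough illustration of comparing forecast procedures through their loss distributions rather than mean losses, the sketch below uses a one-sided two-sample Kolmogorov-Smirnov test as a simple proxy for stochastic order; it is not the authors' test, and the unequal sample sizes echo the point that equal samples are not required.

```python
# Illustrative sketch (not the authors' procedure): compare two forecast methods by
# their full loss distributions, using a one-sided two-sample Kolmogorov-Smirnov test
# as a simple proxy for first-order stochastic dominance of the losses.
import numpy as np
from scipy.stats import ks_2samp

rng = np.random.default_rng(42)
errors_a = rng.normal(0.0, 1.0, size=300)      # synthetic forecast errors, method A
errors_b = rng.normal(0.3, 1.2, size=250)      # method B: biased and noisier
loss_a, loss_b = errors_a ** 2, errors_b ** 2  # squared-error losses

# H1: the empirical CDF of loss_a lies above that of loss_b at some point,
# i.e. method A's losses tend to be stochastically smaller.
stat, p_value = ks_2samp(loss_a, loss_b, alternative="greater")
print(f"KS statistic = {stat:.3f}, p-value = {p_value:.4f}")
```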

Keywords: forecast evaluation, stochastic order, multiple comparison, non parametric test

Procedia PDF Downloads 76
2530 Synergy and Complementarity in Technology-Intensive Manufacturing Networks

Authors: Daidai Shen, Jean Claude Thill, Wenjia Zhang

Abstract:

This study explores the dynamics of synergy and complementarity within city networks, specifically focusing on the headquarters-subsidiary relations of firms. We begin by defining these two types of networks and establishing their pivotal roles in shaping city network structures. Utilizing the mesoscale analytic approach of weighted stochastic block modeling, we discern relational patterns between city pairs and determine connection strengths through statistical inference. Furthermore, we introduce a community detection approach to uncover the underlying structure of these networks using advanced statistical methods. Our analysis, based on comprehensive network data up to 2017, reveals the coexistence of both complementarity and synergy networks within China’s technology-intensive manufacturing cities. Notably, firms in technology hardware and office & computing machinery predominantly contribute to the complementarity city networks. In contrast, a distinct synergy city network, underpinned by the cities of Suzhou and Dongguan, emerges amidst the expansive complementarity structures in technology hardware and equipment. These findings provide new insights into the relational dynamics and structural configurations of city networks in the context of technology-intensive manufacturing, highlighting the nuanced interplay between synergy and complementarity.
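
As a lightweight stand-in for the weighted stochastic block modeling used in the study, the sketch below runs modularity-based community detection on a toy weighted city network; city names and edge weights are invented, and the method is named plainly as a substitute, not the paper's model.

```python
# Stand-in for weighted stochastic block modeling: community detection on a small
# weighted headquarters-subsidiary city network via modularity maximisation (networkx).
# The edge list and weights are invented for illustration only.
import networkx as nx
from networkx.algorithms.community import greedy_modularity_communities

edges = [
    ("Shenzhen", "Suzhou", 12), ("Shenzhen", "Dongguan", 20), ("Suzhou", "Dongguan", 8),
    ("Beijing", "Shanghai", 15), ("Beijing", "Suzhou", 3), ("Shanghai", "Suzhou", 10),
]
G = nx.Graph()
G.add_weighted_edges_from(edges)

communities = greedy_modularity_communities(G, weight="weight")
for i, block in enumerate(communities, start=1):
    print(f"block {i}: {sorted(block)}")
```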

Keywords: city system, complementarity, synergy network, higher-order network

Procedia PDF Downloads 20
2529 Hansen Solubility Parameter from Surface Measurements

Authors: Neveen AlQasas, Daniel Johnson

Abstract:

Membranes for water treatment are an established technology that attracts great attention due to its simplicity and cost effectiveness. However, membranes in operation suffer from the adverse effect of membrane fouling. Bio-fouling is a phenomenon that occurs at the water-membrane interface and is a dynamic process initiated by the adsorption of dissolved organic material, including biomacromolecules, on the membrane surface. After initiation, attachment of microorganisms occurs, followed by biofilm growth. The biofilm blocks the pores of the membrane and consequently reduces the water flux. Moreover, the presence of a fouling layer can have a substantial impact on the membrane separation properties. Understanding the mechanism of the initiation phase of biofouling is a key point in eliminating biofouling on membrane surfaces. The adhesion and attachment of different fouling materials are affected by the surface properties of the membrane materials. Therefore, the surface properties of different polymeric materials have been studied in terms of their surface energies and Hansen solubility parameters (HSP). The difference between the combined HSP parameters (the HSP distance) allows prediction of the affinity of two materials to each other. The possibility of measuring the HSP of different polymer films via surface measurements, such as contact angle, has been thoroughly investigated. Knowing the HSP of a membrane material and the HSP of a specific foulant facilitates the estimation of the HSP distance between the two, and therefore the strength of attachment to the surface. Contact angle measurements using fourteen different solvents on five different polymeric films were carried out using the sessile drop method. Solvents were ranked as good or bad using different ranking methods, and the ranking was used to calculate the HSP of each polymeric film. Results clearly indicate the absence of a direct relation between the contact angle values of each film and the HSP distance between that polymer film and the solvents used. Therefore, estimating HSP via contact angle alone is not sufficient. However, it was found that if the surface tensions and viscosities of the solvents used are taken into account in the analysis of the contact angle values, a prediction of the HSP from contact angle measurements is possible. This was carried out by training a neural network model. The trained neural network model has three inputs: contact angle value, surface tension, and viscosity of the solvent used. The model is able to predict the HSP distance between the solvent used and the tested polymer (material). The HSP distance prediction is further used to estimate the total and individual HSP parameters of each tested material. The results showed an accuracy of about 90% for all five studied films.
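
A sketch of the kind of three-input network described above is given below, trained on synthetic data; the architecture, units, and target values are assumptions, not the authors' model or measurements.

```python
# Sketch of a three-input neural network (contact angle, solvent surface tension,
# solvent viscosity -> HSP distance) trained on synthetic data. Network size, data
# and units are assumptions, not the authors' model.
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(1)
n = 200
contact_angle = rng.uniform(10, 110, n)        # degrees
surface_tension = rng.uniform(18, 73, n)       # mN/m
viscosity = rng.uniform(0.3, 60, n)            # mPa*s
X = np.column_stack([contact_angle, surface_tension, viscosity])
# Synthetic target: a smooth, noisy function standing in for the measured HSP distance.
y = 0.1 * contact_angle + 0.05 * surface_tension - 0.02 * viscosity + rng.normal(0, 0.5, n)

model = make_pipeline(StandardScaler(),
                      MLPRegressor(hidden_layer_sizes=(16, 8), max_iter=5000, random_state=0))
model.fit(X, y)
print("Predicted HSP distance:", model.predict([[65.0, 45.0, 1.0]])[0])
```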

Keywords: surface characterization, Hansen solubility parameter estimation, contact angle measurements, artificial neural network model, surface measurements

Procedia PDF Downloads 75
2528 The Positive Effects of Processing Instruction on the Acquisition of French as a Second Language: An Eye-Tracking Study

Authors: Cecile Laval, Harriet Lowe

Abstract:

Processing Instruction is a psycholinguistic pedagogical approach drawing insights from the Input Processing Model, which establishes the initial innate strategies used by second language learners to connect form and meaning of linguistic features. With the ever-growing use of technology in Second Language Acquisition research, the present study uses eye-tracking to measure the effectiveness of Processing Instruction in the acquisition of French and its effects on learners' cognitive strategies. The experiment was designed using a TOBII Pro-TX300 eye-tracker to measure participants' default strategies when processing French linguistic input and any cognitive changes after receiving Processing Instruction treatment. Participants were drawn from lower intermediate adult learners of French at the University of Greenwich and randomly assigned to two groups. The study used a pre-test/post-test methodology. The pre-tests (one per linguistic item) were administered via the eye-tracker to both groups one week prior to instructional treatment. One group received the full Processing Instruction treatment (explicit information on the grammatical item and on the processing strategies, and structured input activities) on the primary target linguistic feature (French past tense imperfective aspect). The second group received the Processing Instruction treatment without the explicit information on the processing strategies. Three immediate post-tests on the three grammatical structures under investigation (French past tense imperfective aspect, French Subjunctive used for the expression of doubt, and the French causative construction with Faire) were administered with the eye-tracker. The eye-tracking data showed a positive change in learners' processing of the French target features after instruction, with improvement in the interpretation of the three linguistic features under investigation. 100% of participants in both groups made a statistically significant improvement (p=0.001) in the interpretation of the primary target feature (French past tense imperfective aspect) after treatment. 62.5% of participants made an improvement in the secondary target item (French Subjunctive used for the expression of doubt), and 37.5% of participants made an improvement in the cumulative target feature (French causative construction with Faire). Statistically, there was no significant difference between the pre-test and post-test scores for the cumulative target feature; however, the variance approximately tripled between the pre-test and the post-test (3.9 pre-test and 9.6 post-test). This suggests that the treatment does not affect participants homogeneously and implies a role for individual differences in the transfer-of-training effect of Processing Instruction. The use of eye-tracking provides an opportunity to study unconscious processing decisions made during moment-by-moment comprehension. The visual data from the eye-tracking demonstrate changes in participants' processing strategies. Gaze plots from pre- and post-tests display participants' fixation points changing from focusing on content words to focusing on the verb ending. This change in processing strategies can be clearly seen in the interpretation of sentences in both primary and secondary target features. This paper will present the research methodology, design, and results of the experimental study using eye-tracking to investigate the primary effects and transfer-of-training effects of Processing Instruction. It will then provide evidence of the cognitive benefits of Processing Instruction in Second Language Acquisition and offer suggestions for the teaching of grammar in a second language.

Keywords: eye-tracking, language teaching, processing instruction, second language acquisition

Procedia PDF Downloads 265
2527 Analysis of the Statistical Characterization of Significant Wave Data Exceedances for Designing Offshore Structures

Authors: Rui Teixeira, Alan O’Connor, Maria Nogal

Abstract:

The statistical theory of extreme events is a topic of growing interest in all fields of science and engineering. The economic and environmental changes currently experienced by the world have emphasized the importance of dealing with extreme occurrences with improved accuracy. When it comes to the design of offshore structures, particularly offshore wind turbines, efficiently characterizing extreme events is of major relevance. Extreme events are commonly characterized by extreme value theory. As an alternative, accurate modeling of the tails of statistical distributions and characterization of low-occurrence events can be achieved with the Peak-Over-Threshold (POT) methodology. The POT methodology allows a more refined fit of the statistical distribution by truncating the data at a minimum value given by a predefined threshold u. For mathematically approximating the tail of the empirical statistical distribution, the Generalised Pareto distribution is widely used, although in the case of exceedances of significant wave data (H_s) the two-parameter Weibull and the Exponential distribution, the latter being a specific case of the Generalised Pareto distribution, are frequently used as alternatives. The Generalised Pareto, despite the existence of practical cases where it is applied, is not fully recognized as the adequate solution for modeling exceedances over a certain threshold u, and references that treat the Generalised Pareto distribution as a secondary option for significant wave data can be identified in the literature. In this framework, the current study addresses the application of statistical models to characterize exceedances of wave data. A comparison of the Generalised Pareto, the two-parameter Weibull, and the Exponential distribution is presented for different values of the threshold u. Real wave data obtained from four buoys along the Irish coast were used in the comparative analysis. Results show that the application of statistical distributions to characterize significant wave data needs to be addressed carefully, and in each particular case one of the statistical models mentioned fits the data better than the others. Depending on the value of the threshold u, different results are obtained. Other variables of the fit, such as the number of points and the estimation of the model parameters, are analyzed and the respective conclusions drawn. Some guidelines on the application of the POT method are presented. Modeling the tail of the distributions proves to be, for the present case, a highly non-linear task and, due to its growing importance, should be addressed carefully for an efficient estimation of very low occurrence events.
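
The comparison described above can be sketched as follows: fit the Generalised Pareto, two-parameter Weibull, and Exponential distributions to exceedances over a threshold u and compare their log-likelihoods. The wave-height series below is synthetic, not the Irish buoy data.

```python
# Minimal sketch of the POT comparison: fit candidate distributions to exceedances of
# significant wave height over a threshold u. The data are synthetic, not buoy records.
import numpy as np
from scipy import stats

hs = stats.weibull_min.rvs(c=1.6, scale=1.8, size=5000, random_state=7)  # synthetic H_s (m)

u = 3.0                                   # threshold (m)
exceedances = hs[hs > u] - u              # peaks over threshold

candidates = {
    "Generalised Pareto": stats.genpareto,
    "Weibull (2-par.)": stats.weibull_min,
    "Exponential": stats.expon,
}
for name, dist in candidates.items():
    params = dist.fit(exceedances, floc=0)                # fix location at 0
    ll = np.sum(dist.logpdf(exceedances, *params))        # log-likelihood of the fit
    print(f"{name:20s} params={np.round(params, 3)}  log-likelihood={ll:.1f}")
```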

Keywords: extreme events, offshore structures, peak-over-threshold, significant wave data

Procedia PDF Downloads 253
2526 Utilization of Multi-Criteria Evaluation in Forensic Engineering and the Expertise outside Wall Subsystem

Authors: Tomas Barnak, Libor Matejka

Abstract:

The aim of this study is to create a standard application of multi-criteria evaluation in the field of forensic engineering. Such situations can occur in professional assessment in several cases, for example when several variants of a structural subsystem must be considered against multiple criteria, or when several variants must be evaluated according to several criteria in a court claim requiring expert advice. A problematic situation arises when it is necessary to clearly determine the ranking of the options according to established criteria and to reduce subjective evaluation. For procurement in the field of construction, which is based on the prepared text of the law, not only economic criteria but also technical, technological, and environmental criteria will be determined. This fact substantially changes the style of evaluation of individual bids. For the above-mentioned procurement needs, the unification of experts' decisions and the use of multi-criteria assessment seem to be a reasonable option. In the experimental verification, the economic, technical, technological, and environmental criteria of alternative construction subsystems will be compared using multi-criteria evaluation. The core of the solution is to compare a selected number of set criteria, apply the evaluation methods, and weight the evaluation based on the values assigned to each criterion, in order to apply multi-criteria evaluation methods. The ranking of the individual variants is determined by evaluating the importance of the values of the corresponding criteria with respect to expertise on outside wall constructional subsystems.
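
A toy version of the weighted evaluation step is sketched below; the criteria weights and variant scores are invented for illustration.

```python
# Toy weighted-sum evaluation of outside-wall variants against economic, technical,
# technological and environmental criteria. Weights and scores are invented for
# illustration; the study's actual criteria values are not reproduced.
import numpy as np

criteria = ["economic", "technical", "technological", "environmental"]
weights = np.array([0.35, 0.30, 0.20, 0.15])          # must sum to 1
variants = ["variant A", "variant B", "variant C"]
scores = np.array([                                    # rows: variants, columns: criteria (0-10)
    [7, 8, 6, 5],
    [6, 7, 8, 9],
    [9, 5, 7, 6],
])

weighted_totals = scores @ weights
for rank, idx in enumerate(np.argsort(weighted_totals)[::-1], start=1):
    print(f"{rank}. {variants[idx]}  score = {weighted_totals[idx]:.2f}")
```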

Keywords: criteria, expertise, multi-criteria evaluation, outside wall subsystems

Procedia PDF Downloads 306
2525 Outlier Detection in Stock Market Data using Tukey Method and Wavelet Transform

Authors: Sadam Alwadi

Abstract:

Outlier values are a problem that frequently occurs in the data observation or recording process; thus, data imputation has become an essential matter. This work makes use of methods described in prior work to detect outlier values in a collection of stock market data. In order to implement the detection and find solutions that may be helpful for investors, real closing price data were obtained from the Amman Stock Exchange (ASE). The Tukey and Maximum Overlapping Discrete Wavelet Transform (MODWT) methods are used to detect and impute the outlier values.
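
A minimal sketch of the Tukey fence rule on a synthetic closing-price series follows (the ASE data are not reproduced); values outside [Q1 - 1.5 IQR, Q3 + 1.5 IQR] are flagged and, as one simple option, replaced by the series median.

```python
# Sketch of the Tukey fence rule for outlier detection: values outside
# [Q1 - 1.5*IQR, Q3 + 1.5*IQR] are flagged. The price series is synthetic, not ASE data.
import numpy as np

rng = np.random.default_rng(3)
prices = rng.normal(10.0, 0.5, size=250)
prices[[40, 120, 200]] = [14.5, 6.2, 15.1]            # injected outliers

q1, q3 = np.percentile(prices, [25, 75])
iqr = q3 - q1
lower, upper = q1 - 1.5 * iqr, q3 + 1.5 * iqr
outlier_idx = np.where((prices < lower) | (prices > upper))[0]

print("Fences:", round(lower, 2), round(upper, 2))
print("Outlier positions:", outlier_idx)
# A simple imputation step, e.g. replacing flagged values with the series median:
prices[outlier_idx] = np.median(prices)
```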

Keywords: outlier values, imputation, stock market data, detecting, estimation

Procedia PDF Downloads 71
2524 Comparative Analysis of Integrated and Non-Integrated Fish Farming in Ogun State, Nigeria

Authors: B. G. Abiona

Abstract:

This study compared the profitability of integrated and non-integrated fish farming in Ogun State, Nigeria. Primary data were collected using an interview guide. A random sampling technique was used to select 133 non-integrated fish farmers (NIFF) and 216 integrated fish farmers (IFF) (n = 349) from the study area. Data were analyzed using chi-square, t-test, and Pearson product-moment correlation. Results showed that 92.5% of NIFF were male, compared to 90.7% of IFF. Also, 96.8% of IFF and 79.7% of NIFF were married. The mean ages of sampled farmers were 44 years (NIFF) and 46 years (IFF), while the mean fish farming experiences were 4 years (NIFF) and 5 years (IFF). The average net profit per year of integrated fish farmers was ₦162,550, compared to ₦61,638 for NIFF. The chi-square analyses showed that knowledge of fish farming had a significant relationship with respondents' sex (χ2 = 9.44, df = 2, p < 0.05), age (r = 0.20, p < 0.05) and farming experience (r = p = 0.05). Significant differences exist between integrated and non-integrated fish farming with respect to knowledge of fish farming (t = 21.5, χ = 43.01, p < 0.05). The study concluded that IFF is more profitable than NIFF. It was recommended that private investors and NGOs sponsor short training courses to enhance the efficiency of fish farming and boost productivity among fish farmers.

Keywords: profitability analysis, farms, integration

Procedia PDF Downloads 310
2523 Human Centred Design Approach for Public Transportation

Authors: Jo Kuys, Kirsten Day

Abstract:

Improving urban transportation systems requires an emphasis on users' end-to-end journey experience, from the moment the user steps out of their home to when they arrive at their destination. In considering such end-to-end experiences, human centred design (HCD) must be integrated from the very beginning to generate viable outcomes for the public. An HCD approach will encourage innovative outcomes while acknowledging all the factors that need to be understood along the journey. We provide evidence to show that when designing for public transportation, it is not just about the physical manifestation of a particular outcome; rather, it is about the context and human behaviours that need to be considered throughout the design process. Humans and their behavioural factors are vitally important to the successful implementation of sustainable public transport systems. Through an in-depth literature review of HCD approaches for urban transportation systems, we provide a base from which to exploit the benefits and highlight the importance of including HCD in public transportation projects for greater patronage, resulting in more sustainable cities. An HCD approach is critical to all public transportation projects in order to understand the different levels of transportation design, from the setting of transport policy to implementation and to infrastructure, vehicle, and interface design.

Keywords: human centred design, public transportation, urban planning, user experience

Procedia PDF Downloads 165
2522 Digital Forgery Detection by Signal Noise Inconsistency

Authors: Bo Liu, Chi-Man Pun

Abstract:

A novel technique for digital forgery detection based on signal noise inconsistency is proposed in this paper. A forged area spliced from another picture contains features that may be inconsistent with the rest of the image. The noise pattern and noise level are possible factors that reveal such inconsistency. To detect such noise discrepancies, the test picture is initially segmented into small pieces. The noise pattern and level of each segment are then estimated using various filters. The noise features constructed in this step are used in an energy-based graph cut to expose the forged area in the final step. Experimental results show that our method provides a good illustration of regions with noise inconsistency in various scenarios.
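
One common way to expose such noise inconsistency is to estimate a robust noise level per image block from high-pass residuals; the sketch below uses a Laplacian filter and the median absolute deviation, which is only a stand-in for the filters used in the paper.

```python
# Sketch of block-wise noise-level estimation from a high-pass (Laplacian) residual,
# one simple way to expose noise inconsistency between image regions. The paper's
# own filters and graph-cut step are not reproduced here.
import numpy as np
from scipy.signal import convolve2d

LAPLACIAN = np.array([[0, 1, 0],
                      [1, -4, 1],
                      [0, 1, 0]], dtype=float)

def block_noise_levels(image: np.ndarray, block: int = 32) -> np.ndarray:
    """Relative noise level per block x block tile of a grayscale image (robust MAD scale)."""
    residual = convolve2d(image, LAPLACIAN, mode="same", boundary="symm")
    rows, cols = image.shape[0] // block, image.shape[1] // block
    levels = np.zeros((rows, cols))
    for i in range(rows):
        for j in range(cols):
            tile = residual[i * block:(i + 1) * block, j * block:(j + 1) * block]
            levels[i, j] = np.median(np.abs(tile)) / 0.6745   # proportional to local noise sigma
    return levels

# Synthetic example: the lower-right quadrant is "spliced" with stronger noise.
rng = np.random.default_rng(5)
img = rng.normal(128, 2.0, size=(128, 128))
img[64:, 64:] += rng.normal(0, 8.0, size=(64, 64))
print(np.round(block_noise_levels(img), 1))
```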

Keywords: forgery detection, splicing forgery, noise estimation, noise

Procedia PDF Downloads 441
2521 A Comparison of Smoothing Spline Method and Penalized Spline Regression Method Based on Nonparametric Regression Model

Authors: Autcha Araveeporn

Abstract:

This paper presents a study of a nonparametric regression model estimated with a smoothing spline method and a penalized spline regression method. We also compare the techniques used for estimation and prediction with the nonparametric regression model. We tried both methods on crude oil prices in dollars per barrel and the Stock Exchange of Thailand (SET) index. According to the results, it is concluded that the smoothing spline method performs better than the penalized spline regression method.
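
A minimal sketch of a smoothing spline fit with a standard library routine is given below on a synthetic series; the smoothing factor and data are assumptions, not the crude-oil or SET series used in the study.

```python
# Sketch of a smoothing spline fit with scipy's UnivariateSpline on a synthetic price
# series. The smoothing factor s controls the bias-variance trade-off.
import numpy as np
from scipy.interpolate import UnivariateSpline

rng = np.random.default_rng(11)
t = np.arange(200, dtype=float)
price = 100 + 0.1 * t + 5 * np.sin(t / 15) + rng.normal(0, 1.0, t.size)

spline = UnivariateSpline(t, price, k=3, s=len(t) * 1.0)   # cubic spline, assumed smoothing factor
fitted = spline(t)
rmse = np.sqrt(np.mean((fitted - price) ** 2))
print(f"In-sample RMSE of the smoothing spline: {rmse:.3f}")
```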

Keywords: nonparametric regression model, penalized spline regression method, smoothing spline method, Stock Exchange of Thailand (SET)

Procedia PDF Downloads 418
2520 Assessing the Adaptive Re-Use Potential of Buildings as Part of the Disaster Management Process

Authors: A. Esra İdemen, Sinan M. Şener, Emrah Acar

Abstract:

The technological paradigm of the disaster management field, especially in the case of governmental intervention strategies, is generally based on rapid and flexible accommodation solutions. From various technical solution patterns used to address the immediate housing needs of disaster victims, the adaptive re-use of existing buildings can be considered to be both low-cost and practical. However, there is a scarcity of analytical methods to screen, select and adapt buildings to help decision makers in cases of emergency. Following an extensive literature review, this paper aims to highlight key points and problem areas associated with the adaptive re-use of buildings within the disaster management context. In other disciplines such as real estate management, the adaptive re-use potential (ARP) of existing buildings is typically based on the prioritization of a set of technical and non-technical criteria which are then weighted to arrive at an economically viable investment decision. After a disaster, however, the assessment of the ARP of buildings requires consideration of different/additional layers of analysis which stem from general disaster management principles and the peculiarities of different types of disasters, as well as of their victims. In this paper, a discussion of the development of an adaptive re-use potential (ARP) assessment model is presented. It is thought that governmental and non-governmental decision makers who are required to take quick decisions to accommodate displaced masses following disasters are likely to benefit from the implementation of such a model.

Keywords: adaptive re-use of buildings, disaster management, temporary housing, assessment model

Procedia PDF Downloads 317
2519 Comparison of Chest Weight of Pure and Mixed Races Kabood 30-Day Squab

Authors: Sepehr Moradi, Mehdi Asadi Rad

Abstract:

The aim of this study is to evaluate and compare the chest weight of pure and mixed-race Kabood 30-day pigeons with respect to sex, race, and some auxiliary variables. In this paper, 62 pigeons, comprising 31 male-female pairs of equal age, were studied randomly. Natural incubation was carried out for each pair. All hatched squabs were slaughtered at 30 days of age after 12 hours of fasting, and their chests were weighed on a scale with one-gram precision. A covariance analysis was used since there were many auxiliary variables and unequal numbers of observations. SAS software was used for statistical analysis. The mean chest weight of the pure race (Kabood-Kabood), with 8 records, was 123.8±32.3 g, while those of the mixed races Kabood-Namebar, Kabood-Parvazy, Kabood-Tizpar, Namebar-Kabood, Tizpar-Kabood, and Parvazi-Kabood, with 8, 8, 6, 12, 10, and 10 records, were 139.4±23.5, 122.7±23.8, 124.7±30.1, 50.3±29.3, 51.4±26.4, and 137±28.6 g, respectively. The mean 30-day chest weights of males and females were 87.3±2.5 and 82.7±2.6 g, respectively. The difference in 30-day chest weight between the Kabood-Kabood race and the Kabood-Namebar, Kabood-Parvazi, Tizpar-Kabood, Kabood-Tizpar, Namebar-Kabood, and Parvazi-Kabood mixed races was not significant. The effect of sex was significant at the 5% level (P<0.05), but the interaction of sex and race was not significant. The auxiliary variable of father weight was significant at the 1% level (p < 0.01), but the auxiliary variable of mother weight was not. The results showed that the highest and lowest weights belonged to Kabood-Namebar and Namebar-Kabood, respectively.

Keywords: squab, Kabood race, 30-day chest weight, pigeons

Procedia PDF Downloads 142
2518 Modeling Intelligent Threats: Case of Continuous Attacks on a Specific Target

Authors: Asma Ben Yaghlane, Mohamed Naceur Azaiez

Abstract:

In this paper, we treat a model that falls in the area of protecting targeted systems from intelligent threats including terrorism. We introduce the concept of system survivability, in the context of continuous attacks, as the probability that a system under attack will continue operation up to some fixed time t. We define a constant attack rate (CAR) process as an attack on a targeted system that follows an exponential distribution. We consider the superposition of several CAR processes. From the attacker side, we determine the optimal attack strategy that minimizes the system survivability. We also determine the optimal strengthening strategy that maximizes the system survivability under limited defensive resources. We use operations research techniques to identify optimal strategies of each antagonist. Our results may be used as interesting starting points to develop realistic protection strategies against intentional attacks.
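
Under the stated exponential (CAR) assumptions, and with the added assumption that each arriving attack disables the target with some probability, survivability can be sketched as follows; the rates and probabilities are illustrative, not the paper's data.

```python
# Illustrative sketch, not the paper's exact model: the superposition of independent
# CAR (Poisson-type) attack processes has rate sum(lambda_i); if each arriving attack
# disables the target with probability p_i (an added assumption), the survivability up
# to time t is S(t) = exp(-t * sum(p_i * lambda_i)).
import numpy as np

attack_rates = np.array([0.10, 0.25, 0.05])      # attacks per month, per CAR process (assumed)
success_probs = np.array([0.30, 0.10, 0.60])     # assumed per-attack success probabilities

def survivability(t: float) -> float:
    return float(np.exp(-t * np.sum(success_probs * attack_rates)))

for t in (6, 12, 24):                            # months
    print(f"S({t}) = {survivability(t):.3f}")
```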

Keywords: CAR processes, defense/attack strategies, exponential failure, survivability

Procedia PDF Downloads 375
2517 Characteristics of Pore Pressure and Effective Stress Changes in Sandstone Reservoir Due to Hydrocarbon Production

Authors: Kurniawan Adha, Wan Ismail Wan Yusoff, Luluan Almanna Lubis

Abstract:

Accurate pore pressure data make an important contribution to preventing hazardous events during oil and gas operations. The availability of pore pressure data also contributes to reducing operating costs. Suggested methods for pore pressure estimation have mostly been complex because of the many assumptions and hypotheses used, while basic properties that may have a significant impact on the estimation model are often neglected. To date, most pore pressure determinations are estimated by data model analysis and rarely include laboratory analysis, stratigraphic study, or core check measurements. This study therefore developed a model that might be applied to investigate the changes in pore pressure and effective stress due to hydrocarbon production. In general, this paper focuses on the effect of pore pressure and effective stress changes due to hydrocarbon production on the velocity model, illustrated by changes in saturation. Core samples from the Miri field in Sarawak, Malaysia, were used in this study; the formation consists of a sandstone reservoir. The study area is divided into sixteen (16) layers and encompasses six facies (A-F) from the outcrop used for the stratigraphic sequence model. The experimental work first involved data collection through a field study and the development of a stratigraphic sequence model based on the outcrop study. Porosity and permeability measurements were then performed after the samples were cut into 1.5 inch diameter cores. Next, velocity was analyzed using SONIC OYO and AutoLab 500. Three saturation scenarios were also applied to represent the production history of the samples used. Results from this study show the alteration of velocity for different saturations under the corresponding changes in effective stress and pore pressure. It was observed that the water-saturated sample has the highest velocity while the dry sample has the lowest value. Compared with the oil-saturated samples, the water-saturated sample still shows the highest value since water has a higher fluid density than oil. Furthermore, the water-saturated sample also stands out in velocity-derived parameters such as Poisson's ratio and the P-wave to S-wave velocity ratio (Vp/Vs). The results show that pore pressure values were reduced due to the decrease in fluid content. The resulting decrease in pore pressure may soften the elastic mineral frame, which then tends to possess a higher velocity. The alteration of pore pressure by the changes in fluid content or saturation resulted in an alteration of the velocity value that follows a proportional trend with the effective stress.
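
Since the discussion above turns on pore pressure and effective stress, a minimal sketch of Terzaghi's effective-stress relation is given below with hypothetical numbers (not Miri-field measurements), showing how a production-induced pore-pressure drop raises effective stress.

```python
# Sketch of Terzaghi's effective-stress relation, sigma_eff = sigma_total - pore_pressure,
# showing how a production-induced drop in pore pressure raises the effective stress on
# the rock frame. All numbers are hypothetical.
overburden_stress_mpa = 40.0        # total vertical stress at reservoir depth (assumed)
pore_pressure_initial_mpa = 18.0    # before production (assumed)
pore_pressure_depleted_mpa = 12.0   # after depletion (assumed)

eff_initial = overburden_stress_mpa - pore_pressure_initial_mpa
eff_depleted = overburden_stress_mpa - pore_pressure_depleted_mpa
print(f"Effective stress: {eff_initial:.1f} MPa -> {eff_depleted:.1f} MPa "
      f"(+{eff_depleted - eff_initial:.1f} MPa after depletion)")
```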

Keywords: pore pressure, effective stress, production, miri formation

Procedia PDF Downloads 271
2516 Combined Model Predictive Controller Technique for Enhancing NAO Gait Stabilization

Authors: Brahim Brahmi, Mohammed Hamza Laraki, Mohammad Habibur Rahman, Islam M. Rasedul, M. Assad Uz-Zaman

Abstract:

The humanoid robot, specifically the NAO robot, must be able to provide highly dynamic performance on the soccer field. Maintaining the balance of the humanoid robot during the required motion is considered one of the challenging problems, especially when the robot is subject to external disturbances such as contact with other robots. In this paper, a dynamic controller is proposed in order to ensure robust walking (stabilization) and to improve the dynamic balance of the robot during its contact with the environment (external disturbances). The trajectory of the center of mass (CoM) is generated by a model predictive controller (MPC) combined with the zero moment point (ZMP) technique. Taking into account the properties of the rotational dynamics of the whole-body system, a modified preview control mixed with feedback control is employed to manage the angular momentum and the CoM's acceleration, respectively. The latter is dedicated to providing a robust gait for the robot in the presence of external disturbances. Simulation results are presented to show the feasibility of the proposed strategy.
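
For context, a minimal sketch of the ZMP relation of the linear inverted pendulum model that MPC-based gait generators commonly rely on is given below; the CoM height and sway trajectory are illustrative, not NAO-specific values.

```python
# Sketch of the ZMP relation for the linear inverted pendulum model commonly used with
# MPC-based gait generation: p = c - (z_c / g) * c_ddot, where c is the CoM position and
# z_c its (assumed constant) height. Values are illustrative, not NAO-specific tuning.
import numpy as np

g, z_c = 9.81, 0.26                      # gravity (m/s^2) and assumed CoM height (m)
t = np.linspace(0.0, 2.0, 401)
com = 0.03 * np.sin(2 * np.pi * 0.5 * t)                         # lateral CoM sway (m)
com_acc = -0.03 * (2 * np.pi * 0.5) ** 2 * np.sin(2 * np.pi * 0.5 * t)

zmp = com - (z_c / g) * com_acc          # zero moment point trajectory
print(f"max |ZMP| = {np.abs(zmp).max() * 100:.1f} cm; must stay inside the support polygon")
```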

Keywords: preview control, Nao robot, model predictive control

Procedia PDF Downloads 115
2515 Effect of Application of Turmeric Extract Powder Solution on the Color Changes of Non-Vital Teeth (An In-vitro study).

Authors: Haidy N. Salem, Nada O. Kamel, Shahinaz N. Hassan, Sherif M. Elhefnawy

Abstract:

Aim: To assess the effect of using turmeric powder extract on changes of tooth color with extra-coronal and intra-coronal bleaching methods. Methods: Turmeric powder extract was weighed and mixed with two different hydrogen peroxide concentrations (3% and 6%) to be used as a bleaching agent. Thirty teeth were allocated into three groups (n=10): Group A: the bleaching agent (6%) was applied on the labial surface; Group B: the bleaching agent (3%) was applied inside the pulp chamber; and Group C: extra- and intra-coronal bleaching techniques were used (6% and 3%, respectively). A standardized access cavity was opened in the palatal surface of each tooth in both Groups B and C. Color parameters were measured using a spectrophotometer. Results: A statistically significant difference in color difference values (∆E*) and enamel brightness (∆L*) was found between Group C and each of Groups A and B. There was no statistically significant difference in (∆E*) and (∆L*) between Group A and Group B. The highest mean values of (∆E*) and (∆L*) were found in Group C, while the lowest mean values were found in Group B. Conclusion: Bleaching the external and internal tooth structure with low concentrations of hydrogen peroxide solution mixed with turmeric extract has a promising effect on color enhancement.
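
For reference, the CIELAB colour-difference calculation behind the reported ∆E* values can be sketched as follows; the two colour readings are hypothetical, not the study's spectrophotometer data.

```python
# Sketch of the CIELAB colour-difference calculation underlying the reported Delta E*:
# Delta E* = sqrt(dL*^2 + da*^2 + db*^2). The readings below are hypothetical.
import math

before = {"L": 68.0, "a": 2.5, "b": 18.0}   # assumed baseline tooth reading
after = {"L": 74.5, "a": 1.8, "b": 14.2}    # assumed post-bleaching reading

dL = after["L"] - before["L"]
da = after["a"] - before["a"]
db = after["b"] - before["b"]
delta_e = math.sqrt(dL ** 2 + da ** 2 + db ** 2)
print(f"Delta L* = {dL:.1f}, Delta E* = {delta_e:.1f}")
```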

Keywords: bleaching, hydrogen peroxide, spectrophotometer, turmeric

Procedia PDF Downloads 97
2514 On the Creep of Concrete Structures

Authors: A. Brahma

Abstract:

Analysis of deferred deformations of concrete under sustained load shows that creep plays a leading role in the deferred deformations of concrete structures. Knowledge of the creep characteristics of concrete is a necessary starting point in the design of structures for crack control. Such knowledge enables the designer to estimate the probable deformation in prestressed or reinforced concrete, and appropriate steps can be taken in the design to accommodate this movement. In this study, we propose a prediction model that involves the principal parameters acting on the deferred behaviour of concrete structures. For the estimation of the model parameters, the Levenberg-Marquardt method has proven very satisfactory. A comparison between the experimental results and the predictions of the designed model shows that it is well suited to describing the evolution of the creep of concrete structures.
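
A minimal sketch of Levenberg-Marquardt parameter estimation is given below using a generic power-law creep model fitted to synthetic data; the functional form and values are assumptions, not the paper's model.

```python
# Sketch of Levenberg-Marquardt parameter estimation with scipy.optimize.curve_fit
# (method="lm"), fitting a simple power-law creep model eps(t) = a * t**b to synthetic
# strain data. The functional form and data are assumptions, not the paper's model.
import numpy as np
from scipy.optimize import curve_fit

def creep_model(t, a, b):
    return a * np.power(t, b)

rng = np.random.default_rng(2)
t_days = np.linspace(1, 365, 60)
strain = creep_model(t_days, 120e-6, 0.35) + rng.normal(0, 5e-6, t_days.size)

params, cov = curve_fit(creep_model, t_days, strain, p0=[1e-4, 0.5], method="lm")
a_hat, b_hat = params
print(f"a = {a_hat:.2e}, b = {b_hat:.3f}")
```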

Keywords: concrete structure, creep, modelling, prediction

Procedia PDF Downloads 275
2513 Exploratory Tests on Structures Resistance during Forest Fires

Authors: Luis M. Ribeiro, Jorge Raposo, Ricardo Oliveira, David Caballero, Domingos X. Viegas

Abstract:

Under the scope of the European project WUIWATCH, a set of experimental tests on house vulnerability was performed in order to assess the resistance of selected house components during the passage of a forest fire. Among the individual elements most affected by the passage of a wildfire, windows are the ones with the greatest exposure. In this sense, a set of exploratory experimental tests was designed to assess particular aspects related to the vulnerability of windows and blinds. At the same time, the importance of leaving them closed (as well as the doors inside a house) during a wildfire was explored in order to give some scientific background to guidelines for homeowners. Three sets of tests were performed: 1. Window and blind resistance to heat. Three types of protective blinds were tested (aluminium, PVC and wood) on two types of windows (single and double pane). The objective was to assess the structures' resistance. 2. The influence of air flow on the transport of burning embers inside a house. A room was built to scale and placed inside a wind tunnel, with one window and one door on opposite sides. The objective was to assess the importance of leaving an inside door open on the probability of burning embers entering the room. 3. The influence of the dimension of openings in a window or door on the probability of ignition inside a house. The objective was to assess the influence of different window openings on the amount of burning particles that can enter a house. The main results were: 1. The purely radiative heat source delivers 1.5 kW/m2 of heat flux to the structure, while the real fire generates 10 kW/m2. When protected by the blind, the single-pane window reaches 30 ºC on both sides, and the double-pane window shows a differential of 10 ºC between the side facing the heat (30 ºC) and the opposite side (40 ºC). The temperature of the unprotected window increases constantly until the end of the test. Window blinds reach considerably higher temperatures; PVC loses its consistency above 150 ºC and melts. 2. Leaving the inside door closed results in a positive pressure differential of +1 Pa from the outside to the inside, inhibiting the air flow. Opening the door halfway or fully reverses the pressure differential to -6 and -8 times that value, respectively, favouring the air flow from the outside to the inside. The number of particles entering the house follows the same tendency. 3. As the bottom opening in a window increases from 0.5 cm to 4 cm, the number of particles that enter the house per second also increases greatly. From 5 cm up to 80 cm there is no substantial increase in the number of entering particles. This set of exploratory tests proved to be of added value in supporting guidelines for homeowners regarding self-protection in WUI areas.

Keywords: forest fire, wildland urban interface, house vulnerability, house protective elements

Procedia PDF Downloads 268
2512 Hypothesis about the Origin of Lightning

Authors: Igor Kuzminov

Abstract:

To date, the nature of lightning has not been established. A hypothesis on the origin of lightning is proposed: the lightning charge is formed by electromagnetic induction, with the air mass of the cloud acting as the conductor. This conductor moves in the Earth's magnetic field, while the upper and lower edges of the cloud act as the plates of a capacitor. Lightning is thus a special case of electromagnetic processes in the atmosphere. The lightning discharge occurs during the process of charge accumulation; accumulation proceeds constantly, but the charge is not fixed. Naturally, the hypothesis requires additional experiments and official acknowledgement. Supporting the hypothesis is the observation that the maximal lightning activity occurs in the equatorial zone, where cos φ is close to 1. A privately conducted experiment showed that there is a potential difference in the atmosphere at different levels. There is a strong possibility of developing power installations of applied value from this effect.

Keywords: electromagnetic induction, Earth's magnetic field, plates of the capacitors, charge accumulation

Procedia PDF Downloads 72
2511 Trace Network: A Probabilistic Relevant Pattern Recognition Approach to Attribution Trace Analysis

Authors: Jian Xu, Xiaochun Yun, Yongzheng Zhang, Yafei Sang, Zhenyu Cheng

Abstract:

Network attack prevention is a critical research area of information security. Network attacks would be suppressed if attribution techniques were capable of tracing back to the attackers after a hacking event. Attributing these attacks to a particular identification therefore becomes one of the important tasks when analysts attempt to differentiate and profile the attacker behind a piece of attack trace. To assist analysts in exposing the attackers behind the scenes, this paper investigates the connections between attribution traces and proposes probabilistic-relevance-based attribution patterns. This method facilitates the evaluation of the plausibility of relevance between different traceable identifications. Furthermore, by analyzing the connections among traces, it can confirm the existence probability of a certain organization as well as discover its affinitive partners by drawing a relevance matrix from attribution traces.
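
One simple way to picture a relevance matrix drawn from attribution traces is sketched below: from a binary trace-identification incidence matrix, estimate the probability that one identification co-occurs with another. The data are invented and the estimator is illustrative, not the paper's method.

```python
# Illustrative sketch of a relevance matrix built from attribution traces: given a binary
# incidence matrix of traces (rows) versus candidate identifications (columns), the entry
# R[i, j] estimates the probability that identification j appears in a trace that also
# contains identification i. The data are synthetic.
import numpy as np

rng = np.random.default_rng(10)
# 200 attack traces, 6 candidate identifications (e.g. tools, infrastructure, handles).
incidence = (rng.random((200, 6)) < np.array([0.4, 0.3, 0.3, 0.2, 0.15, 0.1])).astype(float)

counts = incidence.T @ incidence                 # co-occurrence counts
occurrences = incidence.sum(axis=0)              # per-identification trace counts
relevance = counts / np.maximum(occurrences[:, None], 1.0)   # P(j present | i present)
np.fill_diagonal(relevance, 1.0)
print(np.round(relevance, 2))
```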

Keywords: attribution trace, probabilistic relevance, network attack, attacker identification

Procedia PDF Downloads 342
2510 Solid Waste Disposal Site Selection in Thiruvananthapuram Corporation Area by Data Analysis Using GIS and Remote Sensing Tools

Authors: C. Asha Poorna, P. G. Vinod, A. R. R. Menon

Abstract:

The currently increasing population and its activities, such as urbanization and industrialization, are generating a major environmental issue: waste. A major problem in waste management is the selection of an appropriate site for waste disposal. The selection of a suitable site is subject to environmental, economic, and political constraints. In this paper we discuss the strategies to be followed while selecting a site for a decentralized solid waste disposal system for the Thiruvananthapuram Corporation area, using a Geographic Information System (GIS), the Analytical Hierarchy Process (AHP), and remote sensing methods. The area is located on the west coast of India near the extreme south of the mainland and lies on the shores of the Killiyar and Karamana rivers. Since the area lies in a river basin, waste management must be regulated with respect to the water bodies. The different criteria considered for waste disposal site selection (lithology, surface water, aquifer, groundwater, land use, contours, aspect, elevation, slope, distance to road, and distance from settlement) are examined in relation to landfill site selection. Each criterion was identified and weighted by its AHP score and mapped using GIS techniques, and a suitability map was prepared by overlay analysis.
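
The AHP weighting step mentioned above can be sketched as follows: criterion weights are taken as the principal eigenvector of a pairwise comparison matrix, with a consistency check; the judgment matrix below is invented for illustration.

```python
# Sketch of the AHP weighting step: criterion weights are taken as the principal
# eigenvector of a pairwise comparison matrix, with a consistency check. The matrix
# below is invented; it is not the study's actual judgment matrix.
import numpy as np

# Example: land use vs. distance to road vs. distance from settlement (Saaty's 1-9 scale).
A = np.array([[1.0, 3.0, 5.0],
              [1/3, 1.0, 2.0],
              [1/5, 1/2, 1.0]])

eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)
weights = np.abs(eigvecs[:, k].real)
weights /= weights.sum()

n = A.shape[0]
lambda_max = eigvals[k].real
ci = (lambda_max - n) / (n - 1)            # consistency index
cr = ci / 0.58                             # random index RI = 0.58 for n = 3
print("weights:", np.round(weights, 3), " CR =", round(cr, 3))
```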

Keywords: waste disposal, solid waste management, Geographic Information System (GIS), Analytical Hierarchy Process (AHP)

Procedia PDF Downloads 376
2509 Comparison of Crossover Types to Obtain Optimal Queries Using Adaptive Genetic Algorithm

Authors: Wafa’ Alma'Aitah, Khaled Almakadmeh

Abstract:

This study presents an information retrieval system that uses a genetic algorithm to increase information retrieval efficiency. Using the vector space model, information retrieval is based on the similarity measurement between a query and documents. Documents with high similarity to the query are judged more relevant and should be retrieved first. Using genetic algorithms, each query is represented by a chromosome; these chromosomes are fed into the genetic operator process of selection, crossover, and mutation until an optimized query chromosome is obtained for document retrieval. Results show that information retrieval with adaptive crossover probability, single-point crossover, and roulette-wheel selection gives the highest recall. The proposed approach is verified using 242 proceedings abstracts collected from the Saudi Arabian national conference.
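
The two operators highlighted in the results, roulette-wheel selection and single-point crossover, can be sketched on bit-string query chromosomes as follows; the fitness function is a stand-in for the paper's query-document similarity.

```python
# Sketch of roulette-wheel selection and single-point crossover on bit-string query
# chromosomes. The fitness function is a stand-in; the paper's actual query-document
# similarity measure is not reproduced.
import numpy as np

rng = np.random.default_rng(9)

def fitness(population: np.ndarray) -> np.ndarray:
    # Stand-in fitness: number of set term bits (a real system would use query-document similarity).
    return population.sum(axis=1).astype(float) + 1e-9

def roulette_select(population: np.ndarray, n_parents: int) -> np.ndarray:
    f = fitness(population)
    probs = f / f.sum()
    idx = rng.choice(len(population), size=n_parents, p=probs)
    return population[idx]

def single_point_crossover(p1: np.ndarray, p2: np.ndarray, rate: float = 0.8):
    if rng.random() > rate:
        return p1.copy(), p2.copy()
    cut = rng.integers(1, len(p1))
    return (np.concatenate([p1[:cut], p2[cut:]]),
            np.concatenate([p2[:cut], p1[cut:]]))

population = rng.integers(0, 2, size=(20, 16))   # 20 query chromosomes, 16 term bits
parents = roulette_select(population, n_parents=2)
child1, child2 = single_point_crossover(parents[0], parents[1])
print(child1, child2, sep="\n")
```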

Keywords: genetic algorithm, information retrieval, optimal queries, crossover

Procedia PDF Downloads 274
2508 Machine Learning for Aiding Meningitis Diagnosis in Pediatric Patients

Authors: Karina Zaccari, Ernesto Cordeiro Marujo

Abstract:

This paper presents a Machine Learning (ML) approach to support meningitis diagnosis in patients at a children's hospital in Sao Paulo, Brazil. The aim is to use ML techniques to reduce the use of invasive procedures, such as cerebrospinal fluid (CSF) collection, as much as possible. In this study, we focus on predicting the probability of meningitis given the results of blood and urine laboratory tests, together with the analysis of pain or other complaints from the patient. We tested a number of different ML algorithms, including Adaptive Boosting (AdaBoost), Decision Tree, Gradient Boosting, K-Nearest Neighbors (KNN), Logistic Regression, Random Forest, and Support Vector Machines (SVM). The Decision Tree algorithm performed best, with 94.56% and 96.18% accuracy on training and testing data, respectively. These results represent a significant aid to doctors in diagnosing meningitis as early as possible and in preventing expensive and painful procedures for some children.
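
A sketch of the best-performing setup, a decision tree on routine laboratory features, is given below on synthetic data; the feature names, values, and labels are invented, not the hospital dataset.

```python
# Sketch of a decision tree classifier on routine laboratory features, trained on
# synthetic data. Feature names and values are invented; the hospital dataset is not
# reproduced here.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(4)
n = 600
X = np.column_stack([
    rng.normal(11.0, 4.0, n),    # white blood cell count (10^3/uL), assumed
    rng.normal(60.0, 60.0, n),   # C-reactive protein (mg/L), assumed
    rng.integers(0, 2, n),       # reported headache/neck pain (0/1), assumed
])
# Synthetic label loosely tied to the features, for illustration only.
y = ((X[:, 0] > 13) & (X[:, 1] > 80) | (X[:, 2] == 1) & (X[:, 1] > 120)).astype(int)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)
clf = DecisionTreeClassifier(max_depth=4, random_state=0).fit(X_tr, y_tr)
print(f"train acc = {clf.score(X_tr, y_tr):.3f}, test acc = {clf.score(X_te, y_te):.3f}")
```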

Keywords: machine learning, medical diagnosis, meningitis detection, pediatric research

Procedia PDF Downloads 136
2507 Institutional Capacity and Corruption: Evidence from Brazil

Authors: Dalson Figueiredo, Enivaldo Rocha, Ranulfo Paranhos, José Alexandre

Abstract:

This paper analyzes the effects of institutional capacity on corruption. Methodologically, the research design combines descriptive and multivariate statistics to examine two original datasets based on secondary data. In particular, we employ a principal component model to estimate an indicator of institutional capacity for both state audit institutions and subnational judiciary courts. Then, we estimate the effect of institutional capacity on two dependent variables: (1) the incidence of administrative irregularities and (2) the time elapsed to judge corruption cases. Preliminary results using ordinary least squares, negative binomial, and Tobit models suggest the same conclusions: the higher the institutional audit capacity, the higher the probability of detecting a corruption case. On the other hand, the higher the institutional capacity of the state judiciary, the shorter the time taken to judge corruption cases.
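
The principal-component step can be sketched as follows: a composite institutional-capacity index is taken as the first principal component of standardized capacity variables; the variables and values below are synthetic stand-ins, not the Brazilian audit-institution data.

```python
# Sketch of building a composite institutional-capacity indicator as the first principal
# component of standardized capacity variables. The three variables and their values are
# synthetic stand-ins.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(6)
n = 27                                                   # e.g. one row per state (illustrative)
budget = rng.lognormal(3.0, 0.4, n)                      # budget per capita (assumed)
staff = 50 + 10 * np.log(budget) + rng.normal(0, 3, n)   # auditors per 100k inhabitants (assumed)
training = rng.uniform(0, 40, n)                         # training hours per auditor (assumed)
X = np.column_stack([budget, staff, training])

Z = StandardScaler().fit_transform(X)
pca = PCA(n_components=1).fit(Z)
capacity_index = pca.transform(Z).ravel()
print("explained variance ratio:", round(pca.explained_variance_ratio_[0], 3))
print("first five index values:", np.round(capacity_index[:5], 2))
```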

Keywords: institutional capacity, corruption, state level institutions, evidence from Brazil

Procedia PDF Downloads 343
2506 An Investigation of Prior Educational Achievement on Engineering Student Performance

Authors: Jovanca Smith, Derek Gay

Abstract:

All universities possess a standard by which students are assessed and admitted into their programs. This paper considers the effect of the educational history of students, as measured by specific subject grades in Caribbean examinations, on overall performance in introductory engineering math and mechanics courses. Results reflect a correlation between the highest grades in the Caribbean examinations and a higher probability of successful advancement in the university courses. Conversely, lower entrance grades correspond to underperformance in the university courses. Results also demonstrate that students matriculating with the Caribbean examinations will not necessarily possess a significant advantage over students entering through an alternative route, and while the previous educational background of students is a significant indicator of likely performance in the university-level math and mechanics courses, it is not the sole factor.

Keywords: bimodal distribution, differential learning, engineering education, entrance qualification

Procedia PDF Downloads 344
2505 Stochastic Variation of the Hubble's Parameter Using Ornstein-Uhlenbeck Process

Authors: Mary Chriselda A

Abstract:

This paper deals with the fact that Hubble's parameter is not constant and tends to vary stochastically with time. This premise has been proven by converting it to a stochastic differential equation using the Ornstein-Uhlenbeck process. The formulated stochastic differential equation is further solved analytically using the Euler and the Kolmogorov forward equations, thereby obtaining the probability density function via the Fourier transformation and proving that Hubble's parameter varies stochastically. This is further corroborated by simulating the observations using Python and R software to validate the postulated premise. We can further conclude that the randomness in the forces affecting the white noise can eventually affect Hubble's parameter, leading to scale invariance and thereby causing stochastic fluctuations in the density and the rate of expansion of the Universe.
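
A minimal Euler-Maruyama simulation of an Ornstein-Uhlenbeck process standing in for a stochastically varying Hubble parameter is sketched below; the parameter values are assumptions for illustration, not fitted values from the paper.

```python
# Euler-Maruyama sketch of an Ornstein-Uhlenbeck process dH = theta*(mu - H)*dt + sigma*dW,
# standing in for a stochastically varying Hubble parameter. The parameter values below
# (in km/s/Mpc and arbitrary time units) are assumptions for illustration.
import numpy as np

theta, mu, sigma = 0.5, 70.0, 2.0      # mean-reversion rate, long-run mean, noise scale
h0, dt, n_steps = 67.0, 0.01, 5000

rng = np.random.default_rng(8)
h = np.empty(n_steps + 1)
h[0] = h0
for i in range(n_steps):
    dw = rng.normal(0.0, np.sqrt(dt))
    h[i + 1] = h[i] + theta * (mu - h[i]) * dt + sigma * dw

# The stationary density of the OU process is Gaussian with mean mu and variance sigma^2/(2*theta).
print(f"sample mean = {h.mean():.2f}, stationary std = {sigma / np.sqrt(2 * theta):.2f}")
```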

Keywords: Chapman-Kolmogorov forward differential equations, Fourier transformation, Hubble's parameter, Ornstein-Uhlenbeck process, stochastic differential equations

Procedia PDF Downloads 185