Search results for: statistical approach
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 16643

15953 Exploring Syntactic and Semantic Features for Text-Based Authorship Attribution

Authors: Haiyan Wu, Ying Liu, Shaoyun Shi

Abstract:

Authorship attribution extracts features to identify the authors of anonymous documents. Many previous works on authorship attribution focus on statistical style features (e.g., sentence/word length) and content features (e.g., frequent words, n-grams). Modeling these features by regression or some transparent machine learning methods gives a portrait of the authors' writing style. But these methods do not capture syntactic (e.g., dependency relationships) or semantic (e.g., topics) information. In recent years, some researchers have modeled syntactic trees or latent semantic information by neural networks. However, few works take them together. Besides, predictions by neural networks are difficult to explain, which is vital in authorship attribution tasks. In this paper, we not only utilize the statistical style and content features but also take advantage of both syntactic and semantic features. Unlike an end-to-end neural model, our method separates feature selection and prediction into two steps. An attentive n-gram network is utilized to select useful features, and logistic regression is applied to give predictions and an understandable representation of writing style. Experiments show that our extracted features can improve on state-of-the-art methods on three benchmark datasets.
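
A minimal sketch of the two-step idea described above, assuming a scikit-learn environment: n-gram features are extracted and filtered, and a logistic regression produces the interpretable prediction. The chi-squared selector stands in for the paper's attentive n-gram network, and the corpus and labels are placeholders.

```python
# Minimal sketch of the two-step pipeline: select useful n-gram features,
# then fit an interpretable logistic regression. The attentive n-gram network
# is approximated here by chi-squared feature selection; documents and author
# labels are placeholders.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.feature_selection import SelectKBest, chi2
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import Pipeline

documents = ["first anonymous text ...", "second anonymous text ..."]  # placeholder corpus
authors = ["author_a", "author_b"]                                      # placeholder labels

pipeline = Pipeline([
    ("ngrams", TfidfVectorizer(analyzer="word", ngram_range=(1, 2))),  # style/content n-grams
    ("select", SelectKBest(chi2, k="all")),                            # in practice keep e.g. the top 1000
    ("clf", LogisticRegression(max_iter=1000)),                        # transparent final predictor
])
pipeline.fit(documents, authors)

# The learned coefficients give an understandable portrait of writing style.
print(pipeline.named_steps["clf"].coef_.shape)
```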

Keywords: authorship attribution, attention mechanism, syntactic feature, feature extraction

Procedia PDF Downloads 117
15952 A Probabilistic Theory of the Buy-Low and Sell-High for Algorithmic Trading

Authors: Peter Shi

Abstract:

Algorithmic trading is a rapidly expanding domain within quantitative finance, constituting a substantial portion of trading volumes in the US financial market. The demand for rigorous and robust mathematical theories underpinning these trading algorithms is ever-growing. In this study, the author establishes a new stock market model that integrates the Efficient Market Hypothesis and statistical arbitrage. The model, for the first time, finds probabilistic relations between the rational price and the market price in terms of the conditional expectation. The theory consequently leads to a mathematical justification of the old market adage: buy low and sell high. The thresholds for “low” and “high” are precisely derived using a max-min operation on Bayes' error. This explicit connection harmonizes the Efficient Market Hypothesis and statistical arbitrage, demonstrating their compatibility in explaining market dynamics. The amalgamation represents a pioneering contribution to quantitative finance. The study culminates in comprehensive numerical tests using historical market data, affirming that the “buy-low” and “sell-high” algorithm derived from this theory significantly outperforms the general market over the long term in four out of six distinct market environments.
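
For intuition only, here is a toy, self-contained backtest of a buy-low/sell-high rule. The thresholds are simple quantiles of price relative to a moving-average proxy for the rational price, standing in for the paper's max-min operation on Bayes' error, and the price series is synthetic; this is not the authors' derivation.

```python
# Illustrative sketch: a naive buy-low / sell-high rule on a synthetic price
# series. Thresholds are quantiles of price relative to a moving average, used
# here only as a stand-in for the paper's Bayes-error-based thresholds.
import numpy as np

rng = np.random.default_rng(0)
prices = 100 * np.exp(np.cumsum(rng.normal(0.0002, 0.01, 2000)))  # synthetic market prices
window = 50
rational = np.convolve(prices, np.ones(window) / window, mode="same")  # crude rational-price proxy

ratio = prices / rational
low, high = np.quantile(ratio, 0.2), np.quantile(ratio, 0.8)  # illustrative "low" and "high" thresholds

cash, shares = 1.0, 0.0
for p, r in zip(prices, ratio):
    if r < low and cash > 0:        # buy low
        shares, cash = cash / p, 0.0
    elif r > high and shares > 0:   # sell high
        cash, shares = shares * p, 0.0

final_value = cash + shares * prices[-1]
print(f"strategy value: {final_value:.3f}, buy-and-hold: {prices[-1] / prices[0]:.3f}")
```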

Keywords: efficient market hypothesis, behavioral finance, Bayes' decision, algorithmic trading, risk control, stock market

Procedia PDF Downloads 57
15951 A PROMETHEE-BELIEF Approach for Multi-Criteria Decision Making Problems with Incomplete Information

Authors: H. Moalla, A. Frikha

Abstract:

Multi-criteria decision aid methods consider decision problems where numerous alternatives are evaluated on several criteria. These methods are designed to deal with perfect information. However, in practice, this information requirement is clearly too strict. In fact, the imperfect data provided by more or less reliable decision makers usually affect decision results, since any decision is closely linked to the quality and availability of information. In this paper, a PROMETHEE-BELIEF approach is proposed to support multi-criteria decision making based on incomplete information. This approach solves problems with an incomplete decision matrix and unknown weights within the PROMETHEE method. On the basis of belief function theory, our approach first determines the distributions of belief masses based on PROMETHEE's net flows and then calculates weights. Subsequently, it aggregates the distribution masses associated with each criterion using Murphy's modified combination rule in order to infer a global belief structure. The final action ranking is obtained via the pignistic probability transformation. A real-world case study concerning the location of a treatment center for waste from healthcare activities with infectious risk in the center of Tunisia is studied to illustrate the detailed process of the PROMETHEE-BELIEF approach.
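
As a concrete starting point, the sketch below computes PROMETHEE net flows for a small, invented decision matrix using the simple "usual" preference function. The belief-mass assignment, Murphy's combination rule and the pignistic transformation described above are not reproduced, and the weights are assumed known purely for illustration.

```python
# Sketch of the first step the approach builds on: PROMETHEE net flows for a
# small decision matrix with a "usual" (0/1) preference function.
import numpy as np

scores = np.array([[7.0, 3.0, 5.0],    # alternative A1 on criteria C1..C3 (placeholder data)
                   [6.0, 4.0, 8.0],    # A2
                   [8.0, 2.0, 6.0]])   # A3
weights = np.array([0.5, 0.2, 0.3])    # criterion weights (assumed known here)

n = scores.shape[0]
phi = np.zeros(n)
for a in range(n):
    for b in range(n):
        if a == b:
            continue
        # usual preference function: 1 if a beats b on a criterion, else 0
        pref_ab = weights @ (scores[a] > scores[b]).astype(float)
        pref_ba = weights @ (scores[b] > scores[a]).astype(float)
        phi[a] += (pref_ab - pref_ba) / (n - 1)   # net outranking flow

ranking = np.argsort(-phi)
print("net flows:", np.round(phi, 3), "ranking:", ranking + 1)
```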

Keywords: belief function theory, incomplete information, multiple criteria analysis, PROMETHEE method

Procedia PDF Downloads 146
15950 Carbon Sequestration Modeling in the Implementation of REDD+ Programmes in Nigeria

Authors: Oluwafemi Samuel Oyamakin

Abstract:

The forest in Nigeria is currently estimated to extend to around 9.6 million hectares, but it used to expand over central and southern Nigeria decades ago. The forest estate is shrinking due to long-term human exploitation for agricultural development, fuel wood demand, uncontrolled forest harvesting and urbanization, amongst other factors, compounded by population growth in rural areas. Nigeria has lost more than 50% of its forest cover since 1990, and currently less than 10% of the country is forested. The current deforestation rate is estimated at 3.7%, which is one of the highest in the world. Reducing Emissions from Deforestation and forest Degradation, plus conservation, sustainable management of forests and enhancement of forest carbon stocks, constitutes what is referred to as REDD+. This study evaluated some of the existing ways of computing carbon stocks using eight indigenous tree species such as Mansonia, Shorea, Bombax, Terminalia superba, Khaya grandifolia, Khaya senegalenses, Pines and Gmelina arborea. While these components are the essential elements of the REDD+ programme, they can be brought under a broader framework of systems analysis designed to arrive at optimal solutions for future predictions through the statistical distribution pattern of carbon sequestered by various tree species. Available data on the height and diameter of trees in Ibadan were studied, their respective carbon sequestration potentials were assessed, and the data were subjected to tests to determine the best statistical distribution that would describe the carbon sequestration pattern of the trees. The result of this study suggests a reasonable statistical distribution for carbon sequestered in simulation studies and hence allows planners and government to determine resource forecasts for sustainable development, especially where experiments with real-life systems are infeasible. Sustainable management of forests can then be achieved by projecting the future condition of forests under different management regimes, thereby supporting conservation and REDD+ programmes in Nigeria.
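
A hedged sketch of the distribution-selection step is shown below: several candidate distributions are fitted to per-tree sequestered-carbon values and compared with a Kolmogorov-Smirnov statistic. The carbon values are synthetic placeholders rather than the Ibadan measurements, and the candidate set is only an example.

```python
# Fit candidate distributions to per-tree carbon values and compare by KS test.
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
carbon_kg = rng.lognormal(mean=4.0, sigma=0.6, size=200)  # placeholder carbon per tree (kg)

candidates = {
    "lognormal": stats.lognorm,
    "gamma": stats.gamma,
    "weibull": stats.weibull_min,
}

for name, dist in candidates.items():
    params = dist.fit(carbon_kg)                               # maximum-likelihood fit
    ks_stat, p_value = stats.kstest(carbon_kg, dist.cdf, args=params)
    print(f"{name:10s}  KS = {ks_stat:.4f}  p = {p_value:.3f}")
# The distribution with the smallest KS statistic (largest p-value) would be
# taken as the best description of the sequestration pattern.
```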

Keywords: REDD+, carbon, climate change, height and diameter

Procedia PDF Downloads 146
15949 An Improved Dynamic Window Approach with Environment Awareness for Local Obstacle Avoidance of Mobile Robots

Authors: Baoshan Wei, Shuai Han, Xing Zhang

Abstract:

Local obstacle avoidance is critical for mobile robot navigation. It is a challenging task to ensure path optimality and safety in cluttered environments. We propose an Environment-Aware Dynamic Window Approach in this paper to cope with this issue. The method integrates environment characterization into the Dynamic Window Approach (DWA). Two strategies are proposed in order to achieve this integration. The local goal strategy guides the robot to move through openings before approaching the final goal, which solves the local minima problem in DWA. The adaptive control strategy enables the robot to adjust its state according to the environment, which improves path safety compared with DWA. The evaluation shows that the path generated by the proposed algorithm is safer and smoother compared with state-of-the-art algorithms.
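
For readers unfamiliar with the underlying DWA step, here is a compact, self-contained sketch of velocity sampling, trajectory rollout and scoring. The robot limits, weights, goal and obstacle list are invented, and the paper's local-goal and adaptive-control strategies are not reproduced.

```python
# Compact sketch of the core Dynamic Window Approach step: sample (v, w)
# pairs inside the dynamic window, roll out short trajectories and score them.
import numpy as np

def dwa_step(state, goal, obstacles, dt=0.1, horizon=2.0):
    x, y, theta, v, w = state
    # dynamic window from assumed velocity/acceleration limits
    v_samples = np.linspace(max(0.0, v - 0.2), min(1.0, v + 0.2), 11)
    w_samples = np.linspace(max(-1.0, w - 0.4), min(1.0, w + 0.4), 21)

    best, best_score = (0.0, 0.0), -np.inf
    for vs in v_samples:
        for ws in w_samples:
            # forward-simulate a short trajectory
            px, py, pth = x, y, theta
            traj = []
            for _ in range(int(horizon / dt)):
                pth += ws * dt
                px += vs * np.cos(pth) * dt
                py += vs * np.sin(pth) * dt
                traj.append((px, py))
            traj = np.array(traj)
            clearance = min(np.hypot(traj[:, 0] - ox, traj[:, 1] - oy).min()
                            for ox, oy in obstacles)
            if clearance < 0.2:          # discard unsafe trajectories
                continue
            heading = -np.hypot(goal[0] - px, goal[1] - py)   # closer to goal is better
            score = 1.0 * heading + 0.5 * clearance + 0.2 * vs
            if score > best_score:
                best_score, best = score, (vs, ws)
    return best

cmd = dwa_step(state=(0, 0, 0, 0.5, 0.0), goal=(4, 2), obstacles=[(2.0, 1.0), (3.0, 0.5)])
print("chosen (v, w):", cmd)
```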

Keywords: adaptive control, dynamic window approach, environment aware, local obstacle avoidance, mobile robots

Procedia PDF Downloads 135
15948 Frequent Pattern Mining for Digenic Human Traits

Authors: Atsuko Okazaki, Jurg Ott

Abstract:

Some genetic diseases (‘digenic traits’) are due to the interaction between two DNA variants. For example, certain forms of Retinitis Pigmentosa (a genetic form of blindness) occur in the presence of two mutant variants, one in the ROM1 gene and one in the RDS gene, while the occurrence of only one of these mutant variants leads to a completely normal phenotype. Detecting such digenic traits by genetic methods is difficult. A common approach to finding disease-causing variants is to compare 100,000s of variants between individuals with a trait (cases) and those without the trait (controls). Such genome-wide association studies (GWASs) have been very successful but hinge on genetic effects of single variants, that is, there should be a difference in allele or genotype frequencies between cases and controls at a disease-causing variant. Frequent pattern mining (FPM) methods offer an avenue for detecting digenic traits even in the absence of single-variant effects. The idea is to enumerate pairs of genotypes (genotype patterns), with each of the two genotypes originating from different variants that may be located at very different genomic positions. What is needed is for genotype patterns to be significantly more common in cases than in controls. Let Y = 2 refer to cases and Y = 1 to controls, with X denoting a specific genotype pattern. We are seeking association rules, ‘X → Y’, with high confidence, P(Y = 2|X), significantly higher than the proportion of cases, P(Y = 2), in the study. Clearly, generally available FPM methods are very suitable for detecting disease-associated genotype patterns. We used fpgrowth as the basic FPM algorithm and built a framework around it to enumerate high-frequency digenic genotype patterns and to evaluate their statistical significance by permutation analysis. Application to a published dataset on opioid dependence furnished results that could not be found with classical GWAS methodology. There were 143 cases and 153 healthy controls, each genotyped for 82 variants in eight genes of the opioid system. The aim was to find out whether any of these variants were disease-associated. The single-variant analysis did not lead to significant results. Application of our FPM implementation resulted in one significant (p < 0.01) genotype pattern, with both genotypes in the pattern being heterozygous and originating from two variants on different chromosomes. This pattern occurred in 14 cases and none of the controls. Thus, the pattern seems quite specific to this form of substance abuse and is also rather predictive of disease. An algorithm called Multifactor Dimension Reduction (MDR) was developed some 20 years ago and has been in use in human genetics ever since. MDR and our algorithm share some similar properties, but they are also very different in other respects. The main difference seems to be that our algorithm focuses on patterns of genotypes, while the main object of inference in MDR is the 3 × 3 table of genotypes at two variants.
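
The sketch below illustrates the core idea on synthetic data: enumerate two-variant genotype patterns, score each by its confidence P(case | pattern), and assess the best pattern with a permutation test. A brute-force pair enumeration is used instead of the fpgrowth-based framework, and the support threshold, sample sizes and permutation count are arbitrary.

```python
# Simplified stand-in for the fpgrowth-based framework: enumerate two-variant
# genotype patterns, keep the highest confidence P(case | pattern), and assess
# it by permutation. Data are synthetic.
import numpy as np
from itertools import combinations

rng = np.random.default_rng(2)
n_subjects, n_variants = 296, 82
genotypes = rng.integers(0, 3, size=(n_subjects, n_variants))   # 0/1/2 genotype codes
case = rng.integers(0, 2, size=n_subjects).astype(bool)         # True = case, False = control

def best_confidence(y):
    """Highest confidence over all two-variant genotype patterns with enough support."""
    best = 0.0
    for i, j in combinations(range(n_variants), 2):
        for gi in range(3):
            for gj in range(3):
                match = (genotypes[:, i] == gi) & (genotypes[:, j] == gj)
                if match.sum() >= 10:                        # minimum support
                    best = max(best, y[match].mean())        # P(case | pattern)
    return best

observed = best_confidence(case)
perms = [best_confidence(rng.permutation(case)) for _ in range(20)]  # use >= 1000 in practice
p_value = np.mean([p >= observed for p in perms])
print(f"best confidence = {observed:.2f}, permutation p ~ {p_value:.2f}")
```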

Keywords: digenic traits, DNA variants, epistasis, statistical genetics

Procedia PDF Downloads 105
15947 Energetic and Exergetic Evaluation of Box-Type Solar Cookers Using Different Insulation Materials

Authors: A. K. Areamu, J. C. Igbeka

Abstract:

The performance of box-type solar cookers has been reported by several researchers, but little attention has been paid to the effect of the type of insulation material on the energy and exergy efficiency of these cookers. This research aimed at evaluating the energy and exergy efficiencies of box-type cookers containing different insulation materials. Energy and exergy efficiencies of five box-type solar cookers insulated with maize cob, air (control), maize husk, coconut coir and polyurethane foam, respectively, were obtained over a period of three years. The cookers were evaluated using water heating test procedures to determine the energy and exergy performance. The results were subjected to statistical analysis using ANOVA. The results show that the average energy inputs for the five solar cookers were 245.5, 252.2, 248.7, 241.5 and 245.5 J, respectively, while their respective average energy losses were 201.2, 212.7, 208.4, 189.1 and 199.8 J. The average exergy inputs for the five cookers were 228.2, 234.4, 231.1, 224.4 and 228.2 J, respectively, while their respective average exergy losses were 223.4, 230.6, 226.9, 218.9 and 223.0 J. The energy and exergy efficiencies were highest in the cooker with coconut coir (37.35 and 3.90%, respectively) in the first year but were lowest for air (11 and 1.07%, respectively) in the third year. Statistical analysis showed a significant difference between the energy and exergy efficiencies over the years. These results reiterate the importance of a good insulating material for a box-type solar cooker.

Keywords: efficiency, energy, exergy, heating insolation

Procedia PDF Downloads 353
15946 Applying Sliding Autonomy for a Human-Robot Team on USARSim

Authors: Fang Tang, Jacob Longazo

Abstract:

This paper describes a sliding autonomy approach for coordinating a team of robots to assist the human operator in accomplishing tasks while adapting to new or unexpected situations by requesting help from the human operator. While sliding autonomy has been well studied in the context of controlling a single robot, much work needs to be done to apply it to a multi-robot team, especially a human-robot team. Our approach aims at a hierarchical sliding control structure, with components that support human-robot collaboration. We validated our approach in the USARSim simulation and demonstrated that the human-robot team's overall performance can be improved under sliding autonomy control.

Keywords: sliding autonomy, multi-robot team, human-robot collaboration, USARSim

Procedia PDF Downloads 527
15945 Estimation of Fouling in a Cross-Flow Heat Exchanger Using Artificial Neural Network Approach

Authors: Rania Jradi, Christophe Marvillet, Mohamed Razak Jeday

Abstract:

One of the most frequently encountered problems in industrial heat exchangers is fouling, which degrades the thermal and hydraulic performance of this type of equipment, leading to failure if undetected. It occurs due to the accumulation of undesired material on the heat transfer surface. So, it is necessary to understand the heat exchanger's fouling dynamics in order to plan mitigation strategies, ensuring sustainable and safe operation. This paper proposes an Artificial Neural Network (ANN) approach to estimate the fouling resistance in a cross-flow heat exchanger from operating data collected on a phosphoric acid concentration loop. A set of 361 operating data points was used to validate the proposed model. The ANN attains AARD = 0.048%, MSE = 1.811×10⁻¹¹, RMSE = 4.256×10⁻⁶ and r² = 99.5%, which confirms that it is a credible and valuable approach for industrialists and technologists who are faced with the drawbacks of fouling in heat exchangers.
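
Below is a hedged sketch of such an estimator, assuming a scikit-learn environment: a small multilayer perceptron maps loop operating variables to a fouling resistance and is scored on a held-out split. The input variables and data are synthetic placeholders for the 361 records mentioned above, not the actual loop measurements.

```python
# Hedged sketch of an ANN fouling-resistance estimator on placeholder data.
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import StandardScaler
from sklearn.metrics import mean_squared_error, r2_score

rng = np.random.default_rng(3)
X = rng.normal(size=(361, 4))      # e.g. flow rate, inlet/outlet temperatures, acid density (assumed inputs)
y = 0.5 * X[:, 0] - 0.3 * X[:, 1] + 0.1 * X[:, 2] + rng.normal(0, 0.05, 361)  # synthetic target

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=0)
scaler = StandardScaler().fit(X_train)

ann = MLPRegressor(hidden_layer_sizes=(10, 10), max_iter=5000, random_state=0)
ann.fit(scaler.transform(X_train), y_train)

pred = ann.predict(scaler.transform(X_test))
rmse = mean_squared_error(y_test, pred) ** 0.5
print(f"RMSE = {rmse:.3e}, r2 = {r2_score(y_test, pred):.3f}")
```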

Keywords: cross-flow heat exchanger, fouling, estimation, phosphoric acid concentration loop, artificial neural network approach

Procedia PDF Downloads 184
15944 Geochemistry of Nutrients in the South Lagoon of Tunis, Northeast of Tunisia, Using Multivariable Methods

Authors: Abidi Myriam, Ben Amor Rim, Gueddari Moncef

Abstract:

Understanding ecosystem response to a restoration project is essential to assess its rehabilitation. Indeed, the time elapsed after restoration is a critical indicator that shows the real extent of the restoration's success. In this regard, the south lagoon of Tunis, a shallow Mediterranean coastal area, has experienced several episodes of pollution. To resolve this environmental problem, a large restoration project of the lagoon was undertaken. In these restoration works, the main changes were a decrease in the residence time of the lagoon water and in nutrient concentrations. In this paper, we attempt to evaluate the trophic state of the lagoon water and the risk of eutrophication after almost 16 years of restoration. To meet this objective, water quality monitoring was undertaken. In order to identify and analyze the natural and anthropogenic factors governing the nutrient concentrations of the lagoon water, geochemical methods and multivariate statistical tools were used. Results show that nutrients have dual sources due to the discharge of municipal wastewater from Megrine City on the south side of the lagoon. The Carlson index shows that the South Lagoon of Tunis is eutrophic and may show limited summer anoxia.

Keywords: geochemistry, nutrients, statistical analysis, the south lagoon of Tunis, trophic state

Procedia PDF Downloads 173
15943 Building an Absurdist Approach to the Philosophy of Science: Combining Camus and Feyerabend

Authors: Robert Herold

Abstract:

This project aims to begin building out a new approach within the philosophy of science that is based on a combination of insights from Albert Camus and Paul Feyerabend. This approach will be labeled an absurdist approach, as it uses, for its foundation, the philosophy of the absurd as discussed by Camus. While Camus didn't directly discuss the philosophy of science, nor did he offer his own views on the subject in any substantial way, that doesn't mean his work doesn't have applications within the philosophy of science. In fact, as is argued throughout the piece, much of the work done by Paul Feyerabend stems from a metaphysical and epistemological foundation similar to Camus's. This foundation is the notion of the absurd and the inability of humans to reach some sort of objective truth. In modern times, both Camus and Feyerabend have been largely pushed to the wayside, though Feyerabend has undoubtedly received the more unfair treatment of the two, and this serves more as a hindrance than anything else. Many of the claims and arguments made by both Camus and Feyerabend have not been truly refuted; they have simply been pushed aside by pointing to supposed contradictions or inconsistencies. However, while it would be a monumental task to attempt to discuss all of this past work, perhaps it is better to move beyond both Camus and Feyerabend and chart a new path. This is the overall goal of this paper. This research will demonstrate that not only are the philosophies of Camus and Feyerabend surprisingly similar and able to mesh well together, but they are also able to form something that is truly more than the sum of its parts. While the task of actually building out an approach is a monumental undertaking, the plan is to use this project as a jumping-off point. As such, this paper will start by examining some of the main claims made by both Camus and Feyerabend. Once this is done, it will begin weaving them together and demonstrating where the links between the two philosophies lie. The study will end by building out the very beginning foundations of the absurdist approach to the philosophy of science.

Keywords: philosophy, philosophy of science, albert camus, paul feyerabend

Procedia PDF Downloads 227
15942 Cutting Tools in Finishing Operations for CNC Rapid Manufacturing Processes: Experimental Studies

Authors: M. N. Osman Zahid, K. Case, D. Watts

Abstract:

This paper reports an advanced approach in the application of CNC machining to rapid manufacturing processes (CNC-RM). The aim of this study is to improve the quality of machined parts by introducing different cutting tools during finishing operations. As the cutting is performed in different directions, the surfaces present on a part can be classified into several categories. Therefore, suitable cutting tools are assigned to machine particular surfaces and to improve the quality. Experimental studies have been carried out by fabricating several parts based on the suggested approach. The results provide further support for implementing this approach in rapid machining processes.

Keywords: CNC machining, end mill tool, finishing operation, rapid manufacturing

Procedia PDF Downloads 329
15941 A Machine Learning Approach for Performance Prediction Based on User Behavioral Factors in E-Learning Environments

Authors: Naduni Ranasinghe

Abstract:

E-learning environments have become more popular than ever due to the impact of COVID-19. Even though e-learning is one of the best solutions for the teaching-learning process, it is not without major challenges. Nowadays, machine learning approaches are utilized to analyze how behavioral factors lead to better adoption and how they relate to better student performance in e-learning environments. During the pandemic, we observed that the academic process in the e-learning approach had a major issue, especially regarding the performance of the students. Therefore, an approach that investigates student behaviors in e-learning environments using a data-intensive machine learning approach is valuable. A hybrid approach was used to understand how each of the aforementioned variables relates to the others. A more quantitative approach, informed by the literature, was used to understand the weight of each factor for adoption and for performance. The data set was collected from previous research to support the training and testing process in machine learning. Special attention was paid to incorporating different dimensionalities of the data to understand the dependency levels of each. Five of the twelve independent variables were chosen based on their impact on the dependent variable and on the descriptive statistics. Of the three models developed (Random Forest classifier, SVM, and Decision Tree classifier), the Random Forest classifier gave the highest accuracy (0.8542). Overall, this work met its goals of improving student performance by identifying students who are at risk of dropping out, emphasizing the necessity of using both static and dynamic data.
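
A sketch of this modelling step, assuming scikit-learn, is shown below: five behavioral features are selected and three classifiers are compared on a held-out split. The feature names and data are synthetic placeholders for the collected e-learning logs, and ANOVA-based selection is used here simply as one reasonable choice.

```python
# Select a subset of behavioral features and compare three classifiers.
import numpy as np
import pandas as pd
from sklearn.model_selection import train_test_split
from sklearn.feature_selection import SelectKBest, f_classif
from sklearn.ensemble import RandomForestClassifier
from sklearn.svm import SVC
from sklearn.tree import DecisionTreeClassifier
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(4)
columns = [f"behavior_{i}" for i in range(12)]            # twelve behavioral factors (placeholder names)
X = pd.DataFrame(rng.normal(size=(500, 12)), columns=columns)
y = (X["behavior_0"] + 0.5 * X["behavior_3"] + rng.normal(0, 1, 500) > 0).astype(int)  # 1 = at risk

selector = SelectKBest(f_classif, k=5).fit(X, y)          # keep five most informative variables
X_sel = selector.transform(X)
X_train, X_test, y_train, y_test = train_test_split(X_sel, y, test_size=0.3, random_state=0)

models = {
    "random forest": RandomForestClassifier(n_estimators=200, random_state=0),
    "SVM": SVC(),
    "decision tree": DecisionTreeClassifier(random_state=0),
}
for name, model in models.items():
    model.fit(X_train, y_train)
    print(f"{name:15s} accuracy = {accuracy_score(y_test, model.predict(X_test)):.4f}")
```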

Keywords: academic performance prediction, e learning, learning analytics, machine learning, predictive model

Procedia PDF Downloads 133
15940 On Phase Based Stereo Matching and Its Related Issues

Authors: András Rövid, Takeshi Hashimoto

Abstract:

The paper focuses on the problem of point correspondence matching in stereo images. The proposed matching algorithm is based on a combination of simpler methods, such as the normalized sum of squared differences (NSSD), and a more complex phase correlation based approach, while also considering noise and other factors. The speed of NSSD and the preciseness of the phase correlation together yield an efficient approach to find the best candidate point with sub-pixel accuracy in stereo image pairs. The task of the NSSD in this case is to locate the candidate pixel roughly. Afterwards, the location of the candidate is refined by an enhanced phase correlation based method which, in contrast to the NSSD, has to run only once for each selected pixel.
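
The following minimal sketch illustrates the two stages on synthetic patches: a coarse NSSD search along the epipolar line, followed by FFT-based phase correlation between the reference patch and the best candidate. Only integer-peak localization is shown; a true sub-pixel estimate would interpolate around the correlation peak, and the paper's enhanced phase correlation is not reproduced.

```python
# Coarse NSSD search plus phase-correlation refinement on synthetic patches.
import numpy as np

def nssd(a, b):
    a = (a - a.mean()) / (a.std() + 1e-9)
    b = (b - b.mean()) / (b.std() + 1e-9)
    return np.sum((a - b) ** 2)

def phase_correlation(a, b):
    F = np.fft.fft2(a) * np.conj(np.fft.fft2(b))
    corr = np.fft.ifft2(F / (np.abs(F) + 1e-9)).real
    peak = np.unravel_index(np.argmax(corr), corr.shape)
    # wrap indices to signed shifts
    return [p - s if p > s // 2 else p for p, s in zip(peak, corr.shape)]

rng = np.random.default_rng(5)
left = rng.normal(size=(200, 200))
right = np.roll(left, shift=3, axis=1)              # synthetic horizontal disparity of 3 px

ref = left[80:112, 80:112]
# coarse NSSD search along the epipolar line
costs = [nssd(ref, right[80:112, 80 + d:112 + d]) for d in range(-8, 9)]
coarse = int(np.argmin(costs)) - 8
# refinement by phase correlation around the coarse candidate
dy, dx = phase_correlation(ref, right[80:112, 80 + coarse:112 + coarse])
print("coarse disparity:", coarse, "residual from phase correlation:", (dy, dx))
```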

Keywords: stereo matching, sub-pixel accuracy, phase correlation, SVD, NSSD

Procedia PDF Downloads 449
15939 The Effect of Non-Surgical Periodontal Therapy on Metabolic Control in Children

Authors: Areej Al-Khabbaz, Swapna Goerge, Majedah Abdul-Rasoul

Abstract:

Introduction: The most prevalent periodontal disease among children is gingivitis, and it usually becomes more severe in adolescence. A number of intervention studies suggest that resolution of periodontal inflammation can improve metabolic control in patients diagnosed with diabetes mellitus. Aim: to assess the effect of non-surgical periodontal therapy on the glycemic control of children diagnosed with diabetes mellitus. Method: Twenty-eight children with an established diagnosis of diabetes mellitus for at least one year were recruited. Informed consent and child assent forms were obtained from children and parents prior to enrolment. The dental examination of the participants was performed in the same week, directly following their annual medical assessment. All patients had their glycosylated hemoglobin (HbA1c%) test one week prior to their annual medical and dental visit and 3 months following non-surgical periodontal therapy. All patients received a comprehensive periodontal examination. The periodontal assessment included clinical attachment loss, bleeding on probing, plaque score, plaque index and gingival index. All patients were referred for non-surgical periodontal therapy, which included oral hygiene instruction and motivation followed by supra-gingival and sub-gingival scaling using ultrasonic and hand instruments. Statistical Analysis: Data were entered and analyzed using the Statistical Package for Social Science software (SPSS, Chicago, USA), version 18. Statistical analysis of clinical findings was performed to detect differences between the two groups in terms of periodontal findings and HbA1c%. Binary logistic regression analysis was performed in order to examine which factors were significant in multivariate analysis after adjusting for confounding between effects. The regression model used the dependent variable ‘Improved glycemic control’, and the independent variables entered into the model were plaque index, gingival index, bleeding %, and plaque score. Statistical significance was set at p < 0.05. Result: A total of 28 children participated. The mean age of the participants was 13.3±1.92 years. The study participants were divided into two groups: a compliant group (received dental scaling) and a non-compliant group (received oral hygiene instructions only). No statistical difference was found between the compliant and non-compliant groups in age, gender distribution, oral hygiene practice or the level of diabetes control. There was a significant difference between the compliant and non-compliant groups in terms of improvement of HbA1c before and after periodontal therapy. Mean gingival index was the only significant variable associated with an improved glycemic control level. In conclusion, this study has demonstrated that non-surgical mechanical periodontal therapy can improve HbA1c% control. The results of this study confirm that children with diabetes mellitus who are compliant with dental care and have routine professional scaling may have better metabolic control compared to diabetic children who are erratic with dental care.

Keywords: children, diabetes, metabolic control, periodontal therapy

Procedia PDF Downloads 142
15938 Supplier Selection by Considering Cost and Reliability

Authors: K. -H. Yang

Abstract:

The supplier selection problem is one of the important issues in supply chain problems. Two categories of methodologies, qualitative and quantitative approaches, can be applied to supplier selection problems. However, due to the complexity of the problem and the lack of reliable quantitative data, qualitative approaches are used more often than quantitative approaches. This study considers operational cost and the supplier's reliability factor and solves the problem using a quantitative approach. A mixed integer programming model is the primary analytic tool. Analyses of different scenarios with variable cost and reliability structures show the effectiveness of this approach for the supplier selection problem.
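
An illustrative mixed integer programming sketch is given below, using the PuLP modelling library: total cost is minimized subject to a demand constraint, supplier capacities and a minimum demand-weighted reliability level. The cost, capacity and reliability numbers are invented, and the constraint structure is only one plausible reading of "cost and reliability", not the study's actual model.

```python
# Illustrative MIP for supplier selection with cost and reliability.
from pulp import LpProblem, LpMinimize, LpVariable, lpSum, LpBinary, value

suppliers = ["S1", "S2", "S3"]
cost = {"S1": 12.0, "S2": 10.5, "S3": 11.2}        # unit operational cost (made-up)
capacity = {"S1": 400, "S2": 300, "S3": 500}
reliability = {"S1": 0.97, "S2": 0.90, "S3": 0.93}
fixed = {"S1": 800, "S2": 500, "S3": 650}          # fixed cost of engaging a supplier
demand, min_reliability = 600, 0.92

prob = LpProblem("supplier_selection", LpMinimize)
qty = LpVariable.dicts("qty", suppliers, lowBound=0)
use = LpVariable.dicts("use", suppliers, cat=LpBinary)

prob += lpSum(cost[s] * qty[s] + fixed[s] * use[s] for s in suppliers)
prob += lpSum(qty[s] for s in suppliers) == demand
for s in suppliers:
    prob += qty[s] <= capacity[s] * use[s]
# demand-weighted average reliability must reach the required level
prob += lpSum(reliability[s] * qty[s] for s in suppliers) >= min_reliability * demand

prob.solve()
for s in suppliers:
    print(s, "order quantity =", qty[s].value())
print("total cost =", value(prob.objective))
```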

Keywords: mixed integer programming, quantitative approach, supplier’s reliability, supplier selection

Procedia PDF Downloads 364
15937 Electroencephalography Correlates of Memorability While Viewing Advertising Content

Authors: Victor N. Anisimov, Igor E. Serov, Ksenia M. Kolkova, Natalia V. Galkina

Abstract:

The problem of the memorability of advertising content is closely connected with the key issues of neuromarketing. The memorability of advertising content contributes to the marketing effectiveness of the promoted product. Significant directions in studying the phenomenon of memorability are the memorability of the brand (detected through the memorability of the logo) and the memorability of the product offer (detected through the memorization of dynamic audiovisual advertising content - commercials). The aim of this work is to reveal the predictors of memorization of static and dynamic audiovisual stimuli (logos and commercials). An important direction of the research was revealing differences in the psychophysiological correlates of memorability between static and dynamic audiovisual stimuli. We assumed that static and dynamic images are perceived in different ways and may differ in the memorization process. Objective methods of recording psychophysiological parameters while watching static and dynamic audiovisual materials are well suited to achieving this aim. The electroencephalography (EEG) method was used with the aim of identifying correlates of the memorability of various stimuli in the electrical activity of the cerebral cortex. All stimuli (in the static and dynamic groups separately) were divided into two groups – remembered and not remembered – based on the results of the questionnaire method. The questionnaires were filled out by participants not immediately after viewing the stimuli, but after a time interval (to detect stimuli retained through long-term memorization). Using statistical methods, we developed a classifier (statistical model) that predicts which group (remembered or not remembered) a stimulus falls into, based on its psychophysiological perception. The result of the statistical model was compared with the results of the questionnaire. Conclusions: Predictors of the memorability of static and dynamic stimuli have been identified, which allows prediction of which stimuli will have a higher probability of being remembered. A further development of this study will be the creation of a stimulus memory model with the possibility of recognizing a stimulus as previously seen or new. Thus, in the process of remembering the stimulus, it is planned to take into account the stimulus recognition factor, which is one of the most important tasks for neuromarketing.

Keywords: memory, commercials, neuromarketing, EEG, branding

Procedia PDF Downloads 236
15936 Determination and Qsar Modelling of Partitioning Coefficients for Some Xenobiotics in Soils and Sediments

Authors: Alaa El-Din Rezk

Abstract:

For organic xenobiotics, sorption to Aldrich humic acid is a key process controlling their mobility, bioavailability, toxicity and fate in the soil. Hydrophobic organic compounds possessing either acidic or basic groups can be partially ionized (deprotonated or protonated) within the range of natural soil pH. For neutral and ionogenic xenobiotics (neutrals, acids and bases), sorption coefficients normalized to organic carbon content, Koc, have been measured at different pH values. To this end, the batch equilibrium technique has been used, employing SPME combined with GC-MSD as an analytical tool. For most ionogenic compounds, sorption is affected by both pH and pKa and can be explained through the Henderson-Hasselbalch equation. The results demonstrate that when assessing the environmental fate of ionogenic compounds, their pKa and speciation under natural conditions should be taken into account. A new model has been developed to predict the relationship between log Koc and pH, with full statistical evaluation against other existing predictive models. Neutral solutes displayed a good fit with the classical model using log Kow as the log Koc predictor, whereas acidic and basic compounds displayed a good fit with the LSER approach and the newly proposed model. Measurement limitations of the batch technique and SPME-GC-MSD were found for ionic compounds.
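
To make the speciation argument concrete, the sketch below computes the neutral fraction from the Henderson-Hasselbalch relation and a pH-dependent Koc by mixing the Koc values of the neutral and ionic species. This mixing rule is a common approximation, not the new model proposed in the paper, and the example pKa and Koc values are invented.

```python
# Speciation from Henderson-Hasselbalch and a simple pH-dependent Koc mix.
import numpy as np

def neutral_fraction(pH, pKa, kind="acid"):
    """Fraction of the molecule in neutral form at a given pH."""
    if kind == "acid":
        return 1.0 / (1.0 + 10.0 ** (pH - pKa))
    return 1.0 / (1.0 + 10.0 ** (pKa - pH))        # base

def log_koc(pH, pKa, koc_neutral, koc_ion, kind="acid"):
    f_n = neutral_fraction(pH, pKa, kind)
    # mix the sorption of the neutral and ionic species (common approximation)
    return np.log10(f_n * koc_neutral + (1.0 - f_n) * koc_ion)

for pH in (4.0, 5.5, 7.0, 8.5):
    print(f"pH {pH}: log Koc = {log_koc(pH, pKa=4.8, koc_neutral=900.0, koc_ion=40.0):.2f}")
```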

Keywords: humic acid, log Koc, pH, pKa, SPME-GCMSD

Procedia PDF Downloads 249
15935 Pavement Roughness Prediction Systems: A Bump Integrator Approach

Authors: Manish Pal, Rumi Sutradhar

Abstract:

Pavement surface unevenness plays a pivotal role in the roughness index of a road, which affects riding comfort. Ride comfort refers to the degree of protection offered to vehicle occupants from uneven elements in the road surface. So, a lower roughness index value is preferable for a better riding quality for road users. Roughness is generally defined as an expression of irregularities in the pavement surface, which can be measured using different equipment such as MERLIN, the Bump Integrator, profilometers, etc. Among them, the Bump Integrator is quite simple and less time-consuming for long road sections. A case study was conducted on low-volume roads in the West District of Tripura to determine the roughness index (RI) using a Bump Integrator at the standard speed of 32 km/h. However, it is difficult to maintain the requisite standard speed throughout the road section; the speed of the Bump Integrator (BI) has to be lowered or raised in some situations. So, it becomes necessary to convert the roughness index values obtained at other speeds to the standard speed of 32 km/h. This paper focuses on that roughness index conversion model. Using SPSS (Statistical Package for the Social Sciences) software, a generalized equation is derived relating the RI value at the standard speed of 32 km/h to the RI values at other speed conditions.
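
The sketch below shows the shape such a conversion can take, assuming scikit-learn: a regression is calibrated between readings taken at arbitrary survey speeds and the value at the standard 32 km/h, and then used to convert a new reading. The calibration data are synthetic and the linear form is only an assumption; the published model was fitted in SPSS from field measurements.

```python
# Calibrate a speed-correction regression for roughness index readings.
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(6)
speed = rng.uniform(20, 45, 120)                       # actual survey speed (km/h)
ri_at_speed = rng.uniform(2000, 6000, 120)             # RI measured at that speed (mm/km)
# synthetic "true" RI at 32 km/h: readings drift slightly with survey speed
ri_32 = ri_at_speed * (1 + 0.004 * (speed - 32)) + rng.normal(0, 50, 120)

X = np.column_stack([ri_at_speed, speed])
model = LinearRegression().fit(X, ri_32)
print("coefficients:", model.coef_, "intercept:", model.intercept_)

# convert a new reading taken at 38 km/h to the 32 km/h standard
print("converted RI:", model.predict([[4200.0, 38.0]])[0])
```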

Keywords: bump integrator, pavement distresses, roughness index, SPSS

Procedia PDF Downloads 230
15934 Cat Stool as an Additive Aggregate to Garden Bricks

Authors: Mary Joy B. Amoguis, Alonah Jane D. Labtic, Hyna Wary Namoca, Aira Jane V. Original

Abstract:

Animal waste has been increasing rapidly due to the growing animal population and the lack of innovative waste management practices. In a country like the Philippines, animal waste is rampant. This study aims to minimize animal waste by producing garden bricks using cat stool as an additive. The study analyzes different levels of concentration to determine the most efficient combination in terms of the compressive strength and durability of cat stool as an additive to garden bricks. The researchers first collect and incinerate the cat stool and then prepare the different concentrations. The first concentration is 25% cat stool and 75% cement. The second concentration is 50% cat stool and 50% cement. The third concentration is 75% cat stool and 25% cement. The statistical data were analyzed using one-way ANOVA, and the analysis revealed a significant difference compared to the control. The findings show an inversely proportional relationship: the higher the concentration of the cat stool additive, the lower the compressive strength of the bricks, and vice versa.
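
A minimal sketch of the one-way ANOVA comparison is given below, using SciPy; the compressive-strength values (in MPa) are invented for illustration and are not the study's measurements.

```python
# One-way ANOVA across the control and the three cat-stool concentrations.
from scipy import stats

control = [4.1, 4.3, 4.0, 4.2, 4.4]        # 0% cat stool (control)
mix_25 = [3.6, 3.8, 3.5, 3.7, 3.6]         # 25% cat stool / 75% cement
mix_50 = [2.9, 3.0, 2.8, 3.1, 2.9]         # 50% / 50%
mix_75 = [2.1, 2.2, 2.0, 2.3, 2.1]         # 75% / 25%

f_stat, p_value = stats.f_oneway(control, mix_25, mix_50, mix_75)
print(f"F = {f_stat:.2f}, p = {p_value:.4f}")
# A p-value below 0.05 indicates a significant difference between the groups,
# consistent with the inverse strength-concentration trend reported above.
```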

Keywords: cat stool, garden bricks, cement, concentrations, animal wastes, compressive strength, durability, one-way ANOVA, additive, incineration, aggregates, stray cats

Procedia PDF Downloads 45
15933 Introduction to Techno-Sectoral Innovation System Modeling and Functions Formulating

Authors: S. M. Azad, H. Ghodsi Pour, F. Roshannafasa

Abstract:

In recent years, ‘technology management and policymaking’ has been one of the most important problems in management science. In this field, different generations of innovation and technology management have been presented, the earliest of which is the Innovation System (IS) approach. In a general classification, innovation systems are divided into four approaches: technical, sectoral, regional, and national. There is much research on each of these approaches in different academic fields. Every approach has some benefits, and if two or more approaches are hybridized, their benefits are combined. In addition, according to the sectoral structure of the governance model in Iran, in many sectors such as information technology, the combination of the three other approaches with the sectoral approach is essential. Hence, in this paper, by combining two IS approaches (technical and sectoral) and using system dynamics, a generic model is presented for a sample of the software industry. As a complementary point, this article introduces a new hybrid approach called the Techno-Sectoral Innovation System. The TSIS model is accomplished by adapting the concept of ‘functions’ from the technological IS literature and using it within the sectoral system as measurable indicators.

Keywords: innovation system, technology, techno-sectoral system, functional indicators, system dynamics

Procedia PDF Downloads 421
15932 New Public Management: Step towards Democratization

Authors: Aneri Mehta, Krunal Mehta

Abstract:

Administration is largely based on two sciences: ‘management science’ and ‘political science’. The approach of new public management is more inclined towards management science. The era of ‘New Public Management’ has affected developing countries immensely. Public management reforms are needed to enhance the development of these countries. Such reform mainly includes capacity building, control of corruption, political decentralization, debureaucratization and public empowerment. This provides the opportunity to create self-sustaining change in governance. This paper examines the link between the new public management approach and its effect on building effective democratization in the country. This approach mainly focuses on the rationality and effectiveness of the governance system. It requires deep efforts in technological, organizational, social and cultural fields. Bringing citizen participation into governance is the main objective of NPM. The shift from traditional public management to new public management has had a low success rate of reforms. This research includes a case study of RTI, which is a big step by the government towards a citizen-centric approach to governance. The aspect of ‘publicness’ in democratic policy implementation is important for good governance in India.

Keywords: public management, development, public empowerment, governance

Procedia PDF Downloads 488
15931 Assessment and Forecasting of the Impact of Negative Environmental Factors on Public Health

Authors: Nurlan Smagulov, Aiman Konkabayeva, Akerke Sadykova, Arailym Serik

Abstract:

Introduction. Adverse environmental factors do not immediately lead to pathological changes in the body. They can promote the growth of pre-pathology, characterized by shifts in physiological, biochemical, immunological and other indicators of the body's state. These disorders are unstable, reversible and indicative of body reactions, and they make it possible to objectively judge the internal structure of adaptive body reactions at the level of individual organs and systems. In order to obtain a stable response of the body to the chronic effects of unfavorable environmental factors of low intensity (compared to production environment factors), a time called the «lag time» is needed. Results obtained without considering this factor distort reality and, for the most part, cannot reliably support the main conclusions of any work. A technique is needed that reduces methodological errors and combines mathematical logic, statistical methods and a medical point of view, which ultimately affects the obtained results and avoids false correlations. Objective. Development of a methodology for assessing and predicting the impact of environmental factors on population health, considering the «lag time». Methods. Research objects: environmental indicators and population morbidity indicators. The database on the environmental state was compiled from the monthly newsletters of Kazhydromet. Data on population morbidity were obtained from regional statistical yearbooks. When processing the statistical data, a time interval (lag) was determined for each «argument-function» pair, that is, the interval after which the effect of the harmful factor (argument) fully manifests itself in the indicators of the organism's state (function). The lag value was determined by the cross-correlation functions of the arguments (environmental indicators) with the functions (morbidity). Correlation coefficients (r) and their reliability (t), Fisher's criterion (F) and the influence share (R²) of the main factor (argument) per indicator (function) were calculated as percentages. Results. The ecological situation of an industrially developed region has an impact on health indicators, but with some nuances. Fundamentally opposite results were obtained when the «lag time» was considered in the mathematical data processing. Namely, a pronounced correlation was revealed after the two databases (ecology and morbidity) were shifted. For example, the lag period was 4 years for dust concentration and general morbidity, and 3 years for childhood morbidity. These periods accounted for the maximum values of the correlation coefficients and the largest percentage of the influencing factor. Similar results were observed for the concentrations of soot, dioxide, etc. Comprehensive statistical processing using multiple correlation-regression variance analysis confirms the correctness of the above statement. This method provided an integrated approach to predicting the degree of pollution of the main environmental components and to identifying the most dangerous combinations of concentrations of the leading negative environmental factors. Conclusion. The method of assessing the «environment-public health» system considering the «lag time» is qualitatively different from the traditional one (without considering the «lag time»). The results differ significantly and are more amenable to a logical explanation of the obtained dependencies. The method allows the quantitative and qualitative dependencies within the «environment-public health» system to be presented in a different way.
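
A minimal sketch of the lag-estimation step is shown below: an annual exposure series is correlated with a morbidity series at every candidate shift, and the lag that maximizes the correlation coefficient is kept. The yearly values are synthetic placeholders, not the Kazhydromet or yearbook data.

```python
# Estimate the "lag time" by maximizing the lagged correlation coefficient.
import numpy as np

rng = np.random.default_rng(7)
years = 20
dust = rng.normal(0.3, 0.05, years)                      # annual mean dust concentration
true_lag = 4
morbidity = np.empty(years)
morbidity[true_lag:] = 500 + 800 * dust[:-true_lag]      # effect appears 4 years later
morbidity[:true_lag] = 500 + 800 * dust[:true_lag]       # filler for the first years
morbidity += rng.normal(0, 5, years)

def lagged_r(x, y, lag):
    return np.corrcoef(x[:len(x) - lag], y[lag:])[0, 1]

correlations = {lag: lagged_r(dust, morbidity, lag) for lag in range(0, 8)}
best_lag = max(correlations, key=correlations.get)
print("r by lag:", {k: round(v, 2) for k, v in correlations.items()})
print("estimated lag time:", best_lag, "years")
```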

Keywords: ecology, morbidity, population, lag time

Procedia PDF Downloads 63
15930 Expert-Driving-Criteria Based on Fuzzy Logic Approach for Intelligent Driving Diagnosis

Authors: Andrés C. Cuervo Pinilla, Christian G. Quintero M., Chinthaka Premachandra

Abstract:

This paper considers the diagnosis of people's driving skills under real driving conditions. In that sense, this research presents an approach that uses GPS signals, which have a direct correlation with driving maneuvers. In addition, a novel expert-driving-criteria approximation using fuzzy logic is presented, which analyzes GPS signals in order to issue an intelligent driving diagnosis. Based on the above, the first section presents the intelligent driving diagnosis system approach in terms of its characteristic properties, explaining in detail significant considerations about how an expert-driving-criteria approximation must be developed. In the next section, the implementation of the developed system based on the proposed fuzzy logic approach is explained. Here, a proposed set of rules is presented, corresponding to a quantitative abstraction of some traffic laws and safe-driving techniques, seeking to approach an expert-driving-criteria approximation. Experimental testing has been performed in real driving conditions. The testing results show that the intelligent driving diagnosis system rates drivers' performance quantitatively with a high degree of reliability.
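
The toy sketch below shows how such fuzzy rules can be wired together: GPS-derived quantities (speed above the limit, longitudinal acceleration) are fuzzified with triangular membership functions and combined into a driving-quality score. The membership breakpoints, rules and score scale are illustrative inventions, not the paper's expert-driving criteria.

```python
# Toy fuzzy-logic scoring of GPS-derived driving behavior.
def tri(x, a, b, c):
    """Triangular membership function with support [a, c] and peak at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def driving_score(speed_over_limit_kmh, accel_ms2):
    speeding = tri(speed_over_limit_kmh, 0, 15, 30)      # "moderately speeding"
    heavy_speeding = tri(speed_over_limit_kmh, 15, 30, 60)
    harsh = tri(abs(accel_ms2), 2, 4, 8)                 # "harsh maneuver"

    # rule strengths (min = AND, max = OR), then a weighted defuzzification
    bad = max(heavy_speeding, min(speeding, harsh))
    ok = min(speeding, 1 - harsh)
    good = 1 - max(speeding, heavy_speeding, harsh)
    return (good * 90 + ok * 60 + bad * 20) / max(good + ok + bad, 1e-9)

print(driving_score(speed_over_limit_kmh=5, accel_ms2=1.0))    # calm driving -> high score
print(driving_score(speed_over_limit_kmh=25, accel_ms2=5.0))   # aggressive driving -> low score
```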

Keywords: driver support systems, intelligent transportation systems, fuzzy logic, real time data processing

Procedia PDF Downloads 493
15929 The Application of System Approach to Knowledge Management and Human Resource Management Evidence from Tehran Municipality

Authors: Vajhollah Ghorbanizadeh, Seyed Mohsen Asadi, Mirali Seyednaghavi, Davoud Hoseynpour

Abstract:

In the current era, all organizations need knowledge to be able to manage diverse human resources. Creative, dynamic and knowledge-based human resources are an important competitive advantage and the scarcest resource in today's knowledge-based economy. In addition, managers with knowledge management skills must be aware of human resource management science. It is now generally accepted that successful implementation of knowledge management requires dynamic interaction between knowledge management and human resource management; this is emphasized in the systemic approach to knowledge management as well. Human resource management can also be complementary to knowledge management, because human resource management aims at empowering human resources as the key resource of organizations in the 21st century and as the basis for using, creating, growing and developing other resources. Thus, knowledge is the major capital of every organization, and it is introduced through the process of knowledge management. In this context, knowledge management is a systematic approach to creating, receiving, organizing, accessing, and using knowledge and learning in the organization. This article aims to define and explain the concepts of knowledge management and human resource management and the importance of these processes and concepts. Literature related to knowledge management and human resource management, as well as related topics, was studied; a theoretical model was then designed and illustrated to explain the factors affecting the relationship between knowledge management and human resource management within a systemic approach to knowledge management.

Keywords: systemic approach, human resources, knowledge, human resources management, knowledge management

Procedia PDF Downloads 352
15928 Characteristics of Cumulative Distribution Function of Grown Crack Size at Specified Fatigue Crack Propagation Life under Different Maximum Fatigue Loads in AZ31

Authors: Seon Soon Choi

Abstract:

Magnesium alloys are widely used in structures such as automobiles. It is necessary to consider the probabilistic characteristics of a structural material because the fatigue behavior of a structure involves randomness and uncertainty. The purpose of this study is to find the characteristics of the cumulative distribution function (CDF) of the grown crack size at a specified fatigue crack propagation life and to investigate statistical crack propagation in magnesium alloys. The statistical fatigue data on grown crack size were obtained through fatigue crack propagation (FCP) tests under different maximum fatigue load conditions conducted on replicated specimens of magnesium alloys. The 3-parameter Weibull distribution is used to find the CDF of the grown crack size. In the case of a larger maximum fatigue load, the CDF of the grown crack size has longer tails below the 10th percentile and above the 90th percentile. Fatigue failure occurs more easily as the tails of the CDF of grown crack size become longer. The fatigue behavior under the larger maximum fatigue load condition shows more rapid propagation and failure.
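
A hedged sketch of the statistical treatment is given below, using SciPy: a 3-parameter Weibull distribution (shape, location, scale) is fitted to replicated grown-crack-size samples for two load levels, and the tail percentiles of the fitted CDFs are compared. The crack sizes are synthetic, not the AZ31 test data.

```python
# Fit a 3-parameter Weibull CDF to grown-crack-size samples for two load levels.
from scipy import stats

crack_low_load = 5.0 + stats.weibull_min.rvs(c=2.5, scale=0.8, size=30, random_state=8)
crack_high_load = 5.0 + stats.weibull_min.rvs(c=1.8, scale=1.4, size=30, random_state=9)

for label, data in [("lower max load", crack_low_load), ("higher max load", crack_high_load)]:
    shape, loc, scale = stats.weibull_min.fit(data)          # 3-parameter Weibull fit
    p10 = stats.weibull_min.ppf(0.10, shape, loc, scale)
    p90 = stats.weibull_min.ppf(0.90, shape, loc, scale)
    print(f"{label}: shape={shape:.2f}, loc={loc:.2f}, scale={scale:.2f}, "
          f"10th pct={p10:.2f} mm, 90th pct={p90:.2f} mm")
# Longer tails (wider 10th-90th percentile spread) under the higher load
# indicate earlier and more scattered failure, as described above.
```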

Keywords: cumulative distribution function, fatigue crack propagation, grown crack size, magnesium alloys, maximum fatigue load

Procedia PDF Downloads 272
15927 Agile Software Development Implementation in Developing a Diet Tracker Mobile Application

Authors: Dwi Puspita Sari, Gulnur Baltabayeva, Nadia Salman, Maxut Toleuov, Vijay Kanabar

Abstract:

The technology era drives people to use mobile phones to support their daily activities. Technology develops at a rapid pace, which pushes IT companies to adjust to any technology changes in order to satisfy customers. As a result, many companies in the USA have moved from a systematic software development approach to an agile software development approach when developing systems and applications, producing many mobile phone applications in a short time to fulfill users' needs. As the systematic approach is considered time-consuming, costly, and too risky, agile software development has become a more popular approach for developing software, including mobile applications. This paper reflects on a short-term project to develop a diet tracker mobile application using agile software development, focusing on applying the Scrum framework in the development process.

Keywords: agile software development, scrum, diet tracker, mobile application

Procedia PDF Downloads 239
15926 Examines the Proportionality between the Needs of Industry and Technical and Vocational Training of Male and Female Vocational Schools

Authors: Khalil Aryanfar, Pariya Gholipor, Elmira Hafez

Abstract:

This study examines the proportionality between the needs of industry and the technical and vocational training provided by male and female vocational schools. The research method was descriptive and was conducted in two parts: documentary analysis and needs assessment; the Delphi method was used in the needs assessment. The statistical population of the study included 312 industry-sector employers, of whom 52 were selected through stratified random sampling. Data collection drew on upstream documents, including the technical and vocational training development document, the 1393 Statistical Yearbook of Tehran, and the documents available in the Isfahan Planning Department. The findings indicate an approximate proportionality between the needs of industry and the vocational training of male and female vocational schools in the fields of welding, industrial electronics, electrotechnique, industrial drawing, auto mechanics, design, packaging, machine tools, metalworking, construction, accounting, computer graphics and administrative affairs. The findings also indicate that there is no proportionality between the needs of industry and the vocational training of male and female vocational schools in the fields of thermal-cooling systems, building electricity, building drawing, interior architecture, car electricity and motor repair.

Keywords: needs assessment, technical and vocational training, industry

Procedia PDF Downloads 436
15925 An Adaptive Dimensionality Reduction Approach for Hyperspectral Imagery Semantic Interpretation

Authors: Akrem Sellami, Imed Riadh Farah, Basel Solaiman

Abstract:

With the development of HyperSpectral Imagery (HSI) technology, the spectral resolution of HSI has become denser, which has resulted in a large number of spectral bands, high correlation between neighboring bands, and high data redundancy. Semantic interpretation is therefore a challenging task for HSI analysis due to the high dimensionality and the high correlation of the different spectral bands. This work presents a dimensionality reduction approach that overcomes these issues and improves the semantic interpretation of HSI. In order to preserve the spatial information, the Tensor Locality Preserving Projection (TLPP) is applied to transform the original HSI. In the second step, knowledge is extracted based on the adjacency graph to describe the different pixels. Based on the transformation matrix obtained with TLPP, a weighted matrix is constructed to rank the different spectral bands based on their contribution score. Thus, the relevant bands are adaptively selected based on the weighted matrix. The performance of the presented approach has been validated through several experiments, and the obtained results demonstrate the efficiency of this approach compared to various existing dimensionality reduction techniques. According to the experimental results, we can conclude that this approach can adaptively select the relevant spectral bands, improving the semantic interpretation of HSI.

Keywords: band selection, dimensionality reduction, feature extraction, hyperspectral imagery, semantic interpretation

Procedia PDF Downloads 337
15924 Identifying the Factors affecting on the Success of Energy Usage Saving in Municipality of Tehran

Authors: Rojin Bana Derakhshan, Abbas Toloie

Abstract:

For the purpose of optimizing and developing energy efficiency in buildings, it is necessary to recognize the key elements of success in the optimization of energy consumption before performing any actions. Principal component analysis is a valuable tool here, as it offers a simple, non-parametric way of extracting structure from data that would otherwise be confusing. An energy management system was implemented according to the international energy management standard ISO 50001:2011, and all energy parameters in the building were measured through energy auditing. In this paper, the key elements influencing energy saving in buildings are determined using data mining. The approach is based on statistical data mining techniques using a feature selection method and fuzzy logic; it compresses massive data and strengthens the selected features. On the other side, the share and amount of each energy consumption element in energy dissipation, in percent, are recognized as separate norms using the results obtained from the energy auditing, after measuring all energy-consuming parameters and identified variables. Accordingly, energy-saving solutions are divided into three categories: low-, medium- and high-expense solutions.

Keywords: energy saving, key elements of success, optimization of energy consumption, data mining

Procedia PDF Downloads 448