Search results for: event study methodology
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 51636

50976 Multi-Objective Optimization for the Green Vehicle Routing Problem: Approach to Case Study of the Newspaper Distribution Problem

Authors: Julio C. Ferreira, Maria T. A. Steiner

Abstract:

The aim of this work is to present a solution procedure, referred to here as the Multi-objective Optimization for the Green Vehicle Routing Problem (MOOGVRP), and to provide solutions for a case study. The proposed methodology consists of three stages to resolve Scenario A. Stage 1 consists of the “treatment” of data; Stage 2 consists of applying mathematical models of the p-Median Capacitated Problem (with the objectives of minimizing distances and homogenizing demands between groups) and the Asymmetric Traveling Salesman Problem (with the objectives of minimizing distances and minimizing time). The weighted method was used as the multi-objective procedure. In Stage 3, an analysis of the results is conducted, taking into consideration the environmental aspects related to the case study, more specifically fuel consumption and air pollutant emissions. This methodology was applied to a (partial) database that addresses newspaper distribution in the municipality of Curitiba, Paraná State, Brazil. The preliminary findings for Scenario A showed that it was possible to improve the distribution of the load and to reduce the mileage and greenhouse gas emissions by 17.32% and the journey time by 22.58% in comparison with the current scenario. Future work will use other multi-objective techniques and an expanded version of the database and explore the triple bottom line of sustainability.
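
A minimal sketch of the weighted multi-objective procedure mentioned in Stage 2 is given below; the candidate routes, objective values, and weights are hypothetical and only illustrate how distance and time can be combined into a single score before candidate solutions are ranked.

```python
# Illustrative weighted-sum scalarization of two route objectives (not the authors'
# code): normalize each objective, weight them, and pick the route with the lowest score.
def weighted_sum_score(distance_norm, time_norm, w_distance=0.5, w_time=0.5):
    """Combine two normalized objectives into a single scalar to minimize."""
    return w_distance * distance_norm + w_time * time_norm

candidate_routes = {
    "route_A": {"distance_km": 42.0, "time_min": 95.0},   # hypothetical values
    "route_B": {"distance_km": 47.5, "time_min": 82.0},
}

max_d = max(r["distance_km"] for r in candidate_routes.values())
max_t = max(r["time_min"] for r in candidate_routes.values())

best = min(
    candidate_routes,
    key=lambda name: weighted_sum_score(
        candidate_routes[name]["distance_km"] / max_d,
        candidate_routes[name]["time_min"] / max_t,
    ),
)
print(best)
```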

Keywords: Asymmetric Traveling Salesman Problem, Green Vehicle Routing Problem, Multi-objective Optimization, p-Median Capacitated Problem

Procedia PDF Downloads 106
50975 Finding the Elastic Field in an Arbitrary Anisotropic Media by Implementing Accurate Generalized Gaussian Quadrature Solution

Authors: Hossein Kabir, Amir Hossein Hassanpour Mati-Kolaie

Abstract:

In the current study, the elastic field in an anisotropic elastic medium is determined by implementing a general semi-analytical method. In this specific methodology, the displacement field is computed as a sum of finite functions with unknown coefficients. These functions satisfy exactly both the homogeneous and inhomogeneous boundary conditions in the proposed medium. The unknown coefficients are determined by implementing the principle of minimum potential energy. The numerical integration is implemented by employing the Generalized Gaussian Quadrature solution. Furthermore, with the aid of the calculated unknown coefficients, the displacement field, as well as the other parameters of the elastic field, can be obtained. Finally, the comparison of the previous analytical method with the current semi-analytical method demonstrates the efficacy of the present methodology.
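
For orientation, the sketch below shows a standard Gauss-Legendre quadrature workflow (nodes, weights, interval mapping) of the kind such numerical integration builds on; the Generalized Gaussian Quadrature used by the authors is a more specialized rule, and the integrand here is hypothetical.

```python
# Minimal Gauss-Legendre quadrature sketch on [a, b]; the integrand stands in for a
# strain-energy-like function and is not the authors' formulation.
import numpy as np

def gauss_legendre_integral(f, a, b, n_points=8):
    nodes, weights = np.polynomial.legendre.leggauss(n_points)  # rule on [-1, 1]
    x = 0.5 * (b - a) * nodes + 0.5 * (b + a)                    # map to [a, b]
    return 0.5 * (b - a) * np.sum(weights * f(x))

# Example: integrate a smooth hypothetical integrand.
approx = gauss_legendre_integral(lambda x: x**4 + np.sin(x), 0.0, 2.0)
print(approx)
```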

Keywords: anisotropic elastic media, semi-analytical method, elastic field, generalized gaussian quadrature solution

Procedia PDF Downloads 315
50974 Lateritic Soils from Ceara, Brazil: Sustainable Use in Constructive Blocks for Social Housing

Authors: Ivelise M. Strozberg, Juliana Sales Frota, Lucas de Oliveira Vale

Abstract:

The state of Ceara, located in the northeast region of Brazil, is abundant in lateritic soil, which has usually been discarded due to its lack of agricultural potential, while materials of similar nature have been used as constituents of housing constructive elements in many parts of the world, such as India and Portugal, for decades. Since many of the semi-arid housing conditions in the state of Ceara fail to meet the minimum criteria regarding comfort and safety requirements, this research proposed to study the Ceara lateritic soil and the possibility of its use as a sustainable building block constituent for social housing, contributing to the improvement of the region's living conditions. In order to achieve this objective, soil samples were collected from five different locations within the region, three of which presented lateritic nature, and were characterized according to the Unified Soil Classification System and the MCT methodology, a Brazilian methodology developed during the 1980s to better describe and characterize tropical soils and their behavior. Two of these samples were used to build two different miniature block prototypes, which were manually molded, heated at low temperatures (< 300 ºC) in order to save energy and lessen the high CO₂ emission rates common in traditional burning methods, and then submitted to load tests. Among the soils tested, the one with the highest degree of laterization and the greater presence of fines constituted the block with the best performance in terms of flexural strength, presenting resistance gains when heated at increasing temperatures, which indicates that this type of soil has potential for use as a construction material.

Keywords: constructive blocks, lateritic soil, MCT methodology, sustainability

Procedia PDF Downloads 121
50973 Applications of Building Information Modeling (BIM) in Knowledge Sharing and Management in Construction

Authors: Shu-Hui Jan, Shih-Ping Ho, Hui-Ping Tserng

Abstract:

Construction knowledge can be referred to and reused among involved project managers and job-site engineers to alleviate problems on a construction job-site and reduce the time and cost of solving problems related to constructability. This paper proposes a new methodology for sharing construction knowledge using the Building Information Modeling (BIM) approach. The main characteristics of BIM include 3D CAD-based presentation, storage of information in a digital format, and easy updating and transfer of information within the 3D BIM environment. Using the BIM approach, project managers and engineers can gain knowledge related to 3D BIM and obtain feedback provided by job-site engineers for future reference. This study addresses the application of knowledge sharing management in the construction phase of construction projects and proposes a BIM-based Knowledge Sharing Management (BIMKSM) system for project managers and engineers. The BIMKSM system is then applied in a selected case study of a construction project in Taiwan to verify the proposed methodology and demonstrate the effectiveness of sharing knowledge in the BIM environment. The combined results demonstrate that the BIMKSM system can be used as a visual BIM-based knowledge sharing management platform by utilizing the BIM approach and web technology.

Keywords: construction knowledge management, building information modeling, project management, web-based information system

Procedia PDF Downloads 341
50972 Breaking Stress Criterion that Changes Everything We Know About Materials Failure

Authors: Ali Nour El Hajj

Abstract:

Background: The perennial deficiencies of failure models in the materials field have profoundly impacted all associated technical fields that depend on accurate failure predictions. Many preeminent scientists from an earlier era of groundbreaking discoveries attempted to solve the issue of material failure. However, a thorough understanding of material failure has remained frustratingly elusive. Objective: The heart of this study is the presentation of a methodology that identifies a newly derived one-parameter criterion as the only general failure theory for noncompressible, homogeneous, and isotropic materials subjected to multiaxial states of stress and various boundary conditions, providing the solution to this longstanding problem. This theory is the counterpart and companion piece to the theory of elasticity and is cast in a formalism that is suitable for broad application. Methods: Utilizing advanced finite-element analysis, the maximum internal breaking stress corresponding to the maximum applied external force is identified as a unified and universal material failure criterion for determining the structural capacity of any system, regardless of its geometry or architecture. Results: A comparison of the proposed criterion and methodology against design codes reveals that current provisions may underestimate the structural capacity by 2.17 times or overestimate it by 2.096 times. It also shows that existing standards may underestimate the structural capacity by 1.4 times or overestimate it by 2.49 times. Conclusion: The proposed failure criterion and methodology will pave the way for a new era in designing unconventional structural systems composed of unconventional materials.

Keywords: failure criteria, strength theory, failure mechanics, materials mechanics, rock mechanics, concrete strength, finite-element analysis, mechanical engineering, aeronautical engineering, civil engineering

Procedia PDF Downloads 74
50971 Directivity and Gain Improvement for Microstrip Array Antenna with Directors

Authors: Hassan M. Elkamchouchi, Samy H. Darwish, Yasser H. Elkamchouchi, M. E. Morsy

Abstract:

A methodology is suggested to design a linear rectangular microstrip array antenna based on Yagi antenna theory. Antennas with different director lengths as parasitic elements were designed, simulated, and analyzed using HFSS. The calculations and results illustrate the effectiveness of using specific parasitic elements to improve the directivity and gain of a microstrip array antenna. The results have shown that the suggested methodology has the potential to be applied for improving the antenna performance. A maximum radiation intensity (Umax) of the order of 0.47 W/sr was recorded, and a directivity of 6.58 dB and a gain better than 6.07 dB are readily achievable for the working antenna.
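
For orientation, directivity relates the maximum radiation intensity to the total radiated power through D = 4π·Umax/Prad. The sketch below evaluates this standard relation in dB; the radiated power value is hypothetical, since the abstract reports Umax but not Prad.

```python
# Standard directivity relation D = 4*pi*Umax / P_rad, expressed in dB.
# The radiated power used here is a hypothetical placeholder.
import math

def directivity_db(u_max_w_per_sr, p_rad_w):
    d_linear = 4.0 * math.pi * u_max_w_per_sr / p_rad_w
    return 10.0 * math.log10(d_linear)

print(directivity_db(u_max_w_per_sr=0.47, p_rad_w=1.3))  # illustrative values only
```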

Keywords: directivity, director, microstrip antenna, gain improvement

Procedia PDF Downloads 451
50970 Optimization of Poly-β-Hydroxybutyrate Recovery from Bacillus Subtilis Using Solvent Extraction Process by Response Surface Methodology

Authors: Jayprakash Yadav, Nivedita Patra

Abstract:

Polyhydroxybutyrate (PHB) is an interesting material in the fields of medical science, pharmaceutical industry, and tissue engineering because of its properties such as biodegradability, biocompatibility, hydrophobicity, and elasticity. PHB is naturally accumulated by several microbes in their cytoplasm during the metabolic process as an energy reserve material. PHB can be extracted from cell biomass using halogenated hydrocarbons, chemicals, and enzymes. In this study, a cheaper and non-toxic solvent, acetone, was used for the extraction process. Parameters such as acetone percentage, solvent pH, process temperature, and incubation period were optimized using Response Surface Methodology (RSM). RSM was performed, and the determination coefficient (R²) was found to be 0.8833 from the quadratic regression model with no significant lack of fit. The RSM model indicated that the fit of the response variable was significant (p-value < 0.0006) and satisfactory to describe the relationship between the responses, in terms of PHB recovery and purity, and the independent variables. The optimum conditions for maximum PHB recovery and purity were found to be a solvent pH of 7, an extraction temperature of 43 °C, an incubation time of 70 minutes, and an acetone percentage of 30%. Under these optimized conditions, the maximum predicted PHB recovery was 0.845 g/g biomass dry cell weight and the purity was 97.23%.
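
The quadratic regression model underlying RSM can be sketched as an ordinary least-squares fit of linear, interaction, and squared terms; the coded design points and responses below are hypothetical and only stand in for the Box-Behnken runs used in the study.

```python
# Minimal second-order (quadratic) response-surface fit over two coded factors.
import numpy as np

# Two coded factors (e.g., temperature and acetone %) and a measured response.
X = np.array([[-1, -1], [1, -1], [-1, 1], [1, 1], [0, 0], [0, 0]], dtype=float)
y = np.array([0.52, 0.61, 0.58, 0.70, 0.82, 0.80])  # hypothetical PHB recovery (g/g)

x1, x2 = X[:, 0], X[:, 1]
# Columns: intercept, linear, interaction, and quadratic terms.
A = np.column_stack([np.ones_like(x1), x1, x2, x1 * x2, x1**2, x2**2])
coeffs, *_ = np.linalg.lstsq(A, y, rcond=None)

y_hat = A @ coeffs
r2 = 1 - np.sum((y - y_hat) ** 2) / np.sum((y - y.mean()) ** 2)
print(coeffs, r2)
```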

Keywords: acetone, PHB, RSM, halogenated hydrocarbons, extraction, bacillus subtilis

Procedia PDF Downloads 433
50969 Simulating Human Behavior in (Un)Built Environments: Using an Actor Profiling Method

Authors: Hadas Sopher, Davide Schaumann, Yehuda E. Kalay

Abstract:

This paper addresses the shortcomings of architectural computation tools in representing human behavior in built environments, prior to construction and occupancy of those environments. Evaluating whether a design fits the needs of its future users is currently done solely post construction, or is based on the knowledge and intuition of the designer. This issue is of high importance when designing complex buildings such as hospitals, where the quality of treatment as well as patient and staff satisfaction are of major concern. Existing computational pre-occupancy human behavior evaluation methods are geared mainly to test ergonomic issues, such as wheelchair accessibility, emergency egress, etc. As such, they rely on Agent Based Modeling (ABM) techniques, which emphasize the individual user. Yet we know that most human activities are social, and involve a number of actors working together, which ABM methods cannot handle. Therefore, we present an event-based model that manages the interaction between multiple Actors, Spaces, and Activities, to describe dynamically how people use spaces. This approach requires expanding the computational representation of Actors beyond their physical description, to include psychological, social, cultural, and other parameters. The model presented in this paper includes cognitive abilities and rules that describe the response of actors to their physical and social surroundings, based on the actors’ internal status. The model has been applied in a simulation of hospital wards, and showed adaptability to a wide variety of situated behaviors and interactions.
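
A minimal sketch of the kind of event-based record described above, tying actors, a space, and an activity together with a simple rule on the actors' internal status, is given below; all names, attributes, and the rule itself are hypothetical and not the authors' implementation.

```python
# Illustrative event record linking actors, a space, and an activity, plus one
# behavioral rule driven by actor state.
from dataclasses import dataclass, field

@dataclass
class Actor:
    name: str
    role: str
    fatigue: float = 0.0  # internal status used by behavioral rules

@dataclass
class Event:
    activity: str
    space: str
    actors: list = field(default_factory=list)
    duration_min: float = 0.0

    def can_start(self) -> bool:
        # Example rule: the activity starts only if every participant is fit enough.
        return all(a.fatigue < 0.8 for a in self.actors)

nurse = Actor("nurse_1", "nurse", fatigue=0.3)
doctor = Actor("doctor_1", "physician", fatigue=0.5)
ward_round = Event("ward_round", "ward_A_corridor", [nurse, doctor], duration_min=20)
print(ward_round.can_start())
```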

Keywords: agent based modeling, architectural design evaluation, event modeling, human behavior simulation, spatial cognition

Procedia PDF Downloads 252
50968 Representativity Based Wasserstein Active Regression

Authors: Benjamin Bobbia, Matthias Picard

Abstract:

In recent years, active learning methodologies based on the representativity of the data have seemed more promising for limiting overfitting. We present a query methodology for regression that uses the Wasserstein distance to measure the representativity of our labelled dataset compared to the global distribution. In this work, crucial use is made of GroupSort neural networks, which brings a double advantage: the Wasserstein distance can be exactly expressed in terms of such neural networks, and one can provide explicit bounds for their size and depth together with rates of convergence. Moreover, the heterogeneity of the dataset is also considered by weighting the Wasserstein distance with the approximation error at the previous step of active learning. Such an approach leads to a reduction of overfitting and high prediction performance after a few query steps. After detailing the methodology and algorithm, an empirical study is presented in order to investigate the range of our hyperparameters. The performance of this method is compared, in terms of the number of queries needed, with other classical and recent query methods on several UCI datasets.
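
A minimal sketch of a representativity-driven query step is given below: it selects the unlabelled point whose addition most reduces the Wasserstein distance between the labelled set and the pool. It is a one-dimensional simplification using scipy's empirical distance, not the paper's GroupSort-network formulation, and the data are synthetic.

```python
# Pick the candidate whose addition makes the labelled sample most representative
# of the pool, as measured by the 1-D empirical Wasserstein distance.
import numpy as np
from scipy.stats import wasserstein_distance

rng = np.random.default_rng(0)
pool = rng.normal(size=200)          # hypothetical 1-D feature values
labelled = list(pool[:5])            # small initial labelled set

def best_query(labelled, pool):
    candidates = [x for x in pool if x not in labelled]
    return min(
        candidates,
        key=lambda x: wasserstein_distance(labelled + [x], pool),
    )

print(best_query(labelled, pool))
```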

Keywords: active learning, Lipschitz regularization, neural networks, optimal transport, regression

Procedia PDF Downloads 78
50967 A Soft System Methodology Approach to Stakeholder Engagement in Water Sensitive Urban Design

Authors: Lina Lukusa, Ulrike Rivett

Abstract:

Poor water management can increase the extreme pressure already caused by water scarcity. Unless water management is addressed holistically, water quality and quantity will continue to degrade. A holistic approach to water management, named Water Sensitive Urban Design (WSUD), has thus been created to facilitate the effective management of water. Traditionally, water management has employed a linear design approach, while WSUD requires a systematic, cyclical approach. In simple terms, WSUD assumes that everything is connected. Hence, it is critical for the different stakeholders involved in WSUD to engage and reach a consensus on a solution. However, many stakeholders in WSUD have conflicting interests. Using Soft Systems Methodology (SSM), developed by Peter Checkland, as a problem-solving method, decision-makers can understand this problematic situation from different world views. SSM addresses ill-structured and complex challenging situations involving human activities. This paper demonstrates how SSM can be applied to understand the complexity of stakeholder engagement in WSUD. The paper concludes that SSM is well suited to understanding a complex problem and then proposing efficient solutions.

Keywords: co-design, ICT platform, soft systems methodology, water sensitive urban design

Procedia PDF Downloads 115
50966 A Methodology for Characterising the Tail Behaviour of a Distribution

Authors: Serge Provost, Yishan Zang

Abstract:

Following a review of various approaches that are utilized for classifying the tail behavior of a distribution, an easily implementable methodology that relies on an arctangent transformation is presented. The classification criterion is based on the difference between two specific quantiles of the transformed distribution. The resulting categories enable one to classify distributional tails as distinctly short, short, nearly medium, medium, extended medium, and somewhat long, provided that at least two moments exist. Distributions possessing a single moment are said to be long-tailed, while those failing to have any finite moments are classified as having an extremely long tail. Several illustrative examples will be presented.
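
A minimal sketch of the general idea is given below: transform the sample with the arctangent (which maps the support to a bounded interval) and take the difference between two quantiles of the transformed values. The quantile levels are hypothetical, since the abstract does not state which specific quantiles the criterion uses.

```python
# Arctangent-transform a sample and compare two upper quantiles of the result.
import numpy as np

def tail_statistic(sample, q_low=0.95, q_high=0.99):
    transformed = np.arctan(sample)
    return np.quantile(transformed, q_high) - np.quantile(transformed, q_low)

rng = np.random.default_rng(1)
light_tailed = rng.normal(size=50_000)
heavy_tailed = rng.standard_cauchy(size=50_000)
# Print the statistic for two samples with very different tail behavior.
print(tail_statistic(light_tailed), tail_statistic(heavy_tailed))
```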

Keywords: arctangent transformation, tail classification, heavy-tailed distributions, distributional moments

Procedia PDF Downloads 117
50965 Event Data Representation Based on Time Stamp for Pedestrian Detection

Authors: Yuta Nakano, Kozo Kajiwara, Atsushi Hori, Takeshi Fujita

Abstract:

In association with the wave of electric vehicles (EVs), low energy consumption systems have become more and more important. One of the key technologies for realizing low energy consumption is the dynamic vision sensor (DVS), also called an event sensor or neuromorphic vision sensor. This sensor has several features, such as high temporal resolution, which can reach 1 Mframe/s, and a high dynamic range (120 dB). However, the point that can contribute to low energy consumption the most is its sparsity; to be more specific, this sensor captures only the pixels that have an intensity change. In other words, there is no signal in areas that do not have any intensity change. That is to say, this sensor is more energy efficient than conventional sensors such as RGB cameras because redundant data can be removed. On the other hand, the data are difficult to handle because the data format is completely different from an RGB image; the acquired signals are asynchronous and sparse, and each signal is composed of an x-y coordinate, a polarity (two values: +1 or -1), and a timestamp; it does not include intensity such as RGB values. Therefore, as existing algorithms cannot be used straightforwardly, a new processing algorithm has to be designed to cope with DVS data. In order to solve the difficulties caused by data format differences, most prior art builds frame data and feeds it to deep learning models such as Convolutional Neural Networks (CNNs) for object detection and recognition purposes. However, even when the data can be fed in this way, it is still difficult to achieve good performance due to the lack of intensity information. Although polarity is often used as intensity instead of an RGB pixel value, it is apparent that polarity information is not rich enough. Considering this context, we propose to use the timestamp information as the data representation that is fed to deep learning. Concretely, we also build frame data divided by a certain time period and then assign an intensity value according to the timestamp within each frame; for example, a higher value is given to a more recent signal. We expect that this data representation can capture the features, especially of moving objects, because the timestamps represent the movement direction and speed. Using the proposed method, we built our own dataset with a DVS fixed on a parked car to develop an application for a surveillance system that can detect persons around the car. We think the DVS is one of the ideal sensors for surveillance purposes because this sensor can run for a long time with low energy consumption in a mostly static scene. For comparison purposes, we reproduced a state-of-the-art method as a benchmark, which builds frames in the same way as ours but feeds polarity information to the CNN. Then, we measured the object detection performance of the benchmark and our method on the same dataset. As a result, our method achieved an F1 score up to 7 points higher than the benchmark.
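
A minimal sketch of the proposed timestamp-based representation is given below: within each time window, a pixel stores a value derived from its most recent event timestamp, so recent events map to high values. The event tuples are synthetic; a real DVS stream would supply them.

```python
# Build a timestamp-based frame from (x, y, polarity, t) events in one time window.
import numpy as np

def timestamp_frame(events, height, width, t_start, t_end):
    """events: iterable of (x, y, polarity, t) with t_start <= t < t_end."""
    frame = np.zeros((height, width), dtype=np.float32)
    window = t_end - t_start
    for x, y, _polarity, t in events:
        value = (t - t_start) / window          # 0 = old, 1 = most recent
        frame[y, x] = max(frame[y, x], value)   # keep the latest event per pixel
    return frame

synthetic_events = [(10, 5, +1, 0.002), (10, 5, -1, 0.007), (3, 8, +1, 0.009)]
print(timestamp_frame(synthetic_events, height=16, width=16, t_start=0.0, t_end=0.01))
```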

Keywords: event camera, dynamic vision sensor, deep learning, data representation, object recognition, low energy consumption

Procedia PDF Downloads 89
50964 Electrospun Nanofibrous Scaffolds Modified with Collagen-I and Fibronectin with LX-2 Cells to Study Liver Fibrosis in vitro

Authors: Prativa Das, Lay Poh Tan

Abstract:

A three-dimensional microenvironment is needed to study the event cascades of liver fibrosis in vitro. Electrospun nanofibers modified with essential extracellular matrix proteins can closely mimic the random fibrous structure of native liver extracellular matrix (ECM). In this study, we fabricate a series of 3D electrospun scaffolds by a wet electrospinning process, modified with different ratios of collagen-I to fibronectin, to achieve an optimized distribution of these two ECM proteins on the fiber surface. A 3:1 ratio of collagen-I to fibronectin was found to be optimal for surface modification of electrospun poly(lactic-co-glycolic acid) (PLGA) fibers by a chemisorption process. In the 3:1 collagen-I to fibronectin modified scaffolds, the total protein content increased ~2-fold compared to collagen-I modified scaffolds and ~1.5-fold compared to 1:1 and 9:1 collagen-I to fibronectin modified scaffolds. We cultured LX-2 cells on this scaffold over 14 days and found that the LX-2 cells acquired a more quiescent phenotype throughout the culture period and showed significantly lower expression of alpha smooth muscle actin and collagen-I. Thus, this system can be used as a model to study liver fibrosis with different fibrogenic mediators in vitro.

Keywords: electrospinning, collagen-I and fibronectin, surface modification of fiber, LX-2 cells, liver fibrosis

Procedia PDF Downloads 121
50963 Technology Maps in Energy Applications Based on Patent Trends: A Case Study

Authors: Juan David Sepulveda

Abstract:

This article reflects the current stage of progress of the project “Determining technological trends in energy generation”. At first, the project was oriented towards finding those trends by employing tools that the scientometrics community had proven and accepted as effective for obtaining reliable results. Because a documented methodological guide for this purpose could not be found, the scope and aim of the project were reoriented, and it was decided to propose and implement a novel guide based on the elements and techniques found in the available literature. This article begins by explaining the elements and considerations taken into account when implementing and applying this methodology, and the tools that led to the implementation of a software application for patent review. Univariate analysis helped recognize the technological leaders in the field of energy and steered the way for a multivariate analysis of this sample, which allowed for a graphical description of the techniques of mature technologies, as well as the detection of emerging technologies. This article ends with a validation of the methodology as applied to the case of fuel cells.

Keywords: energy, technology mapping, patents, univariate analysis

Procedia PDF Downloads 470
50962 Terrorism in German and Italian Press Headlines: A Cognitive Linguistic Analysis of Conceptual Metaphors

Authors: Silvia Sommella

Abstract:

Islamic terrorism has gained a lot of media attention in recent years, also because of the striking increase in terror attacks since 2014. The main aim of this paper is to illustrate the phenomenon of Islamic terrorism by applying frame semantics and metaphor analysis to German and Italian press headlines of the two online weekly publications Der Spiegel and L’Espresso between 2014 and 2019. This study focuses on how media discourse – through the use of conceptual metaphors – gives rise to a particular reception of the phenomenon of Islamic terrorism and leads people to accept governmental strategies and policies, perceiving terrorists as evildoers and as members of an uncivilised group ‘other’ opposed to the civilised group ‘we’. The press headlines are analyzed on the basis of cognitive linguistics, namely Lakoff and Johnson’s conceptualization of metaphor, to distinguish between abstract conceptual metaphors and specific metaphorical expressions. The study focuses on contexts, frames, and metaphors. The method adopted in this study is Konerding’s frame semantics (1993). In a pilot lexicological study carried out on the basis of dictionaries – in particular the Duden Deutsches Universalwörterbuch (Duden Universal German Dictionary) – Konerding performed a hyperonym reduction of substantives, working exclusively with nouns because hyperonyms usually occur in dictionary meaning explanations as the main elements of nominal phrases. The result of Konerding’s hyperonym type reduction is a small set of German nouns that correspond to the highest hyperonyms, the so-called categories, or matrix frames: ‘object’, ‘organism’, ‘person/actant’, ‘event’, ‘action/interaction/communication’, ‘institution/social group’, ‘surroundings’, ‘part/piece’, ‘totality/whole’, ‘state/property’. The second step of Konerding’s pilot study consists in determining the potential reference points of each category so that conventionally expectable routinized predications arise as predictors; Konerding found out which predicators the ascertained noun types can be linked to. For the purpose of this study, metaphorical expressions are listed and categorized into conceptual metaphors and under the matrix frames that correspond to each conceptual metaphor. All of the corpus analyses are carried out using the AntConc corpus software. The research will verify some previously analyzed metaphors such as TERRORISM AS WAR, A CRIME, A NATURAL EVENT, and A DISEASE, and will identify new conceptualizations and metaphors about Islamic terrorism, especially in the Italian language, such as TERRORISM AS A GAME, WARES, and A DRAMATIC PLAY. Through the identification of particular frames and their construction, the research seeks to understand the public reception of Islamic terrorism and the way the discourse about it is handled in the above-mentioned online weekly publications, through a contrastive analysis of German and Italian.

Keywords: cognitive linguistics, frame semantics, Islamic terrorism, media

Procedia PDF Downloads 169
50961 Causal Estimation for the Left-Truncation Adjusted Time-Varying Covariates under the Semiparametric Transformation Models of a Survival Time

Authors: Yemane Hailu Fissuh, Zhongzhan Zhang

Abstract:

In biomedical research and randomized clinical trials, the outcomes of most common interest are time-to-event, so-called survival, data. The importance of robust models in this context is to compare the effects of randomized experimental groups in a way that carries a sense of causality. Causal estimation is the scientific concept of comparing the pragmatic effect of treatments conditional on the given covariates, rather than assessing the simple association of response and predictors. Hence, a causal-effect-based semiparametric transformation model was proposed to estimate the effect of treatment in the presence of possibly time-varying covariates. Due to its high flexibility and robustness, the semiparametric transformation model applied in this paper has received much attention for the estimation of causal effects in modeling left-truncated and right-censored survival data. Despite its wide application and popularity for estimating unknown parameters, the maximum likelihood estimation technique is quite complex and burdensome for estimating the unknown parameters and the unspecified transformation function in the presence of possibly time-varying covariates. Thus, to ease this complexity, we propose modified estimating equations. After outlining the estimation procedures, the consistency and asymptotic properties of the estimators are derived, and the finite-sample performance of the proposed model is illustrated via simulation studies and the Stanford heart transplant real-data example. To sum up, the bias due to covariates is adjusted by estimating the density function of the truncation variable, which is also incorporated into the model as a covariate in order to relax the assumption of independence between failure time and truncation time. Moreover, the expectation-maximization (EM) algorithm is described for the iterative estimation of the unknown parameters and the unspecified transformation function. In addition, the causal effect is derived as the ratio of the cumulative hazard functions of the active and passive experiments, after adjusting for the bias introduced into the model by the truncation variable.
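
As a rough illustration of the final step, the sketch below estimates the cumulative hazard of two arms with a Nelson-Aalen-type estimator and takes their ratio at a time point. The data are synthetic and the estimator ignores left truncation and censoring adjustments, so it only shows the quantity being compared, not the authors' adjusted procedure.

```python
# Nelson-Aalen-style cumulative hazard for two arms and their ratio at time t0.
import numpy as np

def nelson_aalen(times, events, t):
    """Cumulative hazard at time t from (time, event-indicator) pairs."""
    order = np.argsort(times)
    times, events = np.asarray(times)[order], np.asarray(events)[order]
    h = 0.0
    for i, (ti, di) in enumerate(zip(times, events)):
        if ti > t:
            break
        at_risk = len(times) - i          # subjects still at risk at ti
        if di:
            h += 1.0 / at_risk
    return h

rng = np.random.default_rng(2)
treat_t = rng.exponential(scale=12.0, size=100)   # hypothetical survival times
ctrl_t = rng.exponential(scale=8.0, size=100)
treat_e = np.ones(100, dtype=int)                 # no censoring, for simplicity
ctrl_e = np.ones(100, dtype=int)

t0 = 10.0
print(nelson_aalen(treat_t, treat_e, t0) / nelson_aalen(ctrl_t, ctrl_e, t0))
```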

Keywords: causal estimation, EM algorithm, semiparametric transformation models, time-to-event outcomes, time-varying covariate

Procedia PDF Downloads 119
50960 Statistical Optimization of Vanillin Production by Pycnoporus Cinnabarinus 1181

Authors: Swarali Hingse, Shraddha Digole, Uday Annapure

Abstract:

The present study investigates the biotransformation of ferulic acid to vanillin by Pycnoporus cinnabarinus and its optimization using the one-factor-at-a-time method as well as a statistical approach. The effects of various physicochemical parameters and medium components were studied using the one-factor-at-a-time method. Screening of the significant factors was carried out using an L25 Taguchi orthogonal array, and the selected significant factors were then further optimized using response surface methodology (RSM). The significant media components obtained using the Taguchi L25 orthogonal array were glucose, KH2PO4, and yeast extract. Further, a Box-Behnken design was used to investigate the interactive effects of the three most significant media components. The final medium obtained after optimization using RSM, containing glucose (34.89 g/L), diammonium tartrate (1 g/L), yeast extract (1.47 g/L), MgSO4•7H2O (0.5 g/L), KH2PO4 (0.15 g/L), and CaCl2•2H2O (20 mg/L), resulted in an amplification of vanillin production from 30.88 mg/L to 187.63 mg/L.

Keywords: ferulic acid, pycnoporus cinnabarinus, response surface methodology, vanillin

Procedia PDF Downloads 374
50959 Momentum Profits and Investor Behavior

Authors: Aditya Sharma

Abstract:

Profits earned from the relative strength strategy of a zero-cost portfolio, i.e., taking a long position in winner stocks and a short position in loser stocks from the recent past, are termed momentum profits. In recent times, there has been a lot of controversy and concern about the sources of momentum profits, since the existence of these profits acts as evidence of earning abnormal returns from publicly available information, directly contradicting the Efficient Market Hypothesis. A literature review reveals conflicting theories and differing evidence on the sources of momentum profits. This paper aims at re-examining the sources of momentum profits in Indian capital markets. The study focuses on assessing the effect of fundamental as well as behavioral sources in order to understand the role of investor behavior in stock returns and to suggest (if any) improvements to existing behavioral asset pricing models. This paper adopts the calendar-time methodology to calculate momentum profits for six different strategies, with and without skipping a month between the ranking and holding periods. For each J/K strategy, under this methodology, at the beginning of each month t stocks are ranked on the past J months' average returns and sorted in descending order. Stocks in the upper decile are termed winners and those in the bottom decile losers. After ranking, long and short positions are taken in the winner and loser stocks respectively, and both portfolios are held for the next K months, in such a manner that at any given point in time there are K overlapping long and short portfolios, formed from month t-1 to month t-K. At the end of the period, the returns of both the long and short portfolios are calculated by taking equally weighted averages across all months. The long-minus-short (LMS) returns are the momentum profits for each strategy. After testing for momentum profits, CAPM- and Fama-French three-factor-model-adjusted LMS returns are calculated to study the role market risk plays in momentum profits. In the final phase of studying the sources, a decomposition methodology is used to break up the profits into unconditional means, serial correlations, and cross-serial correlations. This methodology is unbiased, can be used with the decile-based methodology, and helps to test the effect of behavioral and fundamental sources together. From the analysis, it was found that momentum profits do exist in Indian capital markets, with market risk playing little role in defining them. It was also observed that although momentum profits have multiple sources (risk, serial correlations, and cross-serial correlations), cross-serial correlations, i.e., the effect of the returns of other stocks, play a major role in defining these profits. This means that, in addition to studying investors' reactions to information about the same firm, it is also important to study how they react to information about other firms. The analysis confirms that investor behavior does play an important role in stock returns and that incorporating both aspects of investors' reactions into behavioral asset pricing models helps make them better.
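
A minimal sketch of one portfolio-formation step of a J/K momentum strategy (here J = 6) is given below: stocks are ranked on past six-month average returns, the top and bottom deciles form the long and short legs, and the long-minus-short return is computed for the following month. The returns matrix is synthetic.

```python
# One J/K momentum formation step on a synthetic monthly returns matrix.
import numpy as np

rng = np.random.default_rng(3)
n_stocks, n_months = 100, 7
returns = rng.normal(0.01, 0.05, size=(n_stocks, n_months))  # months 0..5 = ranking window

ranking_avg = returns[:, :6].mean(axis=1)
order = np.argsort(ranking_avg)
losers = order[: n_stocks // 10]      # bottom decile
winners = order[-n_stocks // 10:]     # top decile

holding_month = returns[:, 6]
lms = holding_month[winners].mean() - holding_month[losers].mean()
print(lms)
```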

Keywords: investor behavior, momentum effect, sources of momentum, stock returns

Procedia PDF Downloads 298
50958 The Analysis of the Effectiveness of the Children’s Act of 2009 in Curbing Child Sexual Abuse: A Case Study of Francistown and the Surrounding Areas

Authors: Gabaikanngwe Ethel Mambo, Kinyanjui Godfrey Gichuhi

Abstract:

The study analysed the Children’s Act of 2009 of Botswana in relation to curbing child sexual abuse (CSA) in Francistown and its surroundings. A qualitative methodology was used to collect data. Retrospective reports of CSA were obtained from various departments dealing with children. The research findings revealed the ineffectiveness of the Children’s Act of 2009 in identifying and preventing CSA. The Act has failed to deter or prevent offenders from committing crimes against children. The study demonstrated an increase in CSA cases, many of which were never reported. A lack of skills within the justice system exacerbated sexual molestation. The study also revealed that most CSA cases were underreported. Lastly, the study demonstrated that child victims were sexually molested by someone known to them.

Keywords: sexual abuse, molestation, incest, child

Procedia PDF Downloads 97
50957 Methodology of Construction Equipment Optimization for Earthwork

Authors: Jaehyun Choi, Hyunjung Kim, Namho Kim

Abstract:

Earthwork is one of the critical civil construction operations that require large quantities of resources due to its intensive dependence on construction equipment. Therefore, efficient construction equipment management can contribute greatly to productivity improvements and cost savings. Earthwork operations utilize various combinations of construction equipment in order to meet project requirements such as time and cost. Site conditions and construction methods should be identified in advance in order to develop a proper execution plan. The factors to be considered include the capacity of the assigned equipment, the construction method, the size of the site, and the surrounding conditions. In addition, an optimal combination of the various items of construction equipment should be selected. However, in real-world practice, equipment utilization planning is performed based on the experience and intuition of management. The researchers evaluated the efficiency of various alternative construction equipment combinations by utilizing a process simulation model, validated the model with a case study project, and presented a methodology to find the optimized plan among the alternatives.
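
A minimal deterministic sketch of the kind of trade-off such a process simulation explores is given below: matching a number of trucks to one excavator and estimating hourly production for each fleet alternative. The cycle times and capacities are hypothetical.

```python
# Estimate hourly earthmoving production for alternative truck fleet sizes served
# by a single excavator; the excavator's loading rate caps total throughput.
def fleet_production(n_trucks, truck_capacity_m3=15.0, load_time_min=4.0,
                     haul_return_min=18.0):
    truck_cycle = load_time_min + haul_return_min
    # The excavator can serve at most one truck every load_time_min.
    trucks_served_per_hour = min(n_trucks * 60.0 / truck_cycle, 60.0 / load_time_min)
    return trucks_served_per_hour * truck_capacity_m3  # m3 per hour

for n in range(2, 8):
    print(n, round(fleet_production(n), 1))
```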

Keywords: earthwork operation, construction equipment, process simulation, optimization

Procedia PDF Downloads 421
50956 Assessment of the Impact of Teaching Methodology on Skill Acquisition in Music Education among Students in Emmanuel Alayande University of Education, Oyo

Authors: Omotayo Abidemi Funmilayo

Abstract:

Skill acquisition in professional fields has been prioritized and considered important for demonstrating mastery of the subject matter and presenting oneself as an expert in the profession. The ability to acquire skills in different fields, however, calls for different methods from the instructor or teacher during training. Music is no exception: there exist different areas of skill acquisition that require practical performance. This paper focuses on the impact and effects of different methods on the acquisition of practical knowledge in the handling of some musical instruments among the students of Emmanuel Alayande College of Education, Oyo. In this study, 30 students were selected and divided into two groups based on the selected area of learning; further divisions were made within each of the two major groups into subgroups of five students each, to be trained using different methodologies for two months at three hours per week. Comparisons of the skills acquired were made using a standard research instrument at a reliable level of significance, and tests were carried out on the thirty students considered for the study based on the area of skill acquisition. The students trained on the keyboard and saxophone using the play-way method performed best, followed by the students trained using the demonstration method, while the set of students that received instruction through the lecture method performed below average. In conclusion, the study reveals that the ability to acquire professional skill in handling musical instruments is better enhanced using the play-way method.

Keywords: music education, skill acquisition, keyboard, saxophone

Procedia PDF Downloads 65
50955 Enzymatic Synthesis of Olive-Based Ferulate Esters: Optimization by Response Surface Methodology

Authors: S. Mat Radzi, N. J. Abd Rahman, H. Mohd Noor, N. Ariffin

Abstract:

Ferulic acid has widespread industrial potential by virtue of its antioxidant properties. However, it is only partially soluble in aqueous media, limiting its usefulness in oil-based processes in the food, cosmetic, pharmaceutical, and material industries. Therefore, ferulic acid should be modified by producing more lipophilic derivatives. In this study, a preliminary investigation of the lipase-catalyzed trans-esterification reaction of ethyl ferulate with olive oil was carried out. The reaction was catalyzed by immobilized lipase from Candida antarctica (Novozym 435) to produce ferulate ester, a sunscreen agent. A statistical approach, Response Surface Methodology (RSM), was used to evaluate the interactive effects of reaction temperature (40-80°C), reaction time (4-12 hours), and amount of enzyme (0.1-0.5 g). The optimum conditions derived via RSM were a reaction temperature of 60°C, a reaction time of 2.34 hours, and an amount of enzyme of 0.3 g. The actual experimental yield was 59.6% ferulate ester under the optimum conditions, which compared well with the maximum predicted value of 58.0%.

Keywords: ferulic acid, enzymatic synthesis, esters, RSM

Procedia PDF Downloads 326
50954 Development of National Scale Hydropower Resource Assessment Scheme Using SWAT and Geospatial Techniques

Authors: Rowane May A. Fesalbon, Greyland C. Agno, Jodel L. Cuasay, Dindo A. Malonzo, Ma. Rosario Concepcion O. Ang

Abstract:

The Department of Energy of the Republic of the Philippines estimates that the country’s energy reserves for 2015 are dwindling, as observed in the rotating power outages in several localities. To help address the energy crisis, a national hydropower resource assessment scheme is developed. Hydropower is a resource derived from flowing water and differences in elevation. It is a renewable energy resource that is deemed abundant in the Philippines, an archipelagic country rich in bodies of water and water resources. The objective of this study is to develop a methodology for a national hydropower resource assessment using hydrologic modeling and geospatial techniques in order to generate resource maps for future reference and use by the government and other stakeholders. The methodology developed for this purpose is focused on two models: the implementation of the Soil and Water Assessment Tool (SWAT) for the river discharge, and the use of geospatial techniques to analyze the topography, obtain the head, and generate the theoretical hydropower potential sites. The methodology is tightly coupled with Geographic Information Systems to maximize the use of geodatabases and the spatial significance of the determined sites. The hydrologic model used in this workflow is SWAT, integrated in the GIS software ArcGIS. The head is determined by a developed algorithm that utilizes a Synthetic Aperture Radar (SAR)-derived digital elevation model (DEM) with a resolution of 10 meters. The initial results of the developed workflow indicate theoretical hydropower potential in the river reaches ranging from pico (less than 5 kW) to mini (1-3 MW) scale.
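
Once the discharge Q is obtained from the hydrologic model and the head H from the DEM, the theoretical potential at a site follows the standard relation P = ρgQH (times an efficiency for installed capacity). The sketch below evaluates this relation; the discharge, head, and efficiency values are hypothetical.

```python
# Theoretical hydropower potential P = rho * g * Q * H * eta, in kW.
RHO_WATER = 1000.0   # kg/m^3
G = 9.81             # m/s^2

def hydropower_kw(discharge_m3s, head_m, efficiency=1.0):
    """efficiency = 1.0 gives the purely theoretical potential."""
    return RHO_WATER * G * discharge_m3s * head_m * efficiency / 1000.0

print(hydropower_kw(discharge_m3s=0.08, head_m=6.0))    # pico-scale reach (~4.7 kW)
print(hydropower_kw(discharge_m3s=12.0, head_m=18.0))   # mini-scale reach (~2.1 MW)
```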

Keywords: ArcSWAT, renewable energy, hydrologic model, hydropower, GIS

Procedia PDF Downloads 306
50953 A Comparative Study of Sampling-Based Uncertainty Propagation with First Order Error Analysis and Percentile-Based Optimization

Authors: M. Gulam Kibria, Shourav Ahmed, Kais Zaman

Abstract:

In system analysis, the information on the uncertain input variables causes uncertainty in the system responses. Different probabilistic approaches for uncertainty representation and propagation in such cases exist in the literature. Different uncertainty representation approaches result in different outputs, and some of the approaches might result in a better estimation of the system response than others. The NASA Langley Multidisciplinary Uncertainty Quantification Challenge (MUQC) has posed challenges about uncertainty quantification. Subproblem A of the challenge, the uncertainty characterization subproblem, is addressed in this study. In this subproblem, the challenge is to gather knowledge about unknown model inputs, which have inherent aleatory and epistemic uncertainties, from the responses (outputs) of the given computational model. We use two different methodologies to approach the problem. In the first methodology we use sampling-based uncertainty propagation with first-order error analysis. In the other approach we place emphasis on the use of Percentile-Based Optimization (PBO). The NASA Langley MUQC subproblem A is constructed in such a way that both aleatory and epistemic uncertainties need to be managed. The challenge problem classifies each uncertain parameter as belonging to one of the following three types: (i) an aleatory uncertainty modeled as a random variable with a fixed functional form and known coefficients; this uncertainty cannot be reduced; (ii) an epistemic uncertainty modeled as a fixed but poorly known physical quantity that lies within a given interval; this uncertainty is reducible; (iii) a parameter that might be aleatory but for which sufficient data are not available to adequately model it as a single random variable. For example, the parameters of a normal variable, e.g., the mean and standard deviation, might not be precisely known but could be assumed to lie within some intervals. This results in a distributional p-box, in which the physical parameter carries an aleatory uncertainty but the parameters prescribing its mathematical model are subject to epistemic uncertainty. Each of the parameters of the random variable is an unknown element of a known interval, and this uncertainty is reducible. From the study, it is observed that, due to practical limitations or computational expense, the sampling is not exhaustive in the sampling-based methodology. As a result, the sampling-based methodology has a high probability of underestimating the output bounds. Therefore, an optimization-based strategy to convert uncertainty described by interval data into a probabilistic framework is necessary. This is achieved in this study by using PBO.
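
A minimal sketch of propagating a distributional p-box by double-loop sampling is given below: the outer loop samples the epistemically uncertain distribution parameters from their intervals, the inner loop samples the aleatory variable, and the spread of the resulting percentiles gives output bounds. The model function and intervals are hypothetical, not the challenge problem's.

```python
# Double-loop (epistemic outer, aleatory inner) sampling for a distributional p-box.
import numpy as np

rng = np.random.default_rng(4)

def model(x):
    return x**2 + 2.0 * x          # hypothetical computational model

mu_interval = (0.5, 1.5)           # epistemic: mean known only within an interval
sigma_interval = (0.8, 1.2)        # epistemic: std known only within an interval

p95_values = []
for _ in range(200):                                   # outer (epistemic) loop
    mu = rng.uniform(*mu_interval)
    sigma = rng.uniform(*sigma_interval)
    x = rng.normal(mu, sigma, size=2000)               # inner (aleatory) loop
    p95_values.append(np.percentile(model(x), 95))

print(min(p95_values), max(p95_values))                # bounds on the 95th percentile
```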

Keywords: aleatory uncertainty, epistemic uncertainty, first order error analysis, uncertainty quantification, percentile-based optimization

Procedia PDF Downloads 233
50952 Optimization of 3D Printing Parameters Using Machine Learning to Enhance Mechanical Properties in Fused Deposition Modeling (FDM) Technology

Authors: Darwin Junnior Sabino Diego, Brando Burgos Guerrero, Diego Arroyo Villanueva

Abstract:

Additive manufacturing, commonly known as 3D printing, has revolutionized modern manufacturing by enabling the agile creation of complex objects. However, challenges persist in the consistency and quality of printed parts, particularly in their mechanical properties. This study focuses on addressing these challenges through the optimization of printing parameters in FDM technology, using Machine Learning techniques. Our aim is to improve the mechanical properties of printed objects by optimizing parameters such as speed, temperature, and orientation. We implement a methodology that combines experimental data collection with Machine Learning algorithms to identify relationships between printing parameters and mechanical properties. The results demonstrate the potential of this methodology to enhance the quality and consistency of 3D printed products, with significant applications across various industrial fields. This research not only advances understanding of additive manufacturing but also opens new avenues for practical implementation in industrial settings.
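
A minimal sketch of the parameter-to-property mapping described above is given below: a regression model is fitted on (speed, temperature, orientation) versus tensile strength and then queried for a candidate setting. The data are synthetic and the choice of a random-forest regressor is illustrative, not the authors' model.

```python
# Fit a regression from FDM printing parameters to a mechanical property and query it.
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(5)
n = 60
speed = rng.uniform(20, 80, n)          # mm/s
temperature = rng.uniform(190, 230, n)  # deg C
orientation = rng.uniform(0, 90, n)     # deg

# Hypothetical noisy relationship standing in for tensile test results (MPa).
strength = (30 + 0.05 * temperature - 0.08 * speed - 0.03 * orientation
            + rng.normal(0, 1.0, n))

X = np.column_stack([speed, temperature, orientation])
model = RandomForestRegressor(n_estimators=200, random_state=0).fit(X, strength)

candidate = np.array([[40.0, 215.0, 0.0]])  # speed, temperature, orientation
print(model.predict(candidate))
```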

Keywords: 3D printing, additive manufacturing, machine learning, mechanical properties

Procedia PDF Downloads 40
50951 Engineering of E-Learning Content Creation: Case Study for African Countries

Authors: María-Dolores Afonso-Suárez, Nayra Pumar-Carreras, Juan Ruiz-Alzola

Abstract:

This research addresses the use of an e-Learning creation methodology for learning objects. Throughout the process, indicators are gathered to determine whether it responds to the main objectives of an engineering discipline. These parameters also indicate whether it is necessary to review the creation cycle and readjust any phase. Within the project developed for this study, apart from the use of structured methods, there has been a central objective: the establishment of a learning atmosphere, a place where all the professionals involved are able to collaborate, plan, solve problems, and determine guidelines to follow in order to develop creative and innovative solutions. It has been outlined as a blended learning program with an assessment plan that proposes face-to-face lessons, coaching, collaboration, multimedia and web-based learning objects, as well as support resources. The project has been planned as a long-term task; the pilot teaching actions designed provide the preliminary results that are the object of this study. This methodology is being used in the creation of learning content for the African countries of Senegal, Mauritania, and Cape Verde. It has been developed within the framework of MACbioIDi, an Interreg European project for international cooperation and development. The educational area of this project is focused on training and advising medical professionals as well as engineers in the use of medical imaging technology applications, specifically the 3DSlicer application and the Open Anatomy Browser.

Keywords: teaching contents engineering, e-learning, blended learning, international cooperation, 3dslicer, open anatomy browser

Procedia PDF Downloads 164
50950 Domain Driven Design vs Soft Domain Driven Design Frameworks

Authors: Mohammed Salahat, Steve Wade

Abstract:

This paper presents and compares the SSDDD “Systematic Soft Domain Driven Design Framework” with the DDD “Domain Driven Design Framework” as a soft systems approach to information systems development. The framework uses SSM as a guiding methodology, within which we have embedded a sequence of design tasks based on UML, leading to the implementation of a software system using the Naked Objects framework. This framework has been used in action research projects that have involved the investigation and modelling of business processes using object-oriented domain models and the implementation of software systems based on those domain models. Within this framework, Soft Systems Methodology (SSM) is used as a guiding methodology to explore the problem situation and to develop the domain model using UML for the given business domain. The framework was proposed and evaluated in our previous works; a comparison between SSDDD and DDD is presented in this paper to show how SSDDD improves on DDD as an approach to modelling and implementing business domain perspectives for information systems development. The comparison process, the results, and the improvements are presented in the following sections of this paper.

Keywords: domain-driven design, soft domain-driven design, naked objects, soft language

Procedia PDF Downloads 288
50949 Mapping the Sonic Spectrum of Traditional Music and Instruments Used in Malaysian Kavadi Rituals

Authors: Ainolnaim Azizol, Valerie Ross

Abstract:

Music is as old as mankind, and rituals using music, such as Kavadi, have been associated with social, cultural, and spiritual practices in many traditional and modern societies. Recent literature has provided scientific evidence that music brings about psychological and physical changes through the stimulation of brainwaves. Despite such advances, the scientific study of the sonic qualities peculiar to traditional instruments and how they impact ritualistic activities is still lacking. This study addresses one such phenomenon. Devotees in Kavadi rituals are known to be in a trance state and neither experience pain nor suffer injury despite the hundreds of needles pierced through their skin. Although scientists have sought to understand how this is possible, less is known about the music that is used to prepare devotees to enter the trance state. This study fills this gap in knowledge by providing scientific evidence through the identification and mapping of the sonic spectrum, or sound fingerprint, of the instruments and the repertoire used in these ritualistic forms, both in their ethnographic environment and in audio-controlled situations. The objectives are to identify and categorize the different types of traditional music used in Kavadi rituals; to record, transcribe, and digitally score the musical repertoire used in the oral tradition of Kavadi rituals; and to map the sonic spectrum of ritual music using spectrography and advanced music analysis software. A mixed methodology will be used, comprising ethnographic field studies using interviews, participant observation, and audio-video recordings, and an audio methodology using spectrography and advanced audio technology for sonic mapping and the transcription of audio recordings into digital scores.
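
A minimal sketch of the sonic-mapping step is given below: computing a spectrogram, i.e. the time-frequency fingerprint, of an audio signal. The signal is a synthetic test tone; field recordings of the ritual instruments would be loaded from audio files instead.

```python
# Compute a spectrogram of a synthetic tone and report its dominant frequency band.
import numpy as np
from scipy.signal import spectrogram

fs = 22_050                                        # sampling rate in Hz
t = np.arange(0, 2.0, 1.0 / fs)
# Hypothetical test tone: a 440 Hz fundamental with a weaker 880 Hz overtone.
signal = np.sin(2 * np.pi * 440 * t) + 0.3 * np.sin(2 * np.pi * 880 * t)

freqs, times, power = spectrogram(signal, fs=fs, nperseg=1024)
peak_bin = power.mean(axis=1).argmax()
print(f"dominant frequency ~ {freqs[peak_bin]:.1f} Hz")
```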

Keywords: sonic, traditional, ritual, Kavadi, music

Procedia PDF Downloads 238
50948 The Unique Journeys from Different Pasts to Multiple Presents in the Work of the Pritzker Prize Laureates of 2010-2020

Authors: Christakis Chatzichristou, Kyriakos Myltiadou

Abstract:

The paper discusses how the Pritzker Prize Laureates of the last decade themselves identify the various ways in which different aspects or interpretations of the past have influenced their design methodologies. As recipients of what is considered the most prestigious award in architecture, these architects are worth examining not only because of their exemplary work but also because of the strong influence they have on architectural culture in general. Rather than attempting to interpret their projects, the methodology chosen focuses on what the architects themselves have to say on the subject. The research aims to reveal, and as the tabular form of the findings shows also succeeds in revealing, the numerous and diverse ways in which different aspects of what is termed the Past can potentially enrich contemporary design practices.

Keywords: design methodology, Pritzker Prize Laureates, past, culture, tradition

Procedia PDF Downloads 29
50947 Application of Response Surface Methodology in Optimizing Chitosan-Argan Nutshell Beads for Radioactive Wastewater Treatment

Authors: F. F. Zahra, E. G. Touria, Y. Samia, M. Ahmed, H. Hasna, B. M. Latifa

Abstract:

The presence of radioactive contaminants in wastewater poses a significant environmental and health risk, necessitating effective treatment solutions. This study investigates the optimization of chitosan-Argan nutshell beads for the removal of radioactive elements from wastewater, utilizing Response Surface Methodology (RSM) to enhance the treatment efficiency. Chitosan, known for its biocompatibility and adsorption properties, was combined with Argan nutshell powder to form composite beads. These beads were then evaluated for their capacity to remove radioactive contaminants from synthetic wastewater. The Box-Behnken design (BBD) under RSM was employed to analyze the influence of key operational parameters, including initial contaminant concentration, pH, bead dosage, and contact time, on the removal efficiency. Experimental results indicated that all tested parameters significantly affected the removal efficiency, with initial contaminant concentration and pH showing the most substantial impact. The optimized conditions, as determined by RSM, were found to be an initial contaminant concentration of 50 mg/L, a pH of 6, a bead dosage of 0.5 g/L, and a contact time of 120 minutes. Under these conditions, the removal efficiency reached up to 95%, demonstrating the potential of chitosan-Argan nutshell beads as a viable solution for radioactive wastewater treatment. Furthermore, the adsorption process was characterized by fitting the experimental data to various isotherm and kinetic models. The adsorption isotherms conformed well to the Langmuir model, indicating monolayer adsorption, while the kinetic data were best described by the pseudo-second-order model, suggesting chemisorption as the primary mechanism. This study highlights the efficacy of chitosan-Argan nutshell beads in removing radioactive contaminants from wastewater and underscores the importance of optimizing treatment parameters using RSM. The findings provide a foundation for developing cost-effective and environmentally friendly treatment technologies for radioactive wastewater.
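
A minimal sketch of the isotherm-fitting step is given below: the Langmuir model q_e = q_max·K_L·C_e/(1 + K_L·C_e) is fitted to equilibrium data by nonlinear least squares. The concentration and uptake values are hypothetical, not the study's measurements.

```python
# Fit the Langmuir isotherm to hypothetical equilibrium adsorption data.
import numpy as np
from scipy.optimize import curve_fit

def langmuir(c_e, q_max, k_l):
    return q_max * k_l * c_e / (1.0 + k_l * c_e)

c_e = np.array([5.0, 10.0, 20.0, 35.0, 50.0])         # equilibrium concentration, mg/L
q_e = np.array([12.1, 19.8, 27.5, 32.0, 34.2])         # adsorbed amount, mg/g

(q_max_fit, k_l_fit), _ = curve_fit(langmuir, c_e, q_e, p0=[40.0, 0.1])
print(q_max_fit, k_l_fit)
```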

Keywords: adsorption, argan nutshell, beads, chitosan, mechanism, optimization, radioactive wastewater, response surface methodology

Procedia PDF Downloads 19