Search results for: Andre Di Carlo
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 526

376 Integration of an AS-i Network in Lab Automation and Industrial Networks at IFCE

Authors: Jorge Fernandes Teixeira Filho, André Oliveira Alcantara Fontenele, Érick Aragão Ribeiro

Abstract:

The constant emergence of new technologies used in automated processes makes it necessary for teachers and trainers to apply these technologies in their classes. This paper presents an application of a new technology in a didactic plant that represents an effluent treatment process located in a laboratory of a federal educational institution. First, all components to be placed in the automation laboratory were studied in order to determine how to program, parameterize and organize the plant. The new technologies implemented in the process are essentially an AS-i network, a Profinet network and a SCADA system, which represented a major innovation in the laboratory. The project makes it possible to carry out various practices involving industrial networks and SCADA systems in the laboratory.

Keywords: automation, industrial networks, SCADA systems, lab automation

Procedia PDF Downloads 526
375 Staphylococcus aureus Septic Arthritis and Necrotizing Fasciitis in a Patient with Undiagnosed Diabetes Mellitus

Authors: Pedro Batista, André Vinha, Filipe Castelo, Bárbara Costa, Ricardo Sousa, Raquel Ricardo, André Pinto

Abstract:

Background: Septic arthritis is a diagnosis that must be considered in any patient presenting with acute joint swelling and fever. Among the several risk factors for septic arthritis, such as age, rheumatoid arthritis, recent surgery, or skin infection, diabetes mellitus can sometimes be the main risk factor. Staphylococcus aureus is the most common pathogen isolated in septic arthritis; however, it is uncommon in monomicrobial necrotizing fasciitis. Objectives: A case report of concomitant septic arthritis and necrotizing fasciitis in a patient with undiagnosed diabetes based on clinical history. Study Design & Methods: We report the case of a 58-year-old Portuguese, previously healthy man who presented to the emergency department with fever and left knee swelling and pain for two days. The blood work revealed ketonemia of 6.7 mmol/L and glycemia of 496 mg/dL. The vital signs were significant for a temperature of 38.5 ºC and a heart rate of 123 bpm. The left knee showed edema and inflammatory signs. Computed tomography of the left knee showed diffuse edema of the subcutaneous cellular tissue and soft tissue air bubbles. A diagnosis of septic arthritis and necrotizing fasciitis was made. He was taken to the operating room for surgical debridement. The samples collected intraoperatively were sent for microbiological analysis, revealing infection by multi-sensitive Staphylococcus aureus. Given this result, empiric flucloxacillin (500 mg IV) and clindamycin (1000 mg IV) were maintained for 3 weeks. On the seventh day of hospitalization, there was a significant improvement in the subcutaneous and musculoskeletal tissues. After two weeks of hospitalization, there was no purulent content and partial closure of the wounds was possible. After 3 weeks, he was switched to oral antibiotics (flucloxacillin 500 mg). A week later, a urinary infection by Pseudomonas aeruginosa was diagnosed and ciprofloxacin 500 mg was administered for 7 days without complications. After 30 days of hospital admission, the patient was discharged home and recovered. Results: The final diagnosis of concomitant septic arthritis and necrotizing fasciitis was made based on the imaging findings, surgical exploration and microbiological test results. Conclusions: Early antibiotic administration and surgical debridement are key in the management of septic arthritis and necrotizing fasciitis. Furthermore, risk factor control (euglycemic blood glucose levels) must always be taken into account given its crucial role in the patient's recovery.

Keywords: septic arthritis, necrotizing fasciitis, diabetes, Staphylococcus aureus

Procedia PDF Downloads 294
374 A Correlational Study between Parentification and Memory Retention among Parentified Female Adolescents: A Neurocognitive Perspective on Parentification

Authors: Mary Dorothy Roxas, Jeian Mae Dungca, Reginald Agor, Beatriz Figueroa, Lennon Andre Patricio, Honey Joy Cabahug

Abstract:

Parentification occurs when children are expected to provide instrumental or emotional caregiving within the family. Parentification has been found to affect adolescents’ cognitive and emotional vulnerability. Attachment theory helps clarify the process of parentification, as it involves the relationship between the child and the parent. Carandang’s theory of “taga-salo” helps explain parentification in the Philippine setting. The present study examined the potential risk of parentification to adolescents’ memory retention by hypothesizing that there is a correlation between the two. The research was conducted with 249 female adolescents aged 12-24 residing in Valenzuela City. Results indicated that there is a significant inverse correlation between parentification and memory retention.

Keywords: memory retention, neurocognitive, parentification, stress

Procedia PDF Downloads 617
373 Probabilistic Analysis of Bearing Capacity of Isolated Footing using Monte Carlo Simulation

Authors: Sameer Jung Karki, Gokhan Saygili

Abstract:

The allowable bearing capacity of foundation systems is determined by applying a factor of safety to the ultimate bearing capacity. Conventional ultimate bearing capacity calculation routines are based on deterministic input parameters, in which the nonuniformity and inhomogeneity of soil and site properties are not accounted for, so probability calculus and statistical analysis are not directly applied in foundation engineering. It is assumed that the factor of safety, typically as high as 3.0, incorporates the uncertainty of the input parameters. This factor of safety is estimated based on subjective judgement rather than objective facts, and it is an ambiguous term. Hence, a probabilistic analysis of the bearing capacity of an isolated footing on a clayey soil is carried out using the Monte Carlo simulation method. The simulated model was compared with the traditional discrete model, and the bearing capacity of the soil was found to be higher for the simulated model than for the discrete model. This was verified by a sensitivity analysis: as the number of simulations was increased, there was a significant percentage increase in the bearing capacity compared with the discrete bearing capacity. The bearing capacity values obtained by simulation were found to follow a normal distribution. Using the traditional factor of safety of 3, the allowable bearing capacity had a lower probability (0.03717) of occurring in the field than the higher probability (0.15866) obtained using the simulation-derived factor of safety of 1.5. This means the traditional factor of safety yields a bearing capacity that is less likely to occur or be available in the field. This shows the subjective nature of the factor of safety; hence, a probabilistic method is suggested to address the variability of the input parameters in bearing capacity equations.
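
As a rough illustration of the probabilistic approach described above, the sketch below draws random cohesion, friction angle and unit weight, propagates them through classical bearing capacity factors, and compares failure probabilities for two factors of safety. All soil statistics, the footing geometry and the capacity expression are hypothetical placeholders rather than the paper's data, so the numbers will not reproduce the probabilities quoted in the abstract.

```python
import numpy as np

rng = np.random.default_rng(42)
n = 200_000

# Hypothetical statistics for a clayey c-phi soil (not the paper's data)
c     = rng.normal(40.0, 10.0, n)             # cohesion, kPa
phi   = np.radians(rng.normal(20.0, 2.0, n))  # friction angle
gamma = rng.normal(18.0, 1.0, n)              # unit weight, kN/m^3
B, Df = 1.5, 1.0                              # footing width and depth, m

# Classical (Prandtl/Reissner-type) bearing capacity factors
Nq = np.exp(np.pi * np.tan(phi)) * np.tan(np.pi / 4 + phi / 2) ** 2
Nc = (Nq - 1.0) / np.tan(phi)
Ng = 2.0 * (Nq + 1.0) * np.tan(phi)

q_ult = c * Nc + gamma * Df * Nq + 0.5 * gamma * B * Ng   # ultimate capacity, kPa

mu, sd = q_ult.mean(), q_ult.std()
print(f"simulated q_ult: mean = {mu:.0f} kPa, std = {sd:.0f} kPa")

# Probability that the field capacity falls below the allowable value mu/FS
for FS in (3.0, 1.5):
    print(f"FS = {FS}: P(q_ult < mean/FS) = {(q_ult < mu / FS).mean():.5f}")
```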

Keywords: bearing capacity, factor of safety, isolated footing, Monte Carlo simulation

Procedia PDF Downloads 170
372 The Impact of Dispatching with Rolling Horizon Control in Sizing Thermal Storage for Solar Tower Plant Participating in Wholesale Spot Electricity Market

Authors: Navid Mohammadzadeh, Huy Truong-Ba, Michael Cholette

Abstract:

The solar tower (ST) plant is a promising technology for exploiting large-scale solar irradiation. With thermal energy storage, an ST plant has the potential to shift generation to high electricity price periods. However, the size of the storage limits the dispatchability of the plant, particularly when it must cope with uncertainty in forecasts of solar irradiation and electricity prices. The purpose of this study is to explore the size of storage when Rolling Horizon Control (RHC) is employed for dispatch scheduling. To this end, RHC is benchmarked against a perfect knowledge (PK) forecast and two day-ahead dispatching policies. Through optimisation of dispatch planning with the PK policy, the optimal achievable profit for a specific storage size is determined. A sensitivity analysis using Monte Carlo simulation is conducted, and the storage size for the RHC and day-ahead policies is determined with the objective of reaching the profit obtained from the PK policy. A case study is conducted for a hypothetical ST plant with thermal storage located in South Australia, dispatching under two market scenarios: 1) fixed price and 2) wholesale spot price. The impact of each individual source of uncertainty on storage size is examined for January and August. The results show that dispatching with the RH controller reaches the optimal achievable profit with ~15% smaller storage than the day-ahead policies. The results of this study may be applied to the CSP plant design procedure.
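
The rolling-horizon idea can be sketched in a toy form: at each hour a plan is built from noisy price and irradiation forecasts over a short horizon, only the first step is implemented, and the horizon rolls forward. The storage size, horizon length, price model and greedy planning rule below are all assumptions for illustration; the paper's actual optimizer and market model are not reproduced.

```python
import numpy as np

rng = np.random.default_rng(1)
T, H = 48, 6                   # simulated hours and rolling-horizon length (assumed)
E_max, P_max = 800.0, 100.0    # storage capacity and hourly turbine draw, MWh-thermal (hypothetical)

solar = np.clip(120 * np.sin(np.linspace(0, 4 * np.pi, T)), 0, None)  # collected heat, MWh/h
price = 40 + 30 * rng.random(T)                                       # spot price, $/MWh

E, revenue = 0.0, 0.0
for t in range(T):
    # imperfect forecasts over the horizon: the uncertainty RHC must cope with
    h = slice(t, min(t + H, T))
    price_fc = price[h] + rng.normal(0, 5, price[h].size)
    solar_fc = np.clip(solar[h] + rng.normal(0, 10, solar[h].size), 0, None)

    # simple chronological plan: discharge whenever the forecast price is in the
    # top half of the horizon, respecting simulated storage limits
    thresh, E_sim, first_step = np.median(price_fc), E, 0.0
    for k in range(price_fc.size):
        E_sim = min(E_sim + solar_fc[k], E_max)
        d = min(P_max, E_sim) if price_fc[k] >= thresh else 0.0
        E_sim -= d
        if k == 0:
            first_step = d

    # implement only the first step of the plan, then roll the horizon forward
    dispatch = min(first_step, E + solar[t], P_max)
    E = min(E + solar[t] - dispatch, E_max)
    revenue += dispatch * price[t]

print(f"revenue over {T} h: {revenue:.0f} $, final storage level: {E:.0f} MWh")
```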

Keywords: solar tower plant, spot market, thermal storage system, optimized dispatch planning, sensitivity analysis, Monte Carlo simulation

Procedia PDF Downloads 112
371 Optimal Design of Tuned Inerter Damper-Based System for the Control of Wind-Induced Vibration in Tall Buildings through Cultural Algorithm

Authors: Luis Lara-Valencia, Mateo Ramirez-Acevedo, Daniel Caicedo, Jose Brito, Yosef Farbiarz

Abstract:

Controlling wind-induced vibrations, as well as aerodynamic forces, is an essential part of the structural design of tall buildings in order to guarantee the serviceability limit state of the structure. This paper presents a numerical investigation of the optimal design parameters of a Tuned Inerter Damper (TID) based system for the control of wind-induced vibration in tall buildings. The control system is based on the conventional TID, with the main difference that its location is changed from the ground level to the last two story levels of the structural system. The TID tuning procedure is based on an evolutionary cultural algorithm in which the optimum design variables, defined as the frequency and damping ratios, are searched according to the optimization criterion of minimizing the root mean square (RMS) response of displacements at the nth story of the structure. A Monte Carlo simulation was used to represent the dynamic action of the wind in the time domain, in which a time series derived from the Davenport spectrum using eleven harmonic functions with randomly chosen phase angles was reproduced. The above-mentioned methodology was applied to a case study derived from a 37-story prestressed concrete building 144 m in height, in which the wind action exceeds the seismic action. The results showed that the optimally tuned TID is effective in reducing the RMS response of displacements by up to 25%, which demonstrates the feasibility of the system for the control of wind-induced vibrations in tall buildings.
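
The wind-generation step described above (a time series built from the Davenport spectrum with eleven harmonics and random phase angles) can be sketched as follows. The mean wind speed, drag coefficient and frequency band are assumed values for illustration only.

```python
import numpy as np

rng = np.random.default_rng(7)

V10   = 30.0    # mean wind speed at 10 m, m/s (assumed)
kappa = 0.005   # surface drag coefficient (assumed)
n_h   = 11      # number of harmonic components, as in the abstract

def davenport(f):
    """One-sided Davenport spectrum of along-wind turbulence, (m/s)^2 / Hz."""
    x = 1200.0 * f / V10
    return 4.0 * kappa * V10**2 * x**2 / (f * (1.0 + x**2) ** (4.0 / 3.0))

# discretize an assumed frequency band into eleven components
f = np.linspace(0.01, 1.0, n_h)
df = f[1] - f[0]
amp = np.sqrt(2.0 * davenport(f) * df)        # harmonic amplitudes
phase = rng.uniform(0.0, 2.0 * np.pi, n_h)    # random phase angles

t = np.arange(0.0, 600.0, 0.1)                # 10-minute record sampled at 10 Hz
u = sum(a * np.cos(2 * np.pi * fi * t + ph) for a, fi, ph in zip(amp, f, phase))

wind = V10 + u                                # total wind speed history
print(f"fluctuation std = {u.std():.2f} m/s over {t[-1]:.0f} s")
```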

Keywords: evolutionary cultural algorithm, Monte Carlo simulation, tuned inerter damper, wind-induced vibrations

Procedia PDF Downloads 124
370 Passive Vibration Isolation Analysis and Optimization for Mechanical Systems

Authors: Ozan Yavuz Baytemir, Ender Cigeroglu, Gokhan Osman Ozgen

Abstract:

Vibration is an important issue in the design of various components of aerospace, marine and vehicular applications. In order not to lose the components’ function and operational performance, vibration isolation design, involving the selection of optimum isolator properties and the isolator positioning process, is a critical study. Given the growing need for vibration isolation system design, this paper presents two software tools capable of implementing modal analysis, response analysis for both random and harmonic excitations, static deflection analysis and Monte Carlo simulations, in addition to parameter and location optimization for different types of isolation problem scenarios. A review of the literature shows no study that develops a software-based tool capable of implementing all of those analysis, simulation and optimization studies in one platform simultaneously. In this paper, the theoretical system model is generated for a 6-DOF rigid body. The vibration isolation system of any mechanical structure can be optimized using a hybrid method involving both global search and gradient-based methods. After defining the optimization design variables, different types of optimization scenarios are listed in detail. Being aware of the need for a user-friendly vibration isolation problem solver, two graphical user interfaces (GUIs) were prepared and verified using a commercial finite element analysis program, Ansys Workbench 14.0. Using the analysis and optimization capabilities of these GUIs, a real application used in an air platform is also presented as a case study at the end of the paper.

Keywords: hybrid optimization, Monte Carlo simulation, multi-degree-of-freedom system, parameter optimization, location optimization, passive vibration isolation analysis

Procedia PDF Downloads 549
369 Towards a Competitive South African Tooling Industry

Authors: Mncedisi Trinity Dewa, Andre Francois Van Der Merwe, Stephen Matope

Abstract:

Tool, Die and Mould-making (TDM) firms are known to play a pivotal role in the growth and development of the manufacturing sectors of most economies. Their output contributes significantly to the quality, cost and delivery speed of final manufactured parts. Unfortunately, South African Tool, Die and Mould-making manufacturers have not been competing on the local or global market in a significant way. This reality has hampered the productivity and growth of the sector, thus attracting intervention. The paper explores the shortcomings South African toolmakers have to overcome to restore their competitive position globally. Results from a global benchmarking survey on the tooling sector are used to establish a roadmap of what South African toolmakers can do to become a productive, world-class force on the global market.

Keywords: competitive performance objectives, toolmakers, world-class manufacturing, lead times

Procedia PDF Downloads 504
368 Development Models for Graphical Human Interfaces Using Fuzzy Logic

Authors: Érick Aragão Ribeiro, George André Pereira Thé, José Marques Soares

Abstract:

Graphical human interfaces, also known as supervision software, are increasingly present in industrial processes supported by Supervisory Control and Data Acquisition (SCADA) systems, so the need for qualified developers is evident. In order to make engineering students able to produce high-quality supervision software, methods for its development must be created. In this paper we propose a model, based on the international standards ISO/IEC 25010 and ISO/IEC 25040, for the development of graphical human interfaces. When compared with other methods through experiments, the model presented here leads to improved quality indexes and therefore helps guide the decisions of programmers. Results show the efficiency of the model and its contribution to student learning. Students assessed the training they received and considered it satisfactory.

Keywords: software development models, software quality, supervision software, fuzzy logic

Procedia PDF Downloads 361
367 Numerical Response of Coaxial HPGe Detector for Skull and Knee Measurement

Authors: Pabitra Sahu, M. Manohari, S. Priyadharshini, R. Santhanam, S. Chandrasekaran, B. Venkatraman

Abstract:

Radiation workers in reprocessing plants have a potential for internal exposure due to actinides and fission products. Radionuclides like americium, lead, polonium and europium are bone seekers and accumulate in the skeleton. As the major skeletal content is in the skull (13%) and knee (22%), measurements of old intake have to be carried out in the skull and knee. At the Indira Gandhi Centre for Atomic Research, a twin HPGe-based actinide monitor is used for the measurement of actinides present in bone. Efficiency estimation, which is one of the prerequisites for the quantification of radionuclides, requires anthropomorphic phantoms, and such phantoms are very limited. Hence, in this study, efficiency curves for the twin HPGe-based actinide monitoring system are established theoretically using the FLUKA Monte Carlo method and the ICRP adult male voxel phantom. For skull measurement, the detector is placed over the forehead, and for knee measurement, one detector is placed over each knee. The efficiency values for radionuclides present in the knee and skull vary from 3.72E-04 to 4.19E-04 CPS/photon and 5.22E-04 to 7.07E-04 CPS/photon, respectively, for the energy range 17 to 3000 keV. The efficiency curves for the measurement are established, and it is found that the efficiency value initially increases up to 100 keV and then starts decreasing. The skull efficiency values are 4% to 63% higher than those of the knee, depending on the energy, for all energies except 17.74 keV. The reason is the closeness of the detector to the skull compared to the knee. For 17.74 keV, however, the efficiency of the knee is higher than that of the skull due to the greater attenuation in the skull bones because of their greater thickness. The Minimum Detectable Activity (MDA) for 241Am present in the skull and knee is 9 Bq. 239Pu has an MDA of 950 Bq and 1270 Bq for the knee and skull, respectively, for a counting time of 1800 sec. This paper discusses the simulation method and the results obtained in the study.
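
The abstract quotes minimum detectable activities for the counting system; a Currie-type expression is commonly used for such estimates, as sketched below. The background counts, efficiency and gamma yield are placeholders (the abstract does not give them), so the result will not reproduce the MDA values quoted above.

```python
import numpy as np

def mda_currie(background_counts, efficiency_cps_per_photon, gamma_yield, live_time_s):
    """Currie-type minimum detectable activity (Bq) for a counting measurement."""
    ld = 2.71 + 4.65 * np.sqrt(background_counts)   # detection limit, counts
    return ld / (efficiency_cps_per_photon * gamma_yield * live_time_s)

# Placeholder inputs for a 1800 s count (illustrative only)
mda = mda_currie(background_counts=400, efficiency_cps_per_photon=6e-4,
                 gamma_yield=0.36, live_time_s=1800)
print(f"MDA ~ {mda:.0f} Bq (placeholder inputs; will not match the quoted values)")
```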

Keywords: FLUKA Monte Carlo method, ICRP adult male voxel phantom, knee, skull

Procedia PDF Downloads 35
366 Quasi-Photon Monte Carlo on Radiative Heat Transfer: An Importance Sampling and Learning Approach

Authors: Utkarsh A. Mishra, Ankit Bansal

Abstract:

At high temperatures, radiative heat transfer is the dominant mode of heat transfer. It is governed by various phenomena such as photon emission, absorption, and scattering. The solution of the governing integro-differential equation of radiative transfer is a complex process, even more so when the effects of the participating medium and wavelength properties are taken into consideration. Although a generic formulation of such radiative transport problems can be modeled for a wide variety of problems with non-gray, non-diffusive surfaces, there is always a trade-off between the simplicity and the accuracy of the problem. Recently, solutions of complicated mathematical problems with statistical methods based on the randomization of naturally occurring phenomena have gained significant importance. Photon bundles with discrete energy can be replicated with random numbers describing the emission, absorption, and scattering processes. Photon Monte Carlo (PMC) is a simple, yet powerful technique to solve radiative transfer problems in complicated geometries with an arbitrary participating medium. The method, on the one hand, increases the accuracy of estimation and, on the other hand, increases the computational cost. The participating media, generally gases such as CO₂, CO, and H₂O, present complex emission and absorption spectra. Modeling the emission/absorption accurately with random numbers requires weighted sampling, as different sections of the spectrum carry different importance. Importance sampling (IS) was implemented to sample random photons of arbitrary wavelength, and the sampled data provided unbiased training of MC estimators for better results. A better replacement for uniform random numbers is deterministic, quasi-random sequences. Halton, Sobol, and Faure low-discrepancy sequences are used in this study. They possess better space-filling performance than the uniform random number generator and give rise to low-variance, stable Quasi-Monte Carlo (QMC) estimators with faster convergence. An optimal supervised learning scheme was further considered to reduce the computational cost of the PMC simulation. A one-dimensional plane-parallel slab problem with participating media was formulated. The history of some randomly sampled photon bundles is recorded to train an Artificial Neural Network (ANN) back-propagation model. The flux was calculated using the standard quasi-PMC and was considered to be the training target. Results obtained with the proposed model for the one-dimensional problem are compared with the exact analytical solution and the PMC model with the line-by-line (LBL) spectral model. The approximate variance obtained was around 3.14%. Results were analyzed with respect to time and the total flux in both cases. A significant reduction in variance as well as a faster rate of convergence was observed for the QMC method over the standard PMC method. However, the ANN method resulted in greater variance (around 25-28%) compared to the other cases. There is great scope for machine learning models to help further reduce the computational cost once trained successfully. Multiple ways of selecting the input data, as well as various architectures, will be tried so that the problem environment can be fully represented in the ANN model. Better results can be achieved in this unexplored domain.
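
A minimal sketch of the quasi-random idea on a toy problem: a one-dimensional attenuation integral is estimated with pseudo-random sampling and with a scrambled Sobol low-discrepancy sequence, and the absolute errors are compared. The kernel and sample sizes are illustrative only and do not represent the authors' plane-parallel slab formulation.

```python
import numpy as np
from scipy.stats import qmc

rng = np.random.default_rng(0)

def transmitted_fraction(u, tau=5.0):
    """Toy radiative kernel: attenuation exp(-tau*u) averaged over u ~ U(0,1)."""
    return np.exp(-tau * u)

exact = (1.0 - np.exp(-5.0)) / 5.0   # analytic value of the toy integral
m, trials = 12, 20                   # 2**12 = 4096 samples per estimate

err_mc, err_qmc = [], []
for _ in range(trials):
    u_mc = rng.random(2**m)                                                  # pseudo-random
    u_qmc = qmc.Sobol(d=1, scramble=True, seed=rng).random_base2(m).ravel()  # low-discrepancy
    err_mc.append(abs(transmitted_fraction(u_mc).mean() - exact))
    err_qmc.append(abs(transmitted_fraction(u_qmc).mean() - exact))

print(f"mean |error|  MC: {np.mean(err_mc):.2e}   QMC (Sobol): {np.mean(err_qmc):.2e}")
```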

Keywords: radiative heat transfer, Monte Carlo Method, pseudo-random numbers, low discrepancy sequences, artificial neural networks

Procedia PDF Downloads 209
365 Role of Spatial Variability in the Service Life Prediction of Reinforced Concrete Bridges Affected by Corrosion

Authors: Omran M. Kenshel, Alan J. O'Connor

Abstract:

Estimating the service life of Reinforced Concrete (RC) bridge structures located in corrosive marine environments is of great importance to their owners/engineers. Traditionally, bridge owners/engineers relied more on subjective engineering judgment, e.g. visual inspection, in their estimation approach. However, because financial resources are often limited, rational calculation methods of estimation are needed to aid in making reliable and more accurate predictions of the service life of RC structures, in order to direct funds to the bridges found to be the most critical. Criticality of the structure can be considered either from the Structural Capacity (i.e. Ultimate Limit State) or from the Serviceability viewpoint, whichever is adopted. This paper considers the service life of the structure only from the Structural Capacity viewpoint. Considering the great variability associated with the parameters involved in the estimation process, a probabilistic approach is most suited. The probabilistic modelling adopted here used the Monte Carlo simulation technique to estimate the reliability (i.e. probability of failure) of the structure under consideration. In this paper the authors used their own experimental data for the Correlation Length (CL) of the most important deterioration parameters. The CL is a parameter of the Correlation Function (CF) by which the spatial fluctuation of a certain deterioration parameter is described. The CL data used here were produced by analyzing 45 chloride profiles obtained from a 30-year-old RC bridge located in a marine environment. The service life of the structure was predicted in terms of the load carrying capacity of an RC bridge beam girder. The analysis showed that the influence of spatial variability (SV) is only evident if the reliability of the structure is governed by the flexure failure mode rather than by the shear failure mode.
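
A much-reduced sketch of how a correlation length enters such a Monte Carlo reliability run is given below: a one-dimensional random field of a deterioration parameter is generated along the girder with an exponential correlation function, and the flexural capacity is governed by the worst section. The field statistics, load model and capacity model are invented for illustration and are not the authors' data.

```python
import numpy as np

rng = np.random.default_rng(3)

L, n_seg = 20.0, 40        # girder length (m) and number of segments (assumed)
x = np.linspace(0.0, L, n_seg)
CL = 2.0                   # correlation length of the deterioration parameter, m (assumed)

# exponential correlation function and its Cholesky factor for field generation
rho = np.exp(-np.abs(x[:, None] - x[None, :]) / CL)
chol = np.linalg.cholesky(rho + 1e-10 * np.eye(n_seg))

n_sim = 50_000
M_demand = rng.normal(900.0, 90.0, n_sim)     # kNm, assumed load effect
M0 = 1500.0                                   # kNm, undegraded flexural capacity (assumed)

failures = 0
for i in range(n_sim):
    z = chol @ rng.standard_normal(n_seg)
    loss = np.clip(0.15 + 0.05 * z, 0.0, 0.6)   # spatially correlated capacity loss fraction
    M_cap = M0 * (1.0 - loss).min()             # capacity governed by the worst section
    failures += M_cap < M_demand[i]

print(f"probability of flexural failure ~ {failures / n_sim:.4f}")
```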

Keywords: chloride-induced corrosion, Monte Carlo simulation, reinforced concrete, spatial variability

Procedia PDF Downloads 462
364 Reliability Analysis of Variable Stiffness Composite Laminate Structures

Authors: A. Sohouli, A. Suleman

Abstract:

This study focuses on the reliability analysis of variable stiffness composite laminate structures to investigate the potential structural improvement compared to conventional (straight fiber) composite laminate structures. A computational framework was developed which consists of a deterministic design step and a reliability analysis. The optimization part is Discrete Material Optimization (DMO), and the reliability of the structure is computed by Monte Carlo Simulation (MCS) after using the Stochastic Response Surface Method (SRSM). The design driver in the deterministic optimization is maximum stiffness, while the optimization method incorporates certain manufacturing constraints to attain industrial relevance: the change of orientation between adjacent patches cannot be too large, and the maximum number of successive plies of a particular fiber orientation should not be too high. Variable stiffness composites may be manufactured by Automated Fiber Placement (AFP) machines, which provide consistent quality with good production rates. However, laps and gaps are the most important challenges in steering fibers and affect the performance of the structures. In this study, the optimal curved fiber paths at each layer of the composites are designed in the first step by DMO, and then the reliability analysis is applied to investigate the sensitivity of the structure under different standard deviations compared to the straight fiber angle composites. The random variables are the material properties and the loads on the structures. The results show that the variable stiffness composite laminate structures are much more reliable, even for high standard deviations of the material properties, than the conventional composite laminate structures. The reason is that variable stiffness composite laminates allow tailoring of the stiffness and provide the possibility of adjusting the stress and strain distributions favorably in the structures.

Keywords: material optimization, Monte Carlo simulation, reliability analysis, response surface method, variable stiffness composite structures

Procedia PDF Downloads 503
363 A Comparison of Image Data Representations for Local Stereo Matching

Authors: André Smith, Amr Abdel-Dayem

Abstract:

The stereo matching problem, while having been present for several decades, continues to be an active area of research. The goal of this research is to find correspondences between elements found in a set of stereoscopic images. With these pairings, it is possible to infer the distance of objects within a scene, relative to the observer. Advancements in this field have led to experimentation with various techniques, from graph-cut energy minimization to artificial neural networks. At the basis of these techniques is a cost function, which is used to evaluate the likelihood of a particular match between points in each image. While, at its core, the cost is based on comparing the image pixel data, there is a general lack of consistency as to what image data representation to use. This paper presents an experimental analysis comparing the effectiveness of the more common image data representations. The goal is to determine the effectiveness of these data representations in reducing the cost for the correct correspondence relative to other possible matches.
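
For reference, the window-based cost at the heart of local stereo matching can be sketched as below, here with a plain sum-of-absolute-differences cost on grayscale intensities (one of many possible data representations) and a winner-takes-all disparity selection; the images and parameters are synthetic.

```python
import numpy as np
from scipy.ndimage import uniform_filter

def sad_disparity(left, right, max_disp=16, window=5):
    """Winner-takes-all disparity map from a sum-of-absolute-differences cost."""
    h, w = left.shape
    cost = np.full((h, w, max_disp), np.inf)
    for d in range(max_disp):
        # pixel x in the left image is compared against pixel x-d in the right image
        diff = np.abs(left[:, d:] - right[:, :w - d])
        # aggregate the per-pixel cost over the matching window (box filter)
        cost[:, d:, d] = uniform_filter(diff, size=window, mode="nearest")
    return cost.argmin(axis=2)

# tiny synthetic pair: the scene is shifted by 4 pixels between the two views
rng = np.random.default_rng(0)
left = rng.random((40, 60))
right = np.roll(left, -4, axis=1)
disp = sad_disparity(left, right, max_disp=8)
print("median estimated disparity:", np.median(disp[:, 8:-8]))
```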

Keywords: colour data, local stereo matching, stereo correspondence, disparity map

Procedia PDF Downloads 358
362 Identifying Concerned Citizen Communication Style During the State Parliamentary Elections in Bavaria

Authors: Volker Mittendorf, Andre Schmale

Abstract:

In this case study, we explore the Twitter use of candidates during the 2018 state parliamentary election year in Bavaria, Germany. The paper focuses on the seven parties that were likely to enter the parliament. Against this background, the paper classifies the use of language as populism, which itself is considered a political communication style. First, we determine the election campaigns that started on Twitter in 2017, and then we categorize the posting times of the different direct candidates in order to derive ideal types from our empirical data. Second, we conduct the exploration based on a dictionary of concerned citizens, which contains German political language of the right and the far right. Accordingly, we analyze the corpus with methods of text mining and social network analysis, and afterwards we display the results in a network of words of the concerned citizen communication style (CCCS).

Keywords: populism, communication style, election, text mining, social media

Procedia PDF Downloads 136
361 Loan Portfolio Quality and Bank Soundness in the ECCAS: An Empirical Evaluation of Cameroonian Banks

Authors: Andre Kadandji, Mouhamadou Fall, Francois Koum Ekalle

Abstract:

This paper aims to analyze bank soundness, measured by the Z-score, through the effects of loan portfolio deterioration in the Cameroonian banking sector. The approach is to test the effect of other CAMEL indicators and macroeconomic indicators on the relationship between non-performing loans and the soundness of Cameroonian banks. We use a dynamic panel of 13 banks for the period 2010-2013. The analysis provides model equations embedded in panel data. For the estimation, we use the generalized method of moments to understand the effects of macroeconomic and CAMEL-type variables on the ability of Cameroonian banks to face a shock. We find that management quality and macroeconomic variables neutralize the effects of non-performing loans on bank soundness.
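
The bank Z-score used as the soundness measure is commonly computed as the return on assets plus the equity-to-assets ratio, divided by the standard deviation of the return on assets; a sketch with purely illustrative figures is given below (the paper's exact window and data are not stated in the abstract).

```python
import numpy as np

def bank_z_score(roa_series, equity, assets):
    """Accounting Z-score: distance to insolvency in standard deviations of ROA."""
    roa = np.asarray(roa_series, dtype=float)
    return (roa.mean() + equity / assets) / roa.std(ddof=1)

# Illustrative figures for one bank over 2010-2013 (not actual Cameroonian data)
roa_2010_2013 = [0.012, 0.015, 0.009, 0.011]   # return on assets
print(f"Z-score = {bank_z_score(roa_2010_2013, equity=120.0, assets=1500.0):.1f}")
```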

Keywords: loan portfolio, sound banking, Z-score, dynamic panel

Procedia PDF Downloads 280
360 Development of a Robust Protein Classifier to Predict EMT Status of Cervical Squamous Cell Carcinoma and Endocervical Adenocarcinoma (CESC) Tumors

Authors: Zhenlin Ju, Christopher P. Vellano, Rehan Akbani, Yiling Lu, Gordon B. Mills

Abstract:

The epithelial–mesenchymal transition (EMT) is a process by which epithelial cells acquire mesenchymal characteristics, such as profound disruption of cell-cell junctions, loss of apical-basolateral polarity, and extensive reorganization of the actin cytoskeleton to induce cell motility and invasion. A hallmark of EMT is its capacity to promote metastasis, which is due in part to activation of several transcription factors and subsequent downregulation of E-cadherin. Unfortunately, current approaches have yet to uncover robust protein marker sets that can classify tumors as possessing strong EMT signatures. In this study, we utilize reverse phase protein array (RPPA) data and consensus clustering methods to successfully classify a subset of cervical squamous cell carcinoma and endocervical adenocarcinoma (CESC) tumors into an EMT protein signaling group (EMT group). The overall survival (OS) of patients in the EMT group is significantly worse than those in the other Hormone and PI3K/AKT signaling groups. In addition to a shrinkage and selection method for linear regression (LASSO), we applied training/test set and Monte Carlo resampling approaches to identify a set of protein markers that predicts the EMT status of CESC tumors. We fit a logistic model to these protein markers and developed a classifier, which was fixed in the training set and validated in the testing set. The classifier robustly predicted the EMT status of the testing set with an area under the curve (AUC) of 0.975 by Receiver Operating Characteristic (ROC) analysis. This method not only identifies a core set of proteins underlying an EMT signature in cervical cancer patients, but also provides a tool to examine protein predictors that drive molecular subtypes in other diseases.
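
A minimal sketch of the marker-selection step described above: an L1-penalized (LASSO-type) logistic regression is fitted on a training split and validated by ROC AUC on a held-out split. Synthetic data stand in for the RPPA protein levels, so the retained markers and the AUC are illustrative only.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

# synthetic stand-in for RPPA protein levels: 200 tumors x 150 proteins, 10 informative
X, y = make_classification(n_samples=200, n_features=150, n_informative=10,
                           random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0,
                                          stratify=y)

# L1 (LASSO-type) penalty drives most protein coefficients to exactly zero
clf = LogisticRegression(penalty="l1", solver="liblinear", C=0.2).fit(X_tr, y_tr)

selected = np.flatnonzero(clf.coef_[0])
auc = roc_auc_score(y_te, clf.predict_proba(X_te)[:, 1])
print(f"{selected.size} proteins retained; held-out AUC = {auc:.3f}")
```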

Keywords: consensus clustering, TCGA CESC, silhouette, Monte Carlo LASSO

Procedia PDF Downloads 452
359 Consistent Testing for an Implication of Supermodular Dominance with an Application to Verifying the Effect of Geographic Knowledge Spillover

Authors: Chung Danbi, Linton Oliver, Whang Yoon-Jae

Abstract:

Supermodularity, or complementarity, is a popular concept in economics which can characterize many objective functions such as utility, social welfare, and production functions. Further, supermodular dominance captures a preference for greater interdependence among the inputs of those functions, and it can be applied to examine which input set would produce higher expected utility, social welfare, or production. Therefore, we propose and justify a consistent test for a useful implication of supermodular dominance. We also conduct Monte Carlo simulations to explore the finite sample performance of our test, with critical values obtained from the recentered bootstrap method, with and without selective recentering, and from the subsampling method. Under various parameter settings, we confirmed that our test has reasonably good size and power performance. Finally, we apply our test to compare geographic and distant knowledge spillovers in terms of their effects on social welfare using the National Bureau of Economic Research (NBER) patent data. We expect localized citing to supermodularly dominate distant citing if the geographic knowledge spillover engenders greater social welfare than the distant knowledge spillover. Taking subgroups based on firm and patent characteristics, we found industry-wise and patent subclass-wise differences in the pattern of supermodular dominance between localized and distant citing. We also compare the results from analyzing different time periods to see if the development of Internet and communication technologies has changed the pattern of the dominance. In addition, to appropriately deal with the sparse nature of the data, we apply high-dimensional methods to efficiently select relevant data.

Keywords: supermodularity, supermodular dominance, stochastic dominance, Monte Carlo simulation, bootstrap, subsampling

Procedia PDF Downloads 119
358 Environmental Radioactivity Analysis by a Sequential Approach

Authors: G. Medkour Ishak-Boushaki, A. Taibi, M. Allab

Abstract:

Quantitative environmental radioactivity measurements are needed to determine the level of exposure of a population to ionizing radiation and for the assessment of the associated risks. Gamma spectrometry remains a very powerful tool for the analysis of radionuclides present in an environmental sample, but the basic problem in such measurements is the low rate of detected events. Using large environmental samples could help to get around this difficulty but, unfortunately, new issues are raised by gamma ray attenuation and self-absorption. Recently, a new method has been suggested to detect and identify, in a short time and without quantification, a gamma ray from a low-count source. This method does not require a pulse height spectrum acquisition, as usually adopted in gamma spectrometry measurements. It is based on a chronological record of each detected photon by simultaneous measurement of its energy ε and its arrival time τ on the detector, the pair of parameters [ε,τ] defining an event mode sequence (EMS). The EMS series are analyzed sequentially by a Bayesian approach to detect the presence of a given radioactive source. The main object of the present work is to test the applicability of this sequential approach to the detection of radioactive environmental materials. Moreover, for appropriate health oversight of the public and of the concerned workers, the analysis has been extended to obtain a reliable quantification of the radionuclides present in environmental samples. For illustration, we consider as an example the problem of detection and quantification of 238U. A Monte Carlo simulated experiment is carried out, consisting of the detection, by an HPGe semiconductor junction, of 63 keV gamma rays emitted by 234Th (progeny of 238U). The generated EMS series are analyzed by Bayesian inference. The application of the sequential Bayesian approach in environmental radioactivity analysis offers the possibility of reducing the measurement time without requiring large environmental samples, and consequently avoids the associated inconveniences. The work is still in progress.
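
A much-simplified sketch of the sequential idea is given below: each recorded event energy updates the posterior odds that a 63 keV line is present on top of a flat background. The background model, peak resolution and source fraction are assumed for illustration, and the arrival-time component of the [ε,τ] pair is omitted for brevity.

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(5)

# simulate an event-mode sequence: background continuum plus a weak 63 keV line
n_events, w_src = 400, 0.10                       # fraction of events from the line (assumed)
from_src = rng.random(n_events) < w_src
energy = np.where(from_src,
                  rng.normal(63.0, 1.0, n_events),       # 63 keV photopeak, 1 keV resolution
                  rng.uniform(20.0, 120.0, n_events))    # flat background (assumed)

# per-event energy likelihoods under "background only" and "background + source"
f_bkg = 1.0 / 100.0                                       # uniform over 20-120 keV
f_mix = (1.0 - w_src) * f_bkg + w_src * norm.pdf(energy, 63.0, 1.0)

# sequential Bayesian update of the posterior probability that the source is present
log_odds = np.log(0.5 / 0.5) + np.cumsum(np.log(f_mix) - np.log(f_bkg))
posterior = 1.0 / (1.0 + np.exp(-log_odds))
print(f"posterior P(source present) after {n_events} events: {posterior[-1]:.4f}")
```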

Keywords: Bayesian approach, event mode sequence, gamma spectrometry, Monte Carlo method

Procedia PDF Downloads 484
357 Self-Image of Police Officers

Authors: Leo Carlo B. Rondina

Abstract:

Self-image is an important factor in improving the self-esteem of personnel. The purpose of the study is to determine the self-image of the police. The respondents were 503 policemen assigned to different police stations in Davao City, chosen using random sampling. With the use of Exploratory Factor Analysis (EFA), latent construct variables of police image were identified as follows: professionalism, obedience, morality, and justice and fairness. Further, ordinal regression indicates a statistical effect for ages 21-40, which means that the age of the respondent statistically improves self-image.

Keywords: police image, exploratory factor analysis, ordinal regression, Galatea effect

Procedia PDF Downloads 270
356 Use of Benin Laterites for the Mix Design of Structural Concrete

Authors: Yemalin D. Agossou, Andre Lecomte, Remi Boissiere, Edmond C. Adjovi, Abdelouahab Khelil

Abstract:

This paper presents a mix design trial of structural concretes with laterites from Benin. These materials are often the only granular resources readily available in many tropical regions. In the first step, concretes were designed with raw laterites, but the performances obtained were rather disappointing in spite of high cement dosages. A detailed physical characterization of these materials then showed that they contained a significant proportion of fine clays and that the coarsest fraction (gravel) contained a variety of facies, some of which were not very dense or indurated. Washing these laterites, and even eliminating the most friable grains of the gravel fraction, made it possible to obtain concretes with satisfactory properties in terms of workability, density and mechanical strength. However, they were found to be slightly less stiff than concretes made with more traditional aggregates. It is, therefore, possible to obtain structural concretes with only laterites and cement, but at the cost of eliminating some of their granular constituents.

Keywords: laterites, aggregates, concretes, mix design, mechanical properties

Procedia PDF Downloads 149
355 Comparison of Water Equivalent Ratio of Several Dosimetric Materials in Proton Therapy Using Monte Carlo Simulations and Experimental Data

Authors: M. R. Akbari, H. Yousefnia, E. Mirrezaei

Abstract:

Range uncertainties of protons are currently a topic of interest in proton therapy. Two of the parameters that are often used to specify proton range are the water equivalent thickness (WET) and the water equivalent ratio (WER). Since the WER value for a specific material is nearly constant at different proton energies, it is a more useful parameter for comparison. In this study, WER values were calculated for different proton energies in polymethyl methacrylate (PMMA), polystyrene (PS) and aluminum (Al) using the FLUKA and TRIM codes. The results were compared with analytical, experimental and simulated SEICS code data obtained from the literature. In the FLUKA simulation, a cylindrical phantom, 1000 mm in height and 300 mm in diameter, filled with the studied materials was simulated. A typical mono-energetic proton pencil beam, in the wide range of incident energies usually applied in proton therapy (50 MeV to 225 MeV), impinges normally on the phantom. In order to obtain the WER values for the considered materials, cylindrical detectors, 1 mm in height and 20 mm in diameter, were also simulated along the beam trajectory in the phantom. In the TRIM calculations, the type of projectile, energy and angle of incidence, type of target material and thickness should be defined. The mode of 'detailed calculation with full damage cascades' was selected for proton transport in the target material. The biggest difference in WER values between the codes was 3.19%, 1.9% and 0.67% for Al, PMMA and PS, respectively. In Al and PMMA, the biggest differences between each code and the experimental data were 1.08%, 1.26%, 2.55%, 0.94%, 0.77% and 0.95% for SEICS, FLUKA and SRIM, respectively. FLUKA and SEICS had the greatest agreement with the available experimental data in this study (≤0.77% difference in PMMA and ≤1.08% difference in Al, respectively). It is concluded that the FLUKA and TRIM codes are capable of simulating Bragg curves and calculating WER values in the studied materials. They can also predict the Bragg peak location and the range of proton beams with acceptable accuracy.
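
As an illustration of how a WER value can be extracted from simulated depth-dose curves, the sketch below locates the distal 80% fall-off depth (R80) of a Bragg curve in water and in the material and takes their ratio; the toy Bragg-curve shape is invented and does not represent FLUKA or TRIM output.

```python
import numpy as np

def r80(depth_cm, dose):
    """Depth of the distal 80% dose point (R80), by linear interpolation."""
    dose = np.asarray(dose, dtype=float)
    dose = dose / dose.max()
    i = int(np.argmax(dose))                     # Bragg peak index
    distal_d = dose[i:]
    distal_z = np.asarray(depth_cm, dtype=float)[i:]
    return np.interp(0.8, distal_d[::-1], distal_z[::-1])

def water_equivalent_ratio(depth_w, dose_w, depth_m, dose_m):
    """WER = range in water / range in the material, both taken at R80."""
    return r80(depth_w, dose_w) / r80(depth_m, dose_m)

def toy_bragg(z, R):
    """Crude stand-in for a Bragg curve peaking near depth R (illustration only)."""
    return np.exp(-((z - R) / 0.4) ** 2) + 0.3 * (z < R)

z = np.linspace(0.0, 20.0, 400)
print(f"WER ~ {water_equivalent_ratio(z, toy_bragg(z, 15.0), z, toy_bragg(z, 8.7)):.2f}")
```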

Keywords: water equivalent ratio, dosimetric materials, proton therapy, Monte Carlo simulations

Procedia PDF Downloads 304
354 Alumina Generated by Electrocoagulation as Adsorbent for the Elimination of the Iron from Drilling Water

Authors: Aimad Oulebsir, Toufik Chaabane, Venkataraman Sivasankar, André Darchen, Titus A. M. Msagati

Abstract:

Currently, the presence of pharmaceutical substances in the environment is an emerging form of pollution leading to the disruption of ecosystems. Indeed, water loaded with pharmaceutical residues is an issue that has raised the attention of researchers. The aim of this study was to monitor the effectiveness of electro-generated alumina in removing, by adsorption, the iron from well water used for the production of drugs. The Fe2+ was removed from the wastewater by adsorption in a batch cell. Performance results revealed the efficiency of electro-generated alumina as a carrier in the adsorption method. The overall Fe2+ removal efficiencies for the synthetic solutions and the simulated effluent reached 75% and 65%, respectively. The application of adsorption isotherm and kinetic models complements the results obtained experimentally. Desorption of iron was investigated using a 0.1 M NaOH solution. Regeneration tests show that the adsorbent maintains its capacity after five adsorption/desorption cycles.

Keywords: electrocoagulation, aluminum electrode, electrogenerated alumina, iron, adsorption/desorption

Procedia PDF Downloads 283
353 Preparation of Nanocomposites Based on Biodegradable Polycaprolactone by Melt Mixture

Authors: Mohamed Amine Zenasni, Bahia Meroufel, André Merlin, Said Benfarhi, Stéphane Molina, Béatrice George

Abstract:

The introduction of nano-fillers into the polymer field has led to the creation of nanocomposites. This creation is starting a new revolution in the world of materials. Nanocomposites are similar to traditional composites, a blend of a polymer and a filler with at least one nanoscopic dimension. In our project, we worked with nanocomposites of a biodegradable polymer, polycaprolactone, combined with a nano-clay (Maghnite) and with different nano-organo-clays. These nanocomposites were prepared by the melt mixture method. The advantage of this polymer is its degradability and biocompatibility. A study was made of the relationship between the development, microstructure and physicochemical properties of the nanocomposites, with clays modified with 3-aminopropyltriethoxysilane (APTES) and hexadecyltrimethylammonium bromide (CTAB) as well as untreated clays. The melt mixture method is among the most suitable methods for obtaining a better dispersion, known as exfoliation.

Keywords: nanocomposite, biodegradable, polycaprolactone, maghnite, melt mixture, APTES, CTAB

Procedia PDF Downloads 417
352 Numerical Investigation of Heat Transfer Characteristics of Different Rib Shapes in a Gas Turbine Blade

Authors: Naik Nithesh, Andre Rozek

Abstract:

The heat transfer and friction loss performances of a single rib-roughened rectangular cooling channel with four novel rib shapes were evaluated through numerical investigation using Ansys CFX. The investigation was conducted on a rectangular channel of aspect ratio (AR) = 4:1, with a rib height to hydraulic diameter ratio (e/Dh) of 0.1 and a rib pitch to height ratio (P/e) of 10, at Re = 30,000. The computations were performed by solving the RANS equations using the k-ε turbulence model. Fluid flow simulation results for the stationary case for different configurations are presented in terms of the thermal performance parameter, Nusselt number and friction factor. These parameters indicate that a particular configuration of the novel shaped ribs provides better heat transfer characteristics than the conventional 45° ribs. The numerical investigation undertaken in this study indicates an increase in the overall efficiency of the gas turbine due to the increased thermal performance parameter and heat transfer coefficient and the lower pumping pressure.
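
The thermal performance parameter quoted above is commonly evaluated at equal pumping power as (Nu/Nu0)/(f/f0)^(1/3); a sketch with assumed smooth-channel baselines (Dittus-Boelter and Blasius) and illustrative ribbed-channel values is given below. The paper's exact baselines and results are not reproduced.

```python
def thermal_performance(nu, f, re, pr=0.71):
    """Thermal performance parameter relative to a smooth channel at equal pumping power.

    Baselines (assumed, commonly used): Dittus-Boelter for Nu0 and Blasius for f0.
    """
    nu0 = 0.023 * re**0.8 * pr**0.4
    f0 = 0.079 * re**-0.25
    return (nu / nu0) / (f / f0) ** (1.0 / 3.0)

# illustrative numbers for a ribbed channel at Re = 30,000 (not the paper's results)
print(f"TPP = {thermal_performance(nu=180.0, f=0.05, re=30_000):.2f}")
```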

Keywords: gas turbine, rib shapes, Nusselt number, thermal performance parameter

Procedia PDF Downloads 504
351 Cell Response on the Ti-15Mo Alloy Surface after Nanotubes Growth

Authors: Ana Paula Rosifini Alves Claro, André Luiz Reis Rangel, Nathan Trujillo, Ketul C. Popat

Abstract:

In the present work, in vitro cytotoxicity was evaluated after nanotube growth on the Ti-15Mo alloy surface. TiO2 nanotubes were obtained by the anodizing technique at room temperature in an electrolyte with 0.25% NH4F and glycerol, at a constant anodic potential of 20 V for 24 hours. The morphology of the nanotubes was observed by field emission scanning electron microscopy (FE-SEM; XL 30 FEG, Philips). The crystal structure was analyzed by wide-angle X-ray diffraction. A cell culture model using human fibroblast-like cells was used to study the effect of TiO2 nanotube growth on the cytotoxicity of the Ti-15Mo alloy for culture periods of 1, 4 and 7 days. The MTT assay was used to evaluate cell viability, and cell adhesion was evaluated by scanning electron microscopy. The results show that the Ti-15Mo alloy with TiO2 nanotubes on its surface is nontoxic and exhibits good interaction between cells and surface.

Keywords: titanium alloys, TiO2 nanotubes, cell growth, Ti-15Mo alloy

Procedia PDF Downloads 477
350 Measurement and Simulation of Axial Neutron Flux Distribution in Dry Tube of KAMINI Reactor

Authors: Manish Chand, Subhrojit Bagchi, R. Kumar

Abstract:

A new dry tube (DT) has been installed in the tank of the KAMINI research reactor, Kalpakkam, India. This tube will be used for neutron activation analysis of small to large samples and for testing of neutron detectors. The DT is 375 cm in height and 7.5 cm in diameter, located 35 cm away from the core centre. The experimental thermal flux at various axial positions inside the tube has been measured by irradiating a flux monitor (¹⁹⁷Au) at 20 kW reactor power. The measured activity of ¹⁹⁸Au and the thermal cross section of the ¹⁹⁷Au(n,γ)¹⁹⁸Au reaction were used for the experimental thermal flux measurement. The flux inside the tube varies from 10⁹ to 10¹⁰ n cm⁻²s⁻¹, and the maximum flux was (1.02 ± 0.023) x 10¹⁰ n cm⁻²s⁻¹ at 36 cm from the bottom of the tube. Au and Zr foils, without and with a cadmium cover of 1 mm thickness, were irradiated at the maximum flux position in the DT to find out the irradiation-specific input parameters, such as the sub-cadmium to epithermal neutron flux ratio (f) and the epithermal neutron flux shape factor (α). The f value was 143 ± 5, indicating about a 99.3% thermal neutron component, and the α value was -0.2886 ± 0.0125, indicating a hard epithermal neutron spectrum due to insufficient moderation. The measured flux profile has been validated using a theoretical model of the KAMINI reactor through the Monte Carlo N-Particle code (MCNP). In MCNP, the complex geometry of the entire reactor is modelled in 3D, ensuring minimum approximations for all the components. Continuous energy cross-section data from ENDF-B/VII.1 as well as S(α, β) thermal neutron scattering functions are considered. The neutron flux has been estimated at the corresponding axial locations of the DT using a mesh tally. The thermal flux obtained from the experiment shows good agreement with the values theoretically predicted by MCNP, being within ± 10%. It can be concluded that this MCNP model can be utilized for calculating other important parameters like neutron spectra, dose rate, etc., and multi-elemental analysis can be carried out by irradiating samples at the maximum flux position, using the measured f and α parameters, by k₀-NAA standardization.
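
The activation step described above (thermal flux from the measured ¹⁹⁸Au activity and the ¹⁹⁷Au capture cross section) can be sketched as follows; the foil mass, irradiation time and activity are placeholders, so the result only illustrates the order of magnitude.

```python
import numpy as np

def thermal_flux(activity_bq, n_target_atoms, sigma_cm2, t_irr_s, half_life_s):
    """Thermal neutron flux (n cm^-2 s^-1) from the activity of an irradiated monitor,
    assuming negligible burn-up and activity referred to the end of irradiation."""
    lam = np.log(2.0) / half_life_s
    saturation = 1.0 - np.exp(-lam * t_irr_s)
    return activity_bq / (n_target_atoms * sigma_cm2 * saturation)

# Illustrative numbers for a small 197Au foil (mass, time and activity are placeholders)
m_g, M, NA = 0.010, 196.97, 6.022e23
n_au = m_g / M * NA                  # number of 197Au atoms
sigma_th = 98.65e-24                 # 197Au(n,gamma) thermal cross section, cm^2
t_half_198au = 2.695 * 24 * 3600     # 198Au half-life, s
phi = thermal_flux(activity_bq=5.0e5, n_target_atoms=n_au, sigma_cm2=sigma_th,
                   t_irr_s=3600.0, half_life_s=t_half_198au)
print(f"thermal flux ~ {phi:.2e} n cm^-2 s^-1")
```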

Keywords: neutron flux, neutron activation analysis, neutron flux shape factor, MCNP, Monte Carlo N-Particle Code

Procedia PDF Downloads 145
349 Dose Determination of Tenebrio molitor (Mealworm) Extract as an Anti-Diabetic Agent

Authors: Muhammad Al Rizqi Dharma Fauzi, Dwi Yulian Fahruddin Shah, Andre Pratama, Ari Hasna Widyapuspa, Ganden Supriyanto

Abstract:

Diabetes mellitus is still known as one of the diseases responsible for a large number of deaths in the world. From 2012 to 2014, diabetes is estimated to have resulted in 1.5 to 4.9 million deaths each year. In this paper, we present our research on the analysis and dose determination of Tenebrio molitor (mealworm) extract as an anti-diabetic agent, which Indonesian people believe to be a traditional treatment to prevent and treat diabetes. We found that Tenebrio molitor extract has potential as an anti-diabetic agent through in vivo tests on Mus musculus, which were divided into six treatment groups. Our dose determination analysis led to the conclusion that an extract concentration of 2.5 g/mL would give the optimal result in healing a wound given to Mus musculus that had been induced with alloxan monohydrate. These results show that Tenebrio molitor extract has potential to be used as an anti-diabetic agent.

Keywords: diabetes, extraction, Tenebrio molitor, traditional medicine

Procedia PDF Downloads 376
348 Reliability Levels of Reinforced Concrete Bridges Obtained by Mixing Approaches

Authors: Adrián D. García-Soto, Alejandro Hernández-Martínez, Jesús G. Valdés-Vázquez, Reyna A. Vizguerra-Alvarez

Abstract:

Reinforced concrete bridges designed by code are intended to achieve target reliability levels adequate for the geographical environment where the code is applicable. Several methods can be used to estimate such reliability levels; many of them require the establishment of an explicit limit state function (LSF). When such an LSF is not available as a closed-form expression, simulation techniques are often employed, but simulation methods are computing-intensive and time-consuming. Note that if the reliability of real bridges designed by code is of interest, numerical schemes, the finite element method (FEM) or computational mechanics could be required. In these cases, it can be quite difficult (or impossible) to establish a closed form of the LSF, and simulation techniques may be necessary to compute reliability levels. To overcome the need for a large number of simulations when no explicit LSF is available, the point estimate method (PEM) can be considered as an alternative. It has the advantage that only the probabilistic moments of the random variables are required. However, in the PEM, fitting of the resulting moments of the LSF to a probability density function (PDF) is needed. In the present study, a very simple alternative is employed which allows the assessment of reliability levels when no explicit LSF is available and without the need for extensive simulations. The alternative includes the use of the PEM, and its applicability is shown by assessing reliability levels of reinforced concrete bridges in Mexico when a numerical scheme is required. Comparisons with results obtained by using the Monte Carlo simulation (MCS) technique are included. To overcome the problem of approximating the probabilistic moments from the PEM to a PDF, a well-known distribution is employed. The approach mixes the PEM and another classic reliability method (the first order reliability method, FORM). The results in the present study are in good agreement with those computed with the MCS. Therefore, the alternative of mixing the reliability methods is a very valuable option to determine reliability levels when no closed form of the LSF is available, or if numerical schemes, the FEM or computational mechanics are employed.
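
A minimal sketch of the mixing idea on a hypothetical limit state: the PEM supplies the first two moments of G, a normal distribution is fitted to them to obtain a FORM-style reliability index, and a crude Monte Carlo run checks the result. The limit state and statistics are invented for illustration.

```python
import itertools
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(0)
demand = 900.0

def g(x1, x2):
    """Illustrative limit state: resistance x1 * x2 minus a fixed demand."""
    return x1 * x2 - demand

means, stds = (40.0, 30.0), (4.0, 3.0)

# point estimate method: evaluate g at the 2^n combinations of mean +/- std
pts = itertools.product(*[(m - s, m + s) for m, s in zip(means, stds)])
vals = np.array([g(*p) for p in pts], dtype=float)
beta = vals.mean() / vals.std()          # normal fit of the PEM moments (FORM-style index)
pf_pem = norm.cdf(-beta)

# crude Monte Carlo check of the same hypothetical problem
x1, x2 = (rng.normal(m, s, 1_000_000) for m, s in zip(means, stds))
pf_mcs = (g(x1, x2) < 0).mean()

print(f"PEM + normal fit: beta = {beta:.2f}, P_f = {pf_pem:.4f};  MCS: P_f = {pf_mcs:.4f}")
```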

Keywords: structural reliability, reinforced concrete bridges, combined approach, point estimate method, Monte Carlo simulation

Procedia PDF Downloads 338
347 Probabilistic and Stochastic Analysis of a Retaining Wall for C-Φ Soil Backfill

Authors: André Luís Brasil Cavalcante, Juan Felix Rodriguez Rebolledo, Lucas Parreira de Faria Borges

Abstract:

A methodology for the probabilistic analysis of the active earth pressure on a retaining wall with c-Φ soil backfill is described in this paper. The Rosenblueth point estimate method is used to estimate the failure probability of a gravity retaining wall. The basic principle of this methodology is to use two point estimates, i.e., the mean value plus and minus the standard deviation, to examine each variable in the safety analysis. The simplicity of this framework assures its wide application. The calculation requires 2ⁿ repetitions during the analysis, since the system is governed by n variables. In this study, a probabilistic model based on the Rosenblueth approach for the computation of the overturning probability of failure of a retaining wall is presented. The obtained results have shown the advantages of this kind of model in comparison with the deterministic solution. In a relatively easy way, the uncertainty in the wall and fill parameters is taken into account, and some practical results can be obtained for the retaining structure design.
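
The 2ⁿ-point evaluation described above can be sketched for a hypothetical gravity wall with a c-Φ backfill, using Rankine active pressure and an overturning check about the toe; the wall geometry, weight and backfill statistics are assumed for illustration and parallel the generic point-estimate sketch shown earlier in this listing.

```python
import itertools
import numpy as np
from scipy.stats import norm

H = 5.0                      # wall height, m (hypothetical)
W, x_w = 150.0, 1.0          # wall weight (kN/m) and lever arm of the weight about the toe (m)

def overturning_margin(c, phi_deg, gamma):
    """Resisting minus overturning moment about the toe, Rankine active c-phi backfill."""
    phi = np.radians(phi_deg)
    ka = np.tan(np.pi / 4 - phi / 2) ** 2
    pa = max(0.5 * ka * gamma * H**2 - 2.0 * c * np.sqrt(ka) * H, 0.0)  # active thrust, kN/m
    return W * x_w - pa * H / 3.0          # resultant assumed at H/3 above the base

# Rosenblueth two-point estimates: mean +/- std for each of the n = 3 variables -> 2^3 runs
means = {"c": 5.0, "phi_deg": 28.0, "gamma": 18.0}      # hypothetical backfill statistics
stds  = {"c": 1.5, "phi_deg": 3.0, "gamma": 1.0}

points = itertools.product(*[(means[k] - stds[k], means[k] + stds[k]) for k in means])
g_vals = np.array([overturning_margin(*p) for p in points])

beta = g_vals.mean() / g_vals.std()
print(f"2^n = {g_vals.size} evaluations, beta = {beta:.2f}, "
      f"P(overturning) ~ {norm.cdf(-beta):.4f}")
```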

Keywords: retaining wall, active earth pressure, backfill, probabilistic analysis

Procedia PDF Downloads 401