Search results for: Emmanuel Carlo Belara
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 667

517 Complete Tripartite Graphs with Spanning Maximal Planar Subgraphs

Authors: Severino Gervacio, Velimor Almonte, Emmanuel Natalio

Abstract:

A simple graph is planar if there is a way of drawing it in the plane without edge crossings. A planar graph which is not a proper spanning subgraph of another planar graph is a maximal planar graph. We prove that among the complete tripartite graphs of order at most 9, the only ones that contain a spanning maximal planar subgraph are K1,1,1, K2,2,2, K2,3,3, and K3,3,3. The main result gives a necessary and sufficient condition for the complete tripartite graph Kx,y,z to contain a spanning maximal planar subgraph.

Keywords: complete tripartite graph, graph, maximal planar graph, planar graph, subgraph

Procedia PDF Downloads 381
516 New Technique of Estimation of Charge Carrier Density of Nanomaterials from Thermionic Emission Data

Authors: Dilip K. De, Olukunle C. Olawole, Emmanuel S. Joel, Moses Emetere

Abstract:

Many electronic properties, such as electrical and thermal conductivity, depend on the charge carrier densities of nanomaterials. By controlling the charge carrier densities during the fabrication (or growth) process, these physical properties can be tuned. In this paper, we discuss a new technique for estimating the charge carrier densities of nanomaterials from thermionic emission data using the newly modified Richardson-Dushman equation. We find that the technique yields excellent results for graphene and carbon nanotubes.
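The abstract does not reproduce the modified Richardson-Dushman equation itself, so as a rough illustration only, the sketch below uses the classical form J = A·T²·exp(−W/kT) and shows how a material parameter (here the work function) can be recovered by inverting it against emission data; all numbers are hypothetical.

```python
import math

# Classical Richardson-Dushman equation, J = A * T^2 * exp(-W / (k*T)).
# NOTE: the paper uses a *modified* form not given in the abstract; this
# classical version is shown only to illustrate the inversion idea.
A_RD = 1.20173e6    # Richardson constant, A m^-2 K^-2
K_B = 8.617333e-5   # Boltzmann constant, eV/K

def emission_current_density(T, work_function_eV):
    """Thermionic emission current density (A/m^2) at temperature T (K)."""
    return A_RD * T**2 * math.exp(-work_function_eV / (K_B * T))

def work_function_from_data(T, J):
    """Invert the equation: W = k*T * ln(A * T^2 / J)."""
    return K_B * T * math.log(A_RD * T**2 / J)

# Round trip with hypothetical numbers: W = 4.5 eV at T = 1800 K.
J = emission_current_density(1800.0, 4.5)
print(work_function_from_data(1800.0, J))  # recovers 4.5
```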

Keywords: charge carrier density, nanomaterials, new technique, thermionic emission

Procedia PDF Downloads 321
515 The Impact of Dispatching with Rolling Horizon Control in Sizing Thermal Storage for Solar Tower Plant Participating in Wholesale Spot Electricity Market

Authors: Navid Mohammadzadeh, Huy Truong-Ba, Michael Cholette

Abstract:

The solar tower (ST) plant is a promising technology for exploiting large-scale solar irradiation. With thermal energy storage, an ST plant has the potential to shift generation to high electricity price periods. However, the size of the storage limits the dispatchability of the plant, particularly when it must contend with uncertainty in forecasts of solar irradiation and electricity prices. The purpose of this study is to explore the size of storage when Rolling Horizon Control (RHC) is employed for dispatch scheduling. To this end, RHC is benchmarked against a perfect knowledge (PK) forecast and two day-ahead dispatching policies. By optimising dispatch planning under the PK policy, the optimal achievable profit for a specific storage size is determined. A sensitivity analysis using Monte Carlo simulation is conducted, and the storage size for the RHC and day-ahead policies is determined with the objective of reaching the profit obtained under the PK policy. A case study is conducted for a hypothetical ST plant with thermal storage located in South Australia, dispatching under two market scenarios: 1) fixed price and 2) wholesale spot price. The impact of each individual source of uncertainty on storage size is examined for January and August. The results show that dispatching with the RH controller reaches the optimal achievable profit with ~15% smaller storage than the day-ahead policies. The results of this study may be applied to the CSP plant design procedure.
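As a toy illustration of the rolling-horizon idea described above (re-plan over a short look-ahead window each step, commit only the first decision), here is a minimal sketch; the greedy planner, price series and storage figures are all invented, not the paper's dispatch model:

```python
# Minimal rolling-horizon dispatch sketch (hypothetical numbers, not the
# paper's model): each step re-plans over a short look-ahead window using
# the latest price forecast and commits only the first decision.

def plan_window(prices, stored, max_out):
    """Greedy plan: discharge in the highest-price hours of the window."""
    order = sorted(range(len(prices)), key=lambda i: -prices[i])
    out = [0.0] * len(prices)
    for i in order:
        if stored <= 0:
            break
        out[i] = min(max_out, stored)
        stored -= out[i]
    return out

def rolling_horizon(prices, horizon, storage, max_out):
    revenue, stored = 0.0, storage
    for t in range(len(prices)):
        window = prices[t:t + horizon]                      # forecast window
        dispatch = plan_window(window, stored, max_out)[0]  # commit 1st step
        stored -= dispatch
        revenue += dispatch * prices[t]
    return revenue

prices = [30, 35, 120, 40, 300, 25]   # illustrative $/MWh spot prices
print(rolling_horizon(prices, horizon=3, storage=2.0, max_out=1.0))  # 420.0
```

With a 3-step look-ahead the planner holds its two units of stored energy for the 120 and 300 $/MWh hours, illustrating how the horizon length and storage size jointly shape revenue.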

Keywords: solar tower plant, spot market, thermal storage system, optimized dispatch planning, sensitivity analysis, Monte Carlo simulation

Procedia PDF Downloads 125
514 Optimal Design of Tuned Inerter Damper-Based System for the Control of Wind-Induced Vibration in Tall Buildings through Cultural Algorithm

Authors: Luis Lara-Valencia, Mateo Ramirez-Acevedo, Daniel Caicedo, Jose Brito, Yosef Farbiarz

Abstract:

Controlling wind-induced vibrations, as well as aerodynamic forces, is an essential part of the structural design of tall buildings in order to guarantee the serviceability limit state of the structure. This paper presents a numerical investigation of the optimal design parameters of a Tuned Inerter Damper (TID) based system for the control of wind-induced vibration in tall buildings. The control system is based on the conventional TID, with the main difference that its location is changed from the ground level to the last two story levels of the structural system. The TID tuning procedure is based on an evolutionary cultural algorithm in which the optimum design variables, defined as the frequency and damping ratios, were searched according to the optimization criterion of minimizing the root mean square (RMS) displacement response at the nth story of the structure. A Monte Carlo simulation was used to represent the dynamic action of the wind in the time domain, in which a time series derived from the Davenport spectrum using eleven harmonic functions with randomly chosen phase angles was reproduced. This methodology was applied to a case study derived from a 37-story, 144 m tall prestressed concrete building in which the wind action governs over the seismic action. The results showed that the optimally tuned TID effectively reduces the RMS displacement response by up to 25%, which demonstrates the feasibility of the system for the control of wind-induced vibrations in tall buildings.
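The wind-load synthesis described above (a time series built from harmonics whose amplitudes follow the Davenport spectrum, each with a random phase) can be sketched as follows; the spectrum constants, frequency discretization and wind speed are illustrative assumptions, not the paper's values:

```python
import math, random

# u'(t) = sum_k sqrt(2 * S(f_k) * df) * cos(2*pi*f_k*t + phi_k), with the
# phases phi_k drawn at random -- the synthesis scheme named above. The
# Davenport spectrum parameters (u10, kappa) are illustrative assumptions.

def davenport_spectrum(f, u10=30.0, kappa=0.005):
    """One common form of the Davenport along-wind velocity spectrum."""
    x = 1200.0 * f / u10
    return 4.0 * kappa * u10**2 * x**2 / (f * (1.0 + x**2) ** (4.0 / 3.0))

def wind_time_series(times, freqs, df, rng):
    phases = [rng.uniform(0.0, 2.0 * math.pi) for _ in freqs]
    amps = [math.sqrt(2.0 * davenport_spectrum(f) * df) for f in freqs]
    return [
        sum(a * math.cos(2.0 * math.pi * f * t + p)
            for a, f, p in zip(amps, freqs, phases))
        for t in times
    ]

rng = random.Random(1)
freqs = [0.01 * (k + 1) for k in range(11)]   # eleven harmonic components
series = wind_time_series([0.5 * i for i in range(600)], freqs, 0.01, rng)
print(len(series))  # 600 samples of fluctuating wind speed
```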

Keywords: evolutionary cultural algorithm, Monte Carlo simulation, tuned inerter damper, wind-induced vibrations

Procedia PDF Downloads 135
513 Passive Vibration Isolation Analysis and Optimization for Mechanical Systems

Authors: Ozan Yavuz Baytemir, Ender Cigeroglu, Gokhan Osman Ozgen

Abstract:

Vibration is an important issue in the design of various components of aerospace, marine and vehicular applications. To preserve component function and operational performance, vibration isolation design, involving the selection of optimum isolator properties and isolator positioning, is a critical study. Recognizing the growing need for vibration isolation system design, this paper presents two software tools capable of implementing modal analysis, response analysis for both random and harmonic excitations, static deflection analysis and Monte Carlo simulations, in addition to parameter and location optimization for different types of isolation problem scenarios. A review of the literature found no software-based tool capable of performing all of these analysis, simulation and optimization studies in a single platform. In this paper, the theoretical system model is generated for a 6-DOF rigid body. The vibration isolation system of any mechanical structure can be optimized using a hybrid method involving both global search and gradient-based methods. After defining the optimization design variables, different types of optimization scenarios are listed in detail. Recognizing the need for a user-friendly vibration isolation problem solver, two graphical user interfaces (GUIs) are prepared and verified using a commercial finite element analysis program, Ansys Workbench 14.0. Using the analysis and optimization capabilities of these GUIs, a real application used on an air platform is also presented as a case study at the end of the paper.

Keywords: hybrid optimization, Monte Carlo simulation, multi-degree-of-freedom system, parameter optimization, location optimization, passive vibration isolation analysis

Procedia PDF Downloads 565
512 Central African Republic Government Recruitment Agency Based on Identity Management and Public Key Encryption

Authors: Koyangbo Guere Monguia Michel Alex Emmanuel

Abstract:

In e-government, and especially in recruitment, much research has been conducted to build a trustworthy and reliable online or application system capable of processing users' or job applicants' files. In this research (Government Recruitment Agency), cloud computing, identity management and public key encryption have been used to manage domains, provide an access control authorization mechanism and secure data exchange between entities for a reliable file-processing procedure.

Keywords: cloud computing network, identity management systems, public key encryption, access control and authorization

Procedia PDF Downloads 358
511 Foundation Settlement Determination: A Simplified Approach

Authors: Adewoyin O. Olusegun, Emmanuel O. Joshua, Marvel L. Akinyemi

Abstract:

The heterogeneous nature of the subsurface requires that it be dealt with using factual information rather than assumptions or generalized equations. There is therefore a need to determine the actual rate of settlement possible in a soil before structures are built on it. This information helps in determining the type of foundation design and the kind of reinforcement that will be necessary in construction. This paper presents a simplified and faster approach for determining foundation settlement in any type of soil, using real field data acquired from seismic refraction techniques and cone penetration tests. This approach was also able to determine the depth of settlement of each soil stratum. The results obtained revealed the different settlement times and depths of settlement possible.

Keywords: heterogeneous, settlement, foundation, seismic, technique

Procedia PDF Downloads 445
510 Numerical Response of Coaxial HPGe Detector for Skull and Knee Measurement

Authors: Pabitra Sahu, M. Manohari, S. Priyadharshini, R. Santhanam, S. Chandrasekaran, B. Venkatraman

Abstract:

Radiation workers in reprocessing plants have a potential for internal exposure due to actinides and fission products. Radionuclides like americium, lead, polonium and europium are bone seekers and accumulate in the skeleton. As the major skeletal content is in the skull (13%) and knee (22%), measurements of old intakes have to be carried out on the skull and knee. At the Indira Gandhi Centre for Atomic Research, a twin HPGe-based actinide monitor is used for the measurement of actinides present in bone. Efficiency estimation, one of the prerequisites for the quantification of radionuclides, requires anthropomorphic phantoms, which are very limited. Hence, in this study, efficiency curves for the twin HPGe-based actinide monitoring system are established theoretically using the FLUKA Monte Carlo method and the ICRP adult male voxel phantom. For skull measurement, the detector is placed over the forehead, and for knee measurement, one detector is placed over each knee. The efficiency values for radionuclides present in the knee and skull vary from 3.72E-04 to 4.19E-04 CPS/photon and 5.22E-04 to 7.07E-04 CPS/photon, respectively, over the energy range 17 to 3000 keV. The efficiency curves for the measurement are established, and it is found that the efficiency initially increases up to 100 keV and then starts decreasing. The skull efficiency values are 4% to 63% higher than those of the knee, depending on the energy, for all energies except 17.74 keV; the reason is the closeness of the detector to the skull compared to the knee. At 17.74 keV, however, the knee efficiency exceeds that of the skull because of the higher attenuation in the skull bones due to their greater thickness. The Minimum Detectable Activity (MDA) for 241Am present in the skull and knee is 9 Bq; 239Pu has an MDA of 950 Bq and 1270 Bq for the knee and skull, respectively, for a counting time of 1800 sec. This paper discusses the simulation method and the results obtained in the study.
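MDA figures such as those quoted above are commonly computed with Currie's formula; the sketch below shows that calculation with invented background, efficiency and yield values (the paper's exact convention may differ):

```python
import math

# Currie's formula for Minimum Detectable Activity:
#   MDA = (2.71 + 4.65 * sqrt(B)) / (efficiency * gamma_yield * t)
# where B is the background counts accumulated in counting time t.
def mda_bq(background_counts, efficiency_cps_per_photon, gamma_yield, t_sec):
    ld = 2.71 + 4.65 * math.sqrt(background_counts)  # detection limit, counts
    return ld / (efficiency_cps_per_photon * gamma_yield * t_sec)

# Illustrative inputs only (not taken from the study): 200 background
# counts, 5e-4 CPS/photon efficiency, 0.36 gamma yield, 1800 s count time.
print(round(mda_bq(200, 5e-4, 0.36, 1800), 1))  # -> 211.3
```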

Keywords: FLUKA Monte Carlo method, ICRP adult male voxel phantom, knee, skull

Procedia PDF Downloads 51
509 Quasi-Photon Monte Carlo on Radiative Heat Transfer: An Importance Sampling and Learning Approach

Authors: Utkarsh A. Mishra, Ankit Bansal

Abstract:

At high temperatures, radiative heat transfer is the dominant mode of heat transfer. It is governed by phenomena such as photon emission, absorption, and scattering. The solution of the governing integro-differential equation of radiative transfer is a complex process, more so when the effects of the participating medium and wavelength-dependent properties are taken into consideration. Although a generic formulation of such a radiative transport problem can be modeled for a wide variety of problems with non-gray, non-diffusive surfaces, there is always a trade-off between simplicity and accuracy. Recently, solutions of complicated mathematical problems with statistical methods based on randomization of naturally occurring phenomena have gained significant importance. Photon bundles with discrete energy can be replicated with random numbers describing the emission, absorption, and scattering processes. Photon Monte Carlo (PMC) is a simple, yet powerful, technique to solve radiative transfer problems in complicated geometries with an arbitrary participating medium. The method increases the accuracy of estimation on the one hand, and the computational cost on the other. The participating media, generally gases such as CO₂, CO, and H₂O, present complex emission and absorption spectra. Modeling the emission/absorption accurately with random numbers requires weighted sampling, as different sections of the spectrum carry different importance. Importance sampling (IS) was implemented to sample random photons of arbitrary wavelength, and the sampled data provided unbiased training of MC estimators for better results. A better replacement for uniform random numbers is deterministic, quasi-random sequences; Halton, Sobol, and Faure low-discrepancy sequences are used in this study. They possess better space-filling performance than a uniform random number generator and give rise to low-variance, stable Quasi-Monte Carlo (QMC) estimators with faster convergence. An optimal supervised learning scheme was further considered to reduce the computational cost of the PMC simulation. A one-dimensional plane-parallel slab problem with a participating medium was formulated. The histories of some randomly sampled photon bundles were recorded to train an Artificial Neural Network (ANN) back-propagation model. The flux calculated using the standard quasi-PMC was taken as the training target. Results obtained with the proposed model for the one-dimensional problem are compared with the exact analytical solution and the PMC model with the line-by-line (LBL) spectral model. The approximate variance obtained was around 3.14%. Results were analyzed with respect to time and total flux in both cases. A significant reduction in variance, as well as a faster rate of convergence, was observed for the QMC method over the standard PMC method. However, the results obtained with the ANN method showed greater variance (around 25-28%) than the other cases. There is great scope for machine learning models to further reduce computational cost once trained successfully. Multiple ways of selecting the input data, as well as various architectures, will be tried so that the concerned environment can be fully addressed by the ANN model. Better results can be achieved in this unexplored domain.
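The contrast between pseudo-random MC and low-discrepancy QMC estimators can be illustrated on a toy integral; the sketch below uses a hand-rolled Halton sequence (one of the sequences named above) to estimate the area of a quarter circle, whose exact value is π/4:

```python
import math, random

# Toy comparison of a pseudo-random MC estimator with a quasi-random
# (Halton low-discrepancy) one: both estimate the quarter-circle area pi/4.

def halton(i, base):
    """i-th element of the van der Corput sequence in the given base."""
    f, r = 1.0, 0.0
    while i > 0:
        f /= base
        r += f * (i % base)
        i //= base
    return r

def estimate(points):
    """Fraction of 2-D points falling inside the unit quarter circle."""
    hits = sum(1 for x, y in points if x * x + y * y <= 1.0)
    return hits / len(points)

n = 4096
rng = random.Random(0)
mc = estimate([(rng.random(), rng.random()) for _ in range(n)])
qmc = estimate([(halton(i, 2), halton(i, 3)) for i in range(1, n + 1)])
print(abs(mc - math.pi / 4), abs(qmc - math.pi / 4))  # errors of each estimator
```

Halton points fill the unit square far more evenly than pseudo-random draws, which is the space-filling property the abstract refers to.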

Keywords: radiative heat transfer, Monte Carlo Method, pseudo-random numbers, low discrepancy sequences, artificial neural networks

Procedia PDF Downloads 223
508 Role of Spatial Variability in the Service Life Prediction of Reinforced Concrete Bridges Affected by Corrosion

Authors: Omran M. Kenshel, Alan J. O'Connor

Abstract:

Estimating the service life of Reinforced Concrete (RC) bridge structures located in corrosive marine environments is of great importance to their owners/engineers. Traditionally, bridge owners/engineers relied largely on subjective engineering judgment, e.g. visual inspection, in their estimation approach. However, because financial resources are often limited, rational calculation methods are needed to make reliable and more accurate predictions of the service life of RC structures, in order to direct funds to the bridges found to be most critical. Criticality of the structure can be considered either from the structural capacity (i.e. Ultimate Limit State) or from the serviceability viewpoint, whichever is adopted. This paper considers the service life of the structure only from the structural capacity viewpoint. Given the great variability associated with the parameters involved in the estimation process, a probabilistic approach is most suited. The probabilistic modelling adopted here used the Monte Carlo simulation technique to estimate the reliability (i.e. probability of failure) of the structure under consideration. The authors used their own experimental data for the Correlation Length (CL) of the most important deterioration parameters. The CL is a parameter of the Correlation Function (CF) by which the spatial fluctuation of a given deterioration parameter is described. The CL data used here were produced by analyzing 45 chloride profiles obtained from a 30-year-old RC bridge located in a marine environment. The service life of the structure was predicted in terms of the load carrying capacity of an RC bridge beam girder. The analysis showed that the influence of spatial variability (SV) is only evident if the reliability of the structure is governed by flexure failure rather than by shear failure.
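A Monte Carlo probability-of-failure estimate of the kind used above can be sketched in a few lines; the resistance and load distributions below are illustrative placeholders, not the paper's bridge girder model:

```python
import math, random

# Toy Monte Carlo reliability sketch: the probability of failure is the
# fraction of samples in which the load effect S exceeds the resistance R.
# Distributions and parameters are illustrative only.

def probability_of_failure(n, rng):
    failures = 0
    for _ in range(n):
        r = rng.lognormvariate(math.log(500.0), 0.10)  # capacity, kN*m
        s = rng.gauss(350.0, 60.0)                     # load effect, kN*m
        if s > r:
            failures += 1
    return failures / n

rng = random.Random(42)
pf = probability_of_failure(100_000, rng)
print(f"estimated probability of failure = {pf:.4f}")
```

In the paper's setting, the sampled capacity would itself come from a spatially variable corrosion field described by the correlation length, rather than a single random variable as here.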

Keywords: chloride-induced corrosion, Monte Carlo simulation, reinforced concrete, spatial variability

Procedia PDF Downloads 473
507 Reliability Analysis of Variable Stiffness Composite Laminate Structures

Authors: A. Sohouli, A. Suleman

Abstract:

This study focuses on the reliability analysis of variable stiffness composite laminate structures to investigate their potential structural improvement over conventional (straight fiber) composite laminate structures. A computational framework was developed consisting of a deterministic design step and a reliability analysis. The optimization part is Discrete Material Optimization (DMO), and the reliability of the structure is computed by Monte Carlo Simulation (MCS) after applying the Stochastic Response Surface Method (SRSM). The design driver in the deterministic optimization is maximum stiffness, while the optimization method incorporates certain manufacturing constraints to attain industrial relevance: the change of orientation between adjacent patches cannot be too large, and the maximum number of successive plies of a particular fiber orientation should not be too high. Variable stiffness composites may be manufactured by Automated Fiber Placement (AFP) machines, which provide consistent quality at good production rates. However, the laps and gaps that arise when steering fibers are the most important challenges, as they affect the performance of the structures. In this study, the optimal curved fiber paths at each layer of the composite are designed in the first step by DMO, and then the reliability analysis is applied to investigate the sensitivity of the structure, for different standard deviations, compared to straight-fiber-angle composites. The random variables are the material properties and the loads on the structures. The results show that variable stiffness composite laminate structures are much more reliable, even for high standard deviations of the material properties, than conventional composite laminate structures. The reason is that variable stiffness composite laminates allow tailoring of the stiffness and provide the possibility of adjusting the stress and strain distributions favorably in the structures.

Keywords: material optimization, Monte Carlo simulation, reliability analysis, response surface method, variable stiffness composite structures

Procedia PDF Downloads 520
506 Double Fourier Series Applied to Supraharmonic Determination: The Specific Cases of a Boost and an Interleaved Boost Converter Used as Active Power Factor Correctors

Authors: Erzen Muharemi, Emmanuel De Jaeger, Jos Knockaert

Abstract:

The work presented here investigates the modeling of power electronics converters in terms of their harmonic production. Specifically, it addresses high-frequency emissions in the range of 2-150 kHz, referred to as supraharmonics. This paper models a conventional converter, namely the boost converter used as an active power factor corrector (APFC). Furthermore, the modeling is extended to the case of the interleaved boost converter, which offers advantages such as halving the emissions. Finally, a comparison between the theoretical, numerical, and experimental results will be provided.

Keywords: APFC, boost converter, converter modeling, double Fourier series, supraharmonics

Procedia PDF Downloads 42
505 Assessment of Weaver Birds and Their Allies Within and Around Ngel-Nyaki Forest Reserve, Yelwa, Sardauna LGA, Taraba State, Nigeria

Authors: David Delpine Leila, Demnyo Sunita Femi, Musa David Garkida, Elisha Emmanuel Barde, Emmanuel Allahnanan, Yani Julius Philip

Abstract:

Birds are among the key components of the earth's biodiversity and one of the most diverse and evolutionarily successful groups of animals. The weaverbirds are a large family of birds found mostly in Africa, with a few species in southern Asia and on the western Indian Ocean islands. This study assessed the diversity and abundance of weaver birds and their allies within and around Ngel-Nyaki Forest Reserve in Yelwa, Sardauna Local Government Area of Taraba State, Nigeria. A total of 602 individuals of weaver birds and their allies were recorded using point count line transects. The data collected during the research period were analyzed using simple percentages, and diversity was calculated using the Shannon-Wiener Diversity Index. The fenced (ungrazed) area was more abundant, with 351 individuals, while the unfenced (grazed) area was less abundant, with 251 individuals. In the fenced (ungrazed) area, the Yellow Bishop (Euplectes capensis) had the highest abundance (102; 29.01%), followed by the Village Weaver (Ploceus cucullatus) (80; 22.79%), then Vieillot's Black Weaver (Ploceus nigerrimus) (40; 11.42%), the Red-collared Widowbird (Euplectes ardens) (6; 1.71%) and the Dark-backed Weaver (5; 1.42%), with Hartlaub's Marsh Widowbird (1; 0.28%) the least abundant. In the unfenced (grazed) area, the Village Weaver (Ploceus cucullatus) (85; 33.86%) was the most abundant, followed by the Spectacled Weaver (Ploceus ocularis) (36; 14.34%), then the Yellow Bishop (Euplectes capensis) (30; 11.95%), the Baglafecht Weaver (Ploceus baglafecht) (23; 9.16%) and Bannerman's Weaver (Ploceus bannermani) (17; 6.77%), with the Yellow-mantled Widowbird (Euplectes macroura) (5; 1.99%) the least abundant. In terms of diversity, there were more weaver bird species in the fenced area (Shannon-Wiener Diversity Index H′ = 2.034) than in the unfenced area (H′ = 1.863); the difference between the two areas is significant. Bird species were thus more abundant in the fenced area than in the unfenced area of the Forest Reserve. Thorough research should be conducted on the abundance and diversity of weavers and their allies, because only 4 km² of the 46 km² of land available could be accessed. The 2020 Annual Report of Ngel-Nyaki Forest Reserve indicates that many more species of weaver birds and their allies, such as the Black-billed Weaver (Ploceus melanogaster) and the Red-billed Quelea (Quelea quelea), are present within the reserve.
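The Shannon-Wiener Diversity Index used above is H′ = −Σ pᵢ ln pᵢ, where pᵢ is the proportion of individuals belonging to species i; below is a minimal sketch with illustrative counts (a subset, not the survey's full data):

```python
import math

# Shannon-Wiener diversity index: H' = -sum(p_i * ln(p_i)) over species i,
# where p_i is the fraction of all individuals belonging to species i.
def shannon_wiener(counts):
    total = sum(counts)
    return -sum((c / total) * math.log(c / total) for c in counts if c > 0)

# Illustrative four-species sample (not the survey's full species list):
print(round(shannon_wiener([102, 80, 40, 6]), 3))  # -> 1.128
```

Higher H′ means individuals are spread more evenly across more species, which is why the fenced area's larger index indicates greater diversity.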

Keywords: abundance, diversity, weaver birds, allies, Ngel-Nyaki

Procedia PDF Downloads 71
504 Development of a Robust Protein Classifier to Predict EMT Status of Cervical Squamous Cell Carcinoma and Endocervical Adenocarcinoma (CESC) Tumors

Authors: Zhenlin Ju, Christopher P. Vellano, Rehan Akbani, Yiling Lu, Gordon B. Mills

Abstract:

The epithelial–mesenchymal transition (EMT) is a process by which epithelial cells acquire mesenchymal characteristics, such as profound disruption of cell-cell junctions, loss of apical-basolateral polarity, and extensive reorganization of the actin cytoskeleton to induce cell motility and invasion. A hallmark of EMT is its capacity to promote metastasis, which is due in part to activation of several transcription factors and subsequent downregulation of E-cadherin. Unfortunately, current approaches have yet to uncover robust protein marker sets that can classify tumors as possessing strong EMT signatures. In this study, we utilize reverse phase protein array (RPPA) data and consensus clustering methods to successfully classify a subset of cervical squamous cell carcinoma and endocervical adenocarcinoma (CESC) tumors into an EMT protein signaling group (EMT group). The overall survival (OS) of patients in the EMT group is significantly worse than those in the other Hormone and PI3K/AKT signaling groups. In addition to a shrinkage and selection method for linear regression (LASSO), we applied training/test set and Monte Carlo resampling approaches to identify a set of protein markers that predicts the EMT status of CESC tumors. We fit a logistic model to these protein markers and developed a classifier, which was fixed in the training set and validated in the testing set. The classifier robustly predicted the EMT status of the testing set with an area under the curve (AUC) of 0.975 by Receiver Operating Characteristic (ROC) analysis. This method not only identifies a core set of proteins underlying an EMT signature in cervical cancer patients, but also provides a tool to examine protein predictors that drive molecular subtypes in other diseases.
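The AUC reported above can be computed directly from classifier scores as the probability that a randomly chosen positive case outscores a randomly chosen negative one (ties counting one half); here is a minimal sketch with made-up scores:

```python
# AUC as the probability that a random positive case scores higher than a
# random negative case (ties count one half). Scores below are invented
# for illustration, not the study's classifier outputs.
def auc(pos_scores, neg_scores):
    wins = sum(
        1.0 if p > n else 0.5 if p == n else 0.0
        for p in pos_scores for n in neg_scores
    )
    return wins / (len(pos_scores) * len(neg_scores))

print(auc([0.9, 0.8, 0.6], [0.7, 0.3, 0.2]))  # 8 of 9 pairs -> 0.888...
```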

Keywords: consensus clustering, TCGA CESC, silhouette, Monte Carlo LASSO

Procedia PDF Downloads 468
503 Consistent Testing for an Implication of Supermodular Dominance with an Application to Verifying the Effect of Geographic Knowledge Spillover

Authors: Chung Danbi, Linton Oliver, Whang Yoon-Jae

Abstract:

Supermodularity, or complementarity, is a popular concept in economics which can characterize many objective functions such as utility, social welfare, and production functions. Further, supermodular dominance captures a preference for greater interdependence among inputs of those functions, and it can be applied to examine which input set would produce higher expected utility, social welfare, or production. Therefore, we propose and justify a consistent testing for a useful implication of supermodular dominance. We also conduct Monte Carlo simulations to explore the finite sample performance of our test, with critical values obtained from the recentered bootstrap method, with and without the selective recentering, and the subsampling method. Under various parameter settings, we confirmed that our test has reasonably good size and power performance. Finally, we apply our test to compare the geographic and distant knowledge spillover in terms of their effects on social welfare using the National Bureau of Economic Research (NBER) patent data. We expect localized citing to supermodularly dominate distant citing if the geographic knowledge spillover engenders greater social welfare than distant knowledge spillover. Taking subgroups based on firm and patent characteristics, we found that there is industry-wise and patent subclass-wise difference in the pattern of supermodular dominance between localized and distant citing. We also compare the results from analyzing different time periods to see if the development of Internet and communication technology has changed the pattern of the dominance. In addition, to appropriately deal with the sparse nature of the data, we apply high-dimensional methods to efficiently select relevant data.

Keywords: supermodularity, supermodular dominance, stochastic dominance, Monte Carlo simulation, bootstrap, subsampling

Procedia PDF Downloads 129
502 Environmental Radioactivity Analysis by a Sequential Approach

Authors: G. Medkour Ishak-Boushaki, A. Taibi, M. Allab

Abstract:

Quantitative environmental radioactivity measurements are needed to determine the level of exposure of a population to ionizing radiation and to assess the associated risks. Gamma spectrometry remains a very powerful tool for the analysis of radionuclides present in an environmental sample, but the basic problem in such measurements is the low rate of detected events. Using large environmental samples could help to get around this difficulty but, unfortunately, raises new issues of gamma ray attenuation and self-absorption. Recently, a new method has been suggested to detect and identify, without quantification and in a short time, the gamma rays of a low-count source. This method does not require a pulse height spectrum acquisition, as usually adopted in gamma spectrometry measurements. It is based on a chronological record of each detected photon through simultaneous measurement of its energy ε and its arrival time τ at the detector, the pair of parameters [ε, τ] defining an event mode sequence (EMS). The EMS series are analyzed sequentially by a Bayesian approach to detect the presence of a given radioactive source. The main object of the present work is to test the applicability of this sequential approach to the detection of radioactive environmental materials. Moreover, for appropriate health oversight of the public and of the concerned workers, the analysis has been extended to obtain a reliable quantification of the radionuclides present in environmental samples. For illustration, we consider, as an example, the problem of detection and quantification of 238U. A Monte Carlo simulated experiment is carried out, consisting of the detection, by an HPGe semiconductor junction, of the 63 keV gamma rays emitted by 234Th (a progeny of 238U). The generated EMS series are analyzed by Bayesian inference. The application of the sequential Bayesian approach to environmental radioactivity analysis offers the possibility of reducing the measurement time without requiring large environmental samples, and consequently avoids the associated inconveniences. The work is still in progress.
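The sequential idea described above (each detected event updates the evidence for "source present" vs "background only") can be sketched as a running product of per-event likelihood ratios; the energy models below are crude placeholders, not the paper's detector response or Bayesian machinery:

```python
# Per-event likelihood ratio: a flat 0-100 keV background hypothesis vs a
# 50/50 mix of background and a 63 keV "line" concentrated in a +/-2 keV
# window. These are crude placeholder models for illustration only.
LINE_KEV, WINDOW_KEV = 63.0, 2.0

def likelihood_ratio(energy_kev):
    p_bkg = 1.0 / 100.0                                  # flat background
    in_peak = abs(energy_kev - LINE_KEV) < WINDOW_KEV
    p_src = 0.5 * p_bkg + 0.5 * (0.25 if in_peak else 0.0)
    return p_src / p_bkg

def sequential_odds(energies, prior_odds=1.0):
    """Posterior odds of 'source present' after processing each event."""
    odds = prior_odds
    for e in energies:
        odds *= likelihood_ratio(e)
    return odds

# Three in-peak events and two background-like ones (illustrative EMS):
print(sequential_odds([62.5, 63.2, 63.8, 10.0, 77.0]))  # ~13^3 * 0.25 ~ 549
```

A few events near the 63 keV line quickly drive the odds well above 1, which is the advantage of the sequential approach when count rates are low.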

Keywords: Bayesian approach, event mode sequence, gamma spectrometry, Monte Carlo method

Procedia PDF Downloads 495
501 Self-Image of Police Officers

Authors: Leo Carlo B. Rondina

Abstract:

Self-image is an important factor in improving the self-esteem of personnel. The purpose of this study is to determine the self-image of the police. The respondents were 503 policemen assigned to different police stations in Davao City, chosen using random sampling. Using Exploratory Factor Analysis (EFA), the latent construct variables of police image were identified as follows: professionalism, obedience, morality, and justice and fairness. Further, ordinal regression indicates a statistically significant effect for ages 21-40, meaning that respondent age statistically improves self-image.

Keywords: police image, exploratory factor analysis, ordinal regression, Galatea effect

Procedia PDF Downloads 288
500 Marketing Implications and the Dynamics of Changing Gender Roles in Families

Authors: Kehinde Emmanuel Atanlusi

Abstract:

It is impossible to stifle the gust of social change as it makes its way through institutionalised hierarchies on its way to expressing itself. This advancement may also have repercussions for institutions, families, and politics, thereby modifying norms and establishing new societal ideals. This paper explores how gender roles in the family have changed over time, how this has affected consumption, and how marketing has been influenced by these changes. An empirical research method was used, which led to several findings, one of which was that marketing in the pre-modern era was predicated on metanarratives and gender stereotypes. These aspects of marketing have undergone significant transformations in the post-modern era, which led to the formation of an assumption about what future marketing trends will be like. Although post-modern marketing methods have a number of drawbacks, it is suggested that these strategies be embraced and updated in the future in order to expand consumer bases and target audiences.

Keywords: marketing, gender roles, advertising, decentralisation, fragmentation

Procedia PDF Downloads 116
499 Comparison of Water Equivalent Ratio of Several Dosimetric Materials in Proton Therapy Using Monte Carlo Simulations and Experimental Data

Authors: M. R. Akbari, H. Yousefnia, E. Mirrezaei

Abstract:

Range uncertainties of protons are currently a topic of interest in proton therapy. Two of the parameters often used to specify proton range are the water equivalent thickness (WET) and the water equivalent ratio (WER). Since the WER value of a given material is nearly constant across proton energies, it is the more convenient parameter to compare. In this study, WER values were calculated for different proton energies in polymethyl methacrylate (PMMA), polystyrene (PS), and aluminum (Al) using the FLUKA and TRIM codes. The results were compared with analytical, experimental, and simulated SEICS-code data obtained from the literature. In the FLUKA simulation, a cylindrical phantom, 1000 mm in height and 300 mm in diameter, filled with each studied material was modelled. A typical mono-energetic proton pencil beam, over the wide range of incident energies usually applied in proton therapy (50 MeV to 225 MeV), impinges normally on the phantom. To obtain the WER values for the considered materials, cylindrical detectors, 1 mm in height and 20 mm in diameter, were also simulated along the beam trajectory in the phantom. In the TRIM calculations, the type of projectile, the energy and angle of incidence, and the type and thickness of the target material must be defined. The mode of 'detailed calculation with full damage cascades' was selected for proton transport in the target material. The biggest difference in WER values between the codes was 3.19%, 1.9%, and 0.67% for Al, PMMA, and PS, respectively. The biggest differences between each code and the experimental data were 1.08% (SEICS), 1.26% (FLUKA), and 2.55% (SRIM) in Al, and 0.94%, 0.77%, and 0.95%, respectively, in PMMA. FLUKA and SEICS showed the greatest agreement with the available experimental data (≤0.77% difference in PMMA and ≤1.08% difference in Al, respectively). It is concluded that the FLUKA and TRIM codes are capable of simulating Bragg curves and calculating WER values in the studied materials, and can also predict the Bragg peak location and range of proton beams with acceptable accuracy.
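The WER definition underlying these comparisons can be sketched numerically. The CSDA ranges and the two "code output" values below are rough illustrative numbers of textbook order, not the paper's simulated or experimental data.

```python
# Illustrative CSDA ranges (cm) for a ~150 MeV proton beam (assumed values).
RANGE_WATER = 15.77
RANGES = {'PMMA': 13.55, 'PS': 15.02, 'Al': 6.78}

# WER = (range in water) / (range in the material) at the same energy,
# so a slab of thickness t behaves like WER * t of water (its WET).
def wer(material):
    return RANGE_WATER / RANGES[material]

def pct_diff(a, b):
    """Percent difference, as used when comparing two codes' WER values."""
    return abs(a - b) / b * 100.0

for m in RANGES:
    print(m, round(wer(m), 3))

# Example: two hypothetical codes reporting 1.19 and 1.16 for PMMA.
print(round(pct_diff(1.19, 1.16), 2))
```

Because range scales almost linearly with energy in all three materials, this ratio changes only slightly across the 50-225 MeV window, which is why the abstract treats WER as nearly energy-independent.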

Keywords: water equivalent ratio, dosimetric materials, proton therapy, Monte Carlo simulations

Procedia PDF Downloads 324
498 Potentials of Underutilised Crops in the Nigerian Farming Systems for Sustainable Food Production and Economic Empowerment

Authors: Jesse Silas Mshelia, Michael Mamman Degri, Akeweta Emmanuel Samaila

Abstract:

This review was conducted in the north-eastern part of Nigeria, a region facing serious challenges of poverty and low farmland productivity. These challenges stem from dwindling soil fertility and from dependence on crops poorly adapted to the soil, the climatic conditions, and the prevailing farming system of the area, which is predominantly mixed cropping. The neglected crops fit well into this production system: they yield better under low levels of input and management and give a higher profit margin. Farmers have mastered the production techniques for these crops but lack the scientific knowledge to improve the quality of the seed and the products; they therefore need the intervention of modern technologies to benefit maximally from the full potential of these crops.

Keywords: farming systems, neglected crops, potentials, underutilised

Procedia PDF Downloads 375
497 Microwave Production of Geopolymers Using Fluidized Bed Combustion Bottom Ash

Authors: Osholana Tobi Stephen, Rotimi Emmanuel Sadiku, Bilainu Oboirien

Abstract:

Fluidized bed combustion (FBC) is a clean coal technology used in the combustion of low-grade coals for power generation. The production of large volumes of solid waste, such as bottom ash, from this process is a problem. The bottom ash contains some toxic elements which can leach into soils and contaminate surface water and groundwater; for this reason, it can no longer be disposed of in landfills or lagoons. The production of geopolymers from bottom ash for structural and concrete applications is an option for its disposal. In this study, the waste bottom ash obtained from the combustion of three low-grade South African coals in a bubbling fluidized bed reactor was used to produce geopolymers. The geopolymers were cured in a household microwave. The results showed that microwave curing enhanced the reactivity and strength of the geopolymers.

Keywords: bottom ash, coal, fluidized bed combustion (FBC), geopolymer, compressive strength

Procedia PDF Downloads 315
496 Measurement and Simulation of Axial Neutron Flux Distribution in Dry Tube of KAMINI Reactor

Authors: Manish Chand, Subhrojit Bagchi, R. Kumar

Abstract:

A new dry tube (DT) has been installed in the tank of the KAMINI research reactor at Kalpakkam, India. This tube will be used for neutron activation analysis of small to large samples and for testing of neutron detectors. The DT is 375 cm in height and 7.5 cm in diameter, located 35 cm away from the core centre. The experimental thermal flux at various axial positions inside the tube was measured by irradiating a flux monitor (¹⁹⁷Au) at 20 kW reactor power. The measured activity of ¹⁹⁸Au and the thermal cross section of the ¹⁹⁷Au(n,γ)¹⁹⁸Au reaction were used for the experimental thermal flux measurement. The flux inside the tube varies from 10⁹ to 10¹⁰ n cm⁻²s⁻¹, with a maximum of (1.02 ± 0.023) × 10¹⁰ n cm⁻²s⁻¹ at 36 cm from the bottom of the tube. Au and Zr foils, without and with a cadmium cover of 1 mm thickness, were irradiated at the maximum flux position in the DT to find the irradiation-specific input parameters: the sub-cadmium to epithermal neutron flux ratio (f) and the epithermal neutron flux shape factor (α). The f value was 143 ± 5, indicating about a 99.3% thermal neutron component, and the α value was -0.2886 ± 0.0125, indicating a hard epithermal neutron spectrum due to insufficient moderation. The measured flux profile has been validated against a theoretical model of the KAMINI reactor built with the Monte Carlo N-Particle code (MCNP). In MCNP, the complex geometry of the entire reactor is modelled in 3D, ensuring minimum approximations for all components. Continuous-energy cross-section data from ENDF/B-VII.1 as well as S(α,β) thermal neutron scattering functions are considered. The neutron flux was estimated at the corresponding axial locations of the DT using a mesh tally. The thermal flux obtained from the experiment shows good agreement with the values predicted by MCNP, within ±10%. It can be concluded that this MCNP model can be utilized for calculating other important parameters such as neutron spectra and dose rate, and that multi-elemental analysis can be carried out by irradiating samples at the maximum flux position using the measured f and α parameters with k₀-NAA standardization.
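The flux-from-activation step described above can be sketched with the standard activation relation φ = A / (N σ S), where S is the saturation factor. The ¹⁹⁷Au cross section and ¹⁹⁸Au half-life are standard values; the foil mass, irradiation time, and measured activity below are hypothetical inputs chosen only to land in the abstract's 10⁹-10¹⁰ flux range.

```python
import math

# Standard nuclear data for the 197Au(n,g)198Au monitor reaction.
N_A       = 6.022e23
M_AU      = 196.97              # g/mol
SIGMA     = 98.65e-24           # cm^2, 2200 m/s capture cross section
HALF_LIFE = 2.695 * 86400.0     # s, 198Au
LAM       = math.log(2) / HALF_LIFE

# Hypothetical measurement inputs (not from the paper).
mass_g = 1.0e-3                 # 1 mg Au foil
t_irr  = 3600.0                 # 1 h irradiation
a_meas = 3.0e4                  # Bq of 198Au at end of irradiation

n_atoms = mass_g / M_AU * N_A
s_fact  = 1.0 - math.exp(-LAM * t_irr)        # saturation factor
phi     = a_meas / (n_atoms * SIGMA * s_fact) # thermal flux, n cm^-2 s^-1
print(f"{phi:.3e}")
```

A full k₀-NAA treatment would further correct for the epithermal contribution using the measured f and α parameters; this sketch shows only the sub-cadmium term.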

Keywords: neutron flux, neutron activation analysis, neutron flux shape factor, MCNP, Monte Carlo N-Particle Code

Procedia PDF Downloads 164
495 Cross-Cultural Evangelism a Necessity in Contemporary Times: A Case Study of Mission of Diocese on the Niger Anglican Communion to Togo

Authors: Nnatuanya Chinedu Emmanuel

Abstract:

The focus of this research is to point out the importance of mission across nations, tribes, and languages. The message of the gospel is global in nature, and as a result, Christians of all nations, irrespective of color and denomination, must strive to ensure that this message of transformation is extended to all, notwithstanding their region, locality, and color. In response to this, the work investigates the evangelization activity of the Diocese on the Niger in Togo, its impacts, and its activities. A qualitative research framework was adopted, and findings indicate that much work has been done in the areas of human and societal development; notwithstanding, problems of funding, language barriers, and manpower remain a threat to the mission work.

Keywords: cross-cultural evangelism, Diocese on the Niger, Anglican communion, Togo

Procedia PDF Downloads 90
494 Gaming Tools for Efficient Low Cost Urban Planning Using Nature Based Solutions

Authors: Ioannis Kavouras, Eftychios Protopapadakis, Emmanuel Sardis, Anastasios Doulamis

Abstract:

In this paper, we investigate the appropriateness and usability of three different free and open-source rendering tools for urban planning visualizations. The process involves the selection of a map area, the 3D rendering transformation, the placement of nature-based solutions (NBS), and the evaluation and assessment of the suggested interventions. The manuscript uses a case study at Dilaveri Coast, in the Piraeus region of Greece. Research outcomes indicate that a Blender-OSM implementation is an appropriate tool capable of supporting high-fidelity urban planning, giving end users and NBS stakeholders quick and accurate visibility of the applied transformations.

Keywords: urban planning, nature based solution, 3D gaming tools, game engine, free and open source

Procedia PDF Downloads 112
493 Reliability Levels of Reinforced Concrete Bridges Obtained by Mixing Approaches

Authors: Adrián D. García-Soto, Alejandro Hernández-Martínez, Jesús G. Valdés-Vázquez, Reyna A. Vizguerra-Alvarez

Abstract:

Reinforced concrete bridges designed by code are intended to achieve target reliability levels adequate for the geographical environment where the code is applicable. Several methods can be used to estimate such reliability levels. Many of them require the establishment of an explicit limit state function (LSF). When such an LSF is not available as a closed-form expression, simulation techniques are often employed. Simulation methods are computationally intensive and time consuming. Note that if the reliability of real bridges designed by code is of interest, numerical schemes, the finite element method (FEM), or computational mechanics may be required. In these cases, it can be quite difficult (or impossible) to establish a closed form of the LSF, and simulation techniques may be necessary to compute reliability levels. To overcome the need for a large number of simulations when no explicit LSF is available, the point estimate method (PEM) can be considered as an alternative. It has the advantage that only the probabilistic moments of the random variables are required. However, in the PEM, the resulting moments of the LSF must be fitted to a probability density function (PDF). In the present study, a very simple alternative is employed which allows the assessment of reliability levels when no explicit LSF is available, without the need for extensive simulations. The alternative includes the use of the PEM, and its applicability is shown by assessing reliability levels of reinforced concrete bridges in Mexico when a numerical scheme is required. Comparisons with results obtained using the Monte Carlo simulation (MCS) technique are included. To overcome the problem of fitting the probabilistic moments from the PEM to a PDF, a well-known distribution is employed. The approach mixes the PEM with another classic reliability method (the first-order reliability method, FORM). The results in the present study are in good agreement with those computed with the MCS. Therefore, mixing the reliability methods is a very valuable option for determining reliability levels when no closed form of the LSF is available, or when numerical schemes, the FEM, or computational mechanics are employed.
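The PEM-plus-normal-fit idea can be sketched with Rosenblueth's two-point estimates. The toy limit state g = R - S below stands in for the FEM-based bridge response; the means and standard deviations are illustrative, and fitting the two moments to a normal PDF plays the role of the "well-known distribution" step.

```python
import itertools
import math

# Toy limit state function: capacity R minus demand S. In the paper this
# would be an implicit FEM response, evaluated only at the PEM points.
def g(R, S):
    return R - S

MEANS = (500.0, 300.0)   # (R, S) means, illustrative units
STDS  = ( 50.0,  45.0)

# Rosenblueth PEM: evaluate g at every mu +/- sigma combination
# (2^n points, equal weights for uncorrelated symmetric variables).
vals = [g(*(m + s * e for m, s, e in zip(MEANS, STDS, signs)))
        for signs in itertools.product((-1.0, 1.0), repeat=2)]
m1 = sum(vals) / len(vals)                 # first moment, E[g]
m2 = sum(v * v for v in vals) / len(vals)  # second moment, E[g^2]
sigma_g = math.sqrt(m2 - m1 * m1)

# FORM-like step: fit the two moments to a normal PDF and read off the
# reliability index beta and failure probability P(g < 0).
beta = m1 / sigma_g
pf = 0.5 * math.erfc(beta / math.sqrt(2))
print(beta, pf)
```

Only 2ⁿ model evaluations are needed (4 here) instead of the thousands a Monte Carlo run would require, which is exactly the computational advantage the abstract exploits.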

Keywords: structural reliability, reinforced concrete bridges, combined approach, point estimate method, monte carlo simulation

Procedia PDF Downloads 346
492 Challenges with Synchrophasor Technology Deployments in Electric Power Grids

Authors: Emmanuel U. Oleka, Anil Khanal, Gary L. Lebby, Ali R. Osareh

Abstract:

Synchrophasor technology is being rapidly deployed in electric power grids all over the world and is fast changing the way the grids are managed. This trend will continue until entire power grids are fully connected, so that they can be monitored and controlled in real time. Much has been achieved in synchrophasor technology development and deployment, and much more is yet to be achieved. The real-time power grid control and protection potential of synchrophasors is yet to be explored. Researchers must therefore keep in view the various challenges that still need to be overcome in expanding the frontiers of synchrophasor technology. This paper outlines the major challenges that should be dealt with in order to achieve the goal of total power grid visualization, monitoring, and control using synchrophasor technology.

Keywords: electric power grid, grid visualization, phasor measurement unit, synchrophasor

Procedia PDF Downloads 556
491 Consideration of Uncertainty in Engineering

Authors: A. Mohammadi, M. Moghimi, S. Mohammadi

Abstract:

Engineers need computational methods that provide solutions less sensitive to environmental effects; techniques that take uncertainty into account should therefore be used to control and minimize the risk associated with design and operation. To consider uncertainty in an engineering problem, the optimization problem should be solved over a suitable range of each uncertain input variable instead of at just one estimated point. With deterministic optimization, a large computational burden is required to consider every possible and probable combination of uncertain input variables. Several methods have been reported in the literature to deal with problems under uncertainty. In this paper, different methods are presented and analyzed.
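The Monte Carlo approach listed in the keywords can be sketched as follows: instead of evaluating the design at one estimated point, each uncertain input is sampled over its range and the spread of the response is examined. The safety-factor model and the input distributions are toy assumptions, not from the paper.

```python
import random
import statistics

random.seed(1)

# Toy design response: safety factor of capacity over demand.
def response(load, strength):
    return strength / load

samples = []
for _ in range(20_000):
    load     = random.gauss(100.0, 10.0)     # uncertain demand
    strength = random.uniform(140.0, 180.0)  # uncertain capacity range
    samples.append(response(load, strength))

mean_sf = statistics.fmean(samples)
p_fail  = sum(s < 1.0 for s in samples) / len(samples)
print(mean_sf, p_fail)
```

The same propagation loop underlies the stochastic-programming and scenario methods the paper compares; they differ mainly in how the sampled scenarios are selected and weighted inside the optimization.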

Keywords: uncertainty, Monte Carlo simulation, stochastic programming, scenario method

Procedia PDF Downloads 414
490 Distribution Network Optimization by Optimal Placement of Photovoltaic-Based Distributed Generation: A Case Study of the Nigerian Power System

Authors: Edafe Lucky Okotie, Emmanuel Osawaru Omosigho

Abstract:

This paper examines the impacts of introducing distributed energy generation (DEG) technology into the Nigerian power system as an alternative means of energy generation at the distribution end, using the Otovwodo 15 MVA, 33/11 kV injection substation as a case study. The overall idea is to increase the generated energy in the system, improve the voltage profile, and reduce system losses. A photovoltaic-based distributed energy generator (PV-DEG) was considered and optimally placed in the network using a genetic algorithm (GA) in the MATLAB/Simulink environment. Simulation results show that the dynamic performance of the network was optimized with DEG-grid integration.
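The GA placement step can be sketched as below. This is a hedged stand-in: the 10-bus feeder loads and the crude loss proxy are invented for illustration (the paper uses the actual Otovwodo network in MATLAB/Simulink), but the chromosome (bus, size), selection, crossover, and mutation loop follow the standard GA pattern.

```python
import random

random.seed(42)
LOADS = [0.0, 0.8, 0.5, 1.2, 0.9, 0.4, 1.5, 0.7, 0.6, 1.0]  # MW per bus

def losses(bus, size):
    # Crude proxy: losses fall when the DG supplies nearby load locally,
    # and rise again if the unit is oversized (reverse power flow).
    relief = sum(l / (1 + abs(i - bus)) for i, l in enumerate(LOADS))
    return 0.35 - 0.04 * min(size, relief) + 0.02 * max(0.0, size - relief)

def fitness(ind):
    return -losses(*ind)            # GA maximises fitness = minimises losses

def mutate(ind):
    bus, size = ind
    if random.random() < 0.3:
        bus = random.randrange(len(LOADS))
    return (bus, max(0.1, size + random.gauss(0, 0.5)))

# Chromosome = (candidate bus, DG size in MW).
pop = [(random.randrange(len(LOADS)), random.uniform(0.5, 6.0))
       for _ in range(30)]
for _ in range(40):                 # generations
    pop.sort(key=fitness, reverse=True)
    elite = pop[:10]                # elitist selection
    children = []
    while len(children) < 20:       # crossover (bus from a, size from b)
        a, b = random.sample(elite, 2)
        children.append(mutate((a[0], b[1])))
    pop = elite + children

best = max(pop, key=fitness)
print(best, losses(*best))
```

In the real study the fitness evaluation would be a load-flow run returning actual losses and voltage-profile penalties rather than this analytic proxy.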

Keywords: distributed energy generation (DEG), genetic algorithm (GA), power quality, total load demand, voltage profile

Procedia PDF Downloads 84
489 Probabilistic Life Cycle Assessment of the Nano Membrane Toilet

Authors: A. Anastasopoulou, A. Kolios, T. Somorin, A. Sowale, Y. Jiang, B. Fidalgo, A. Parker, L. Williams, M. Collins, E. J. McAdam, S. Tyrrel

Abstract:

Developing countries are nowadays confronted with great challenges related to domestic sanitation services in view of imminent water scarcity. Contemporary sanitation technologies established in these countries are likely to pose health risks unless waste management standards are followed properly. This paper provides a solution for sustainable sanitation through the development of an innovative toilet system, the Nano Membrane Toilet (NMT), which has been developed by Cranfield University and sponsored by the Bill & Melinda Gates Foundation. This technology converts human faeces into energy through gasification and provides treated wastewater from urine through membrane filtration. To evaluate the environmental profile of the NMT system, a deterministic life cycle assessment (LCA) was conducted in SimaPro software employing the ecoinvent v3.3 database. The study determined the factors contributing most to the environmental footprint of the NMT system. However, since sensitivity analysis identified certain operating parameters as critical to the robustness of the LCA results, adopting a stochastic approach to the life cycle inventory (LCI) can comprehensively capture the input data uncertainty and enhance the credibility of the LCA outcome. For that purpose, Monte Carlo simulations, in combination with an artificial neural network (ANN) model, were conducted for the input parameters of raw material, produced electricity, NOX emissions, amount of ash, and transportation of fertilizer. The analysis provides the distributions and confidence intervals of the selected impact categories, from which more credible conclusions are drawn on the life cycle impact assessment (LCIA) profile of the NMT system. The study also yields essential insights into a methodological framework that can be adopted in the environmental impact assessment of other complex engineering systems subject to a high level of input data uncertainty.
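The stochastic-LCI step can be sketched as follows: each uncertain inventory flow is sampled from a lognormal distribution (the form commonly assumed for ecoinvent flows) and propagated to a single impact score, from which a confidence interval is read off. The flow names, geometric means and sigmas, and characterisation factors are hypothetical placeholders, not the paper's data.

```python
import math
import random
import statistics

random.seed(7)

# flow name: (geometric mean, geometric std dev, characterisation factor)
# All values are illustrative assumptions, not NMT inventory data.
FLOWS = {
    'electricity_kWh': (1.80, 1.10, 0.60),
    'NOx_kg':          (0.02, 1.30, 298.0),
    'transport_tkm':   (0.50, 1.20, 0.12),
}

def sample_impact():
    """One Monte Carlo draw of the total impact score (e.g. kg CO2-eq)."""
    total = 0.0
    for gmean, gsd, cf in FLOWS.values():
        flow = math.exp(random.gauss(math.log(gmean), math.log(gsd)))
        total += flow * cf
    return total

scores = sorted(sample_impact() for _ in range(10_000))
lo, hi = scores[249], scores[9749]   # ~95% confidence interval
mean = statistics.fmean(scores)
print(f"mean={mean:.3f}, 95% CI=({lo:.3f}, {hi:.3f})")
```

In the paper this sampling loop feeds a trained ANN surrogate of the LCA model rather than an explicit sum, which makes the thousands of draws cheap to evaluate.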

Keywords: sanitation systems, nano membrane toilet, LCA, stochastic uncertainty analysis, Monte Carlo simulations, artificial neural network

Procedia PDF Downloads 225
488 Operational Advantages of Tungsten Inert Gas over Metal Inert Gas Welding Process

Authors: Emmanuel Ogundimu, Esther Akinlabi, Mutiu Erinosho

Abstract:

In this research, studies were carried out on the material characterization of type 304 austenitic stainless steel welds produced by the TIG (tungsten inert gas) and MIG (metal inert gas) welding processes. The research aims to establish optimized process parameters that result in a defect-free weld joint. A homogeneous distribution of iron (Fe), chromium (Cr) and nickel (Ni) was observed at the welded joints of all six samples. The welded sample produced at a current of 170 A by the TIG welding process had the highest ultimate tensile strength (UTS) value of 621 MPa at the weld zone, while the welded sample produced by the MIG process at a welding current of 150 A had the lowest UTS value of 568 MPa. Overall, it was established that the TIG welding process is more appropriate than the MIG welding process for welding type 304 austenitic stainless steel.

Keywords: microhardness, microstructure, tensile strength, shear stress, MIG welding, TIG welding, TIG-MIG welding

Procedia PDF Downloads 192