Search results for: analytical validation
745 Development of Soil Test Kits to Determine Organic Matter, Available Phosphorus, and Exchangeable Potassium in Thailand
Authors: Charirat Kusonwiriyawong, Supha Photichan, Wannarut Chutibutr
Abstract:
Soil test kits for rapid analysis of organic matter, available phosphorus, and exchangeable potassium were developed to deliver a low-cost field testing kit to farmers. The objective was to provide a decision tool for improving soil fertility. One aspect of soil test kit development was ease of use, i.e., the time required to complete the organic matter, available phosphorus, and exchangeable potassium tests on one soil sample. The kit requires only two extractions, uses no filtration, and takes approximately 15 minutes per sample. Organic matter was determined principally by oxidizing carbon with KMnO₄ and reading the result against a standard color chart. In addition, a modified single extractant (Mehlich I) was applied to extract available phosphorus and exchangeable potassium. The molybdenum blue method and a turbidimetric method, each read against a standard color chart, were adapted to analyze available phosphorus and exchangeable potassium, respectively. Results obtained with the modified single extractant in the soil test kits showed highly significant agreement with analytical laboratory results (r = 0.959** and 0.945** for available phosphorus and exchangeable potassium, respectively). Linear regressions were calculated between the modified single extractant and standard laboratory analysis (y = 0.9581x - 12.973 for available phosphorus and y = 0.5372x + 15.283 for exchangeable potassium). These equations were calibrated to formulate fertilizer rate recommendations for specific crops. To validate quality, soil test kits were distributed to farmers and extension workers. We found that the accuracy of the soil test kits was 71.0%, 63.9%, and 65.5% for organic matter, available phosphorus, and exchangeable potassium, respectively. A quantitative survey was also conducted to assess satisfaction with the soil test kits. The survey showed that more than 85% of respondents found these testing kits more convenient, economical, and reliable than other commercial soil test kits.
Based upon the findings of this study, soil test kits can be an alternative for providing soil analysis and fertility recommendations when a soil testing laboratory is not available.
Keywords: available phosphorus, exchangeable potassium, modified single extractant, organic matter, soil test kits
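As an illustration (not part of the study's own tooling), the reported calibration lines can be applied in a few lines of code to convert a kit reading x into a laboratory-equivalent value y; the direction of the fit and the concentration units are assumptions, while the coefficients are the ones reported above:

```python
def lab_equivalent(kit_reading, slope, intercept):
    """Apply a reported linear calibration y = slope*x + intercept to a kit
    reading x, giving a laboratory-equivalent nutrient value (units assumed)."""
    return slope * kit_reading + intercept

# Coefficients reported in the abstract
P_SLOPE, P_INTERCEPT = 0.9581, -12.973   # available phosphorus
K_SLOPE, K_INTERCEPT = 0.5372, 15.283    # exchangeable potassium

print(round(lab_equivalent(100.0, P_SLOPE, P_INTERCEPT), 2))  # 82.84
print(round(lab_equivalent(100.0, K_SLOPE, K_INTERCEPT), 2))  # 69.0
```

In practice, the converted value would then be looked up in the crop-specific fertilizer recommendation table the study calibrated.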
Procedia PDF Downloads 146
744 Terahertz Glucose Sensors Based on Photonic Crystal Pillar Array
Authors: S. S. Sree Sanker, K. N. Madhusoodanan
Abstract:
Optical biosensors are a dominant alternative to traditional analytical methods because of their small size, simple design, and high sensitivity. Photonic sensing is one of the recent advances in biosensor technology. It measures the change in refractive index induced by differences in molecular interactions as the analyte concentration changes. Glucose is an aldose monosaccharide that serves as a metabolic energy source in many organisms. Terahertz waves occupy the region between infrared and microwaves in the electromagnetic spectrum. They are expected to be applied in various types of sensors for detecting harmful substances in blood, cancer cells in skin, and microbacteria in vegetables. We have designed glucose sensors using silicon-based 1D and 2D photonic crystal pillar arrays in the terahertz frequency range. The 1D photonic crystal has rectangular pillars with a height of 100 µm, length of 1600 µm, and width of 50 µm; the array period of the crystal is 500 µm. The 2D photonic crystal has a 5×5 cylindrical pillar array with an array period of 75 µm; the height and diameter of the pillars are 160 µm and 100 µm, respectively. The two samples considered in this work are blood and a glucose solution, labelled sample 1 and sample 2, respectively. The proposed sensor detects glucose concentrations in the samples from 0 to 100 mg/dL. For this, the crystal was irradiated with 0.3 to 3 THz waves. By analyzing the obtained S-parameters, the refractive index of the crystal corresponding to a particular glucose concentration was measured using the parameter retrieval method. The refractive indices of the two crystals decreased gradually with increasing glucose concentration in the sample. For the 1D photonic crystal, a gradual decrease in refractive index was observed at 1 THz; the 2D photonic crystal showed this behavior at 2 THz. The proposed sensor was simulated using CST Microwave Studio.
This will enable us to develop a model that can be used to characterize a glucose sensor. The present study is expected to contribute to blood glucose monitoring.
Keywords: CST microwave studio, glucose sensor, photonic crystal, terahertz waves
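The parameter retrieval step mentioned above can be sketched for the simplest case, a homogeneous non-magnetic slab at normal incidence, using the standard relation cos(n·k0·d) = (1 − S11² + S21²)/(2·S21). This is an illustrative stand-in for the full CST workflow, and the slab index, frequency, and thickness below are hypothetical values:

```python
import cmath
import math

C = 299_792_458.0  # speed of light in vacuum, m/s

def slab_s_params(n, freq_hz, d):
    """Forward model: S11/S21 of a homogeneous non-magnetic slab of
    refractive index n and thickness d in vacuum, at normal incidence."""
    k0 = 2 * math.pi * freq_hz / C
    r = (1 - n) / (1 + n)           # Fresnel reflection, vacuum -> slab
    t = cmath.exp(1j * n * k0 * d)  # one-way propagation factor
    denom = 1 - r**2 * t**2
    s11 = r * (1 - t**2) / denom
    s21 = (1 - r**2) * t / denom
    return s11, s21

def retrieve_n(s11, s21, freq_hz, d):
    """Standard S-parameter retrieval (principal branch only):
    cos(n*k0*d) = (1 - s11**2 + s21**2) / (2*s21)."""
    k0 = 2 * math.pi * freq_hz / C
    arg = (1 - s11**2 + s21**2) / (2 * s21)
    return cmath.acos(arg) / (k0 * d)

# Round-trip check at 1 THz for a hypothetical 50-um layer with n = 2.0
s11, s21 = slab_s_params(2.0, 1e12, 50e-6)
n_rec = retrieve_n(s11, s21, 1e12, 50e-6)
print(round(n_rec.real, 6))  # ~2.0
```

Real retrievals must also resolve the branch ambiguity of the arccos when n·k0·d exceeds π; the toy thickness here is chosen to stay on the principal branch.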
Procedia PDF Downloads 281
743 Festival Gamification: Conceptualization and Scale Development
Authors: Liu Chyong-Ru, Wang Yao-Chin, Huang Wen-Shiung, Tang Wan-Ching
Abstract:
Although gamification has attracted attention and been applied in the tourism industry, limited literature can be found in the tourism academy. Therefore, to contribute knowledge on festival gamification, it is essential to start by establishing a Festival Gamification Scale (FGS). This study defines festival gamification as the extent to which a festival involves game elements and game mechanisms. Based on self-determination theory, this study developed an FGS through a multi-study method. In study one, five FGS dimensions were sorted through a literature review, followed by twelve in-depth interviews. A total of 296 statements were extracted from the interviews and later narrowed down to 33 items under six dimensions. In study two, 226 survey responses were collected from a cycling festival for exploratory factor analysis, resulting in twenty items under five dimensions. In study three, 253 survey responses were obtained from a marathon festival for confirmatory factor analysis, resulting in the final sixteen items under five dimensions. Results of criterion-related validity then confirmed the positive effects of these five dimensions on flow experience. In study four, to examine the model extension of the developed five-dimensional, 16-item FGS, which includes the dimensions of relatedness, mastery, competence, fun, and narratives, cross-validation analysis was performed using 219 survey responses from a religious festival. For the tourism academy, the FGS could further be applied in other sub-fields such as destinations, theme parks, cruise trips, or resorts. The FGS serves as a starting point for examining the mechanism of festival gamification in changing tourists' attitudes and behaviors. Future studies could follow up on the FGS by testing outcomes of festival gamification or examining moderating effects that enhance those outcomes.
On the other hand, although the FGS has been tested in cycling, marathon, and religious festivals, the research settings are all in Taiwan. Cultural differences in the FGS are another direction for contributing knowledge on festival gamification. This study also offers several valuable practical implications. First, the FGS could be used in tourist surveys to evaluate the extent of gamification of a festival. Based on the results of a performance assessment by the FGS, festival management organizations and festival planners could learn the relative scores among FGS dimensions and plan future improvements in gamifying the festival. Second, the FGS could be applied in positioning a gamified festival. Festival management organizations and festival planners could first consider the features and type of their festival, and then gamify it by investing resources in key FGS dimensions.
Keywords: festival gamification, festival tourism, scale development, self-determination theory
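Scale-development workflows of this kind routinely report an internal-consistency check alongside the EFA/CFA steps. As a hedged illustration only (Cronbach's alpha is not reported in the abstract, and the respondent data below are invented), the coefficient can be computed as:

```python
def cronbach_alpha(responses):
    """Cronbach's alpha for a scale: responses is a list with one inner list
    of item scores per respondent. alpha = k/(k-1) * (1 - sum(item variances)
    / variance(total score)), using population variances."""
    k = len(responses[0])  # number of items

    def var(xs):
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / len(xs)

    item_vars = [var([resp[i] for resp in responses]) for i in range(k)]
    totals = [sum(resp) for resp in responses]
    return k / (k - 1) * (1 - sum(item_vars) / var(totals))

# Invented 2-item, 3-respondent toy data; perfectly consistent items give 1.0
print(cronbach_alpha([[1, 1], [2, 2], [3, 3]]))  # 1.0
```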
Procedia PDF Downloads 147
742 Analysis of the Evolution of Landscape Spatial Patterns in Banan District, Chongqing, China
Authors: Wenyang Wan
Abstract:
The study of urban land use and landscape patterns is a current hotspot in the fields of planning and design, ecology, etc., and is of great significance for constructing a city's overall humanistic ecosystem and optimizing its spatial structure. Banan District, as the main part of the eastern eco-city planning of Chongqing Municipality, is a highland for highlighting the ecological characteristics of Chongqing, realizing effective transformation of ecological value, and promoting the integrated development of urban and rural areas. The analytical methods of the land use transfer matrix (GIS) and landscape pattern indices (Fragstats) were used to study the characteristics and laws of the evolution of land use and landscape pattern in Banan District from 2000 to 2020, providing a reference for Banan District in alleviating its ecological landscape contradictions. The results show that: ① Banan District is rich in land use types, with cultivated land still accounting for 57.15% of the total landscape area in 2020, an absolute majority of the district's land use structure; ② from 2000 to 2020, land use conversion in Banan District ranked as cropland > woodland > grassland > shrubland > built-up land > water bodies > wetlands, with the conversion of cropland to built-up land being the largest; ③ from 2000 to 2020, the landscape elements of Banan District were distributed in a balanced way and the landscape types were rich and diversified, but under the influence of human disturbance, the shapes of landscape elements tended to become irregular, the dominant patches were scattered, and patch connectivity was poor.
It is recommended that in future regional ecological construction, the layout be rationally optimized, the relationships between landscape components coordinated, the connectivity between landscape patches strengthened, and the degree of landscape fragmentation reduced.
Keywords: land use transfer, landscape pattern evolution, GIS and Fragstats, Banan district
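The land use transfer matrix underlying results ② above is simply a cross-tabulation of two co-registered classified maps. A minimal sketch, with invented class labels and a toy 3×3 raster (real inputs would be the 2000 and 2020 classification rasters):

```python
from collections import Counter

def transfer_matrix(map_t1, map_t2, classes):
    """Cross-tabulate two co-registered classified rasters (given as flat
    lists of per-pixel class labels) into a land use transfer matrix:
    rows = class at t1, columns = class at t2, cell = pixels converted."""
    counts = Counter(zip(map_t1, map_t2))
    return [[counts.get((a, b), 0) for b in classes] for a in classes]

# Toy labels: 'C' cropland, 'B' built-up, 'W' woodland (hypothetical)
t1 = ['C', 'C', 'C', 'W', 'W', 'C', 'C', 'B', 'B']
t2 = ['C', 'B', 'B', 'W', 'W', 'C', 'B', 'B', 'B']
m = transfer_matrix(t1, t2, ['C', 'B', 'W'])
print(m)  # [[2, 3, 0], [0, 2, 0], [0, 0, 2]] -> 3 cropland pixels became built-up
```

Multiplying each cell by the pixel area gives the converted areas that GIS packages report.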
Procedia PDF Downloads 73
741 Using Geographic Information Systems Techniques and Multi-Source Earth Observation Data to Study the Trends of Urban Expansion in Welayat Barka, Sultanate of Oman, during the Period from 2002 to 2019
Authors: Eyad H. R. Fadda, Jawaher K. Al Rashdieah, Aysha H. Al Rashdieh
Abstract:
Urban sprawl is a phenomenon that many regions in the Sultanate of Oman suffer from in general, and Welayat Barka in particular. It is a human phenomenon that causes many negative effects and has clearly intensified in recent years; this study aims to diagnose the current status of urban growth taking place in Welayat Barka. The objective is to monitor and follow up on the most prominent changes and developments in Barka between 2002 and 2019 and to provide suggestions to decision-makers to reduce the phenomenon's negative effects. The study methodology relies on a descriptive-analytical approach to describe and analyze the phenomenon and identify the factors that contributed to urban expansion in Barka, drawing on a number of studies and interviews with specialists in governmental and private institutions, as well as with individuals who own land and real estate. Geographic Information Systems (GIS) and remote sensing (ERDAS software) were used to analyze satellite images, yielding results that reflect the changes in Barka, in addition to identifying the natural and human determinants constraining urban sprawl. The study concluded that Barka's geographical location plays a significant role in its urban expansion, as it is the closest wilayat to the capital Muscat; the expansion continues toward the south and south-west and has significant negative effects, represented by the loss of agricultural land due to continuous changes in land use. In addition, it was found that there are two natural determinants of urban expansion in Barka: land taken from the Sea of Oman and from the western sands.
Keywords: GIS applications, remote sensing, urbanization, urban sprawl expansion trends
Procedia PDF Downloads 111
740 Rapid Soil Classification Using Computer Vision, Electrical Resistivity and Soil Strength
Authors: Eugene Y. J. Aw, J. W. Koh, S. H. Chew, K. E. Chua, Lionel L. J. Ang, Algernon C. S. Hong, Danette S. E. Tan, Grace H. B. Foo, K. Q. Hong, L. M. Cheng, M. L. Leong
Abstract:
This paper presents a novel rapid soil classification technique that combines computer vision with the four-probe soil electrical resistivity method and the cone penetration test (CPT) to improve the accuracy and productivity of on-site classification of excavated soil. In Singapore, excavated soils from local construction projects are transported to Staging Grounds (SGs) to be reused as fill material for land reclamation. Excavated soils are mainly categorized into two groups ("Good Earth" and "Soft Clay") based on particle size distribution (PSD) and water content (w) from soil investigation reports and on-site visual surveys, so that proper treatment and usage can be exercised. However, this process is time-consuming and labour-intensive; thus, a rapid classification method is needed at the SGs. Computer vision, four-probe soil electrical resistivity and CPT were combined into an innovative, non-destructive and instantaneous classification method for this purpose. The computer vision technique comprises soil image acquisition using an industrial-grade camera; image processing and analysis via calculation of Grey Level Co-occurrence Matrix (GLCM) textural parameters; and decision-making using an Artificial Neural Network (ANN). Complementing the computer vision technique, the apparent electrical resistivity of the soil (ρ) is measured using a set of four probes arranged in a Wenner array. A previous study found that the ANN model coupled with ρ can classify soils into "Good Earth" and "Soft Clay" in less than a minute, with an accuracy of 85% on selected representative soil images. To further improve the technique, the soil strength is measured using a modified mini cone penetrometer, and w is measured using a set of time-domain reflectometry (TDR) probes. A laboratory proof of concept was conducted through a series of seven tests with three types of soils: "Good Earth", "Soft Clay" and an even mix of the two.
Validation was performed against the PSD and w of each soil type obtained from conventional laboratory tests. The results show that ρ, w, and CPT measurements can be collectively analyzed to classify soils into "Good Earth" or "Soft Clay". It is also found that these parameters can be integrated with the computer vision technique on-site to complete the rapid soil classification in less than three minutes.
Keywords: computer vision technique, cone penetration test, electrical resistivity, rapid and non-destructive, soil classification
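The four-probe Wenner measurement reduces to a one-line formula, ρ = 2πaV/I. A minimal sketch (the probe spacing and readings below are hypothetical, not values from the study):

```python
import math

def wenner_apparent_resistivity(spacing_m, voltage_v, current_a):
    """Apparent soil resistivity (ohm-m) from a four-probe Wenner array:
    rho = 2 * pi * a * V / I, where a is the equal probe spacing, V the
    voltage across the inner electrode pair, and I the injected current."""
    return 2 * math.pi * spacing_m * voltage_v / current_a

# Hypothetical reading: a = 0.05 m spacing, 0.8 V measured at 10 mA
rho = wenner_apparent_resistivity(0.05, 0.8, 0.01)
print(round(rho, 2))  # 25.13 ohm-m
```

In the classification pipeline, a value like this would be one input feature alongside the GLCM textural parameters, w, and the cone resistance.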
Procedia PDF Downloads 219
739 Structural Challenges of Social Integration of Immigrants in Iran: Investigating the Status of Providing Citizenship and Social Services
Authors: Iman Shabanzadeh
Abstract:
In terms of its geopolitical position, Iran has been one of the main centers of migration movements in the world in recent decades. However, policymakers' lack of preparation in completing the cycle of social integration of these immigrants, especially the second and third generations, has left these people persistently prone to leaving the country and emigrating to developed, industrialized countries. In this research, the integration of immigrants in Iran is analyzed from the perspective of four indicators: "identity documents", "access to banking services", "access to health and treatment services", and "obtaining a driver's license". The research method is descriptive-analytical. To collect information, library and documentary sources on the laws and regulations related to immigrants' rights in Iran and semi-structured interviews with experts were used. The investigations of this study show that none of the residence documents held by immigrants in Iran guarantees full enjoyment of basic citizenship rights. In fact, many of these identity documents, such as the census card and the educational support card, function only to prevent crossing the border, and none of them guarantees basic citizenship rights. Therefore, for many immigrants, the difference between legality and illegality lies only in the risk of crossing the border, and this has normalized illegal presence for them. Despite this, there seems to be no clear and coherent policy framework around the issue of foreign immigrants in the country. This policy incoherence is clearly visible in the diversity and plurality of identity and legal documents held by those present in the country and in policymakers' failure to plan for integrating and organizing the identity of this large group.
Examining the differences and socioeconomic inequalities between immigrants and the native Iranian population shows that immigrants have been poorly integrated into the structures of Iranian society, both economically and socially.
Keywords: immigrants, social integration, citizen services, structural inequality
Procedia PDF Downloads 44
738 Dislocation Density-Based Modeling of the Grain Refinement in Surface Mechanical Attrition Treatment
Authors: Reza Miresmaeili, Asghar Heydari Astaraee, Fereshteh Dolati
Abstract:
In the present study, an analytical model based on a dislocation density model was developed to simulate grain refinement in surface mechanical attrition treatment (SMAT). The correlation between SMAT time and the development of plastic strain, on the one hand, and dislocation density evolution, on the other, was established to simulate grain refinement in SMAT. A dislocation density-based constitutive material law was implemented via a VUHARD subroutine. A random sequence of shots is considered in the multiple-impact model, generated in Python using a random function. The simulation technique was to model each impact in a separate run and then transfer the results of each run as initial conditions for the next run (impact). The developed finite element (FE) model of multiple impacts describes the coverage evolution in SMAT. Simulations were run to coverage levels as high as 4500%. It is shown that the coverage implemented in the FE model equals the experimental coverage, and that the numerical SMAT coverage parameter conforms well to the well-known Avrami model. Comparison between numerical results and experimental measurements of residual stresses and the depth of the deformation layers confirms the performance of the established FE model for surface engineering evaluations in SMA treatment. X-ray diffraction (XRD) studies of grain refinement, including the resultant grain size and dislocation density, were conducted to validate the established model; the full width at half-maximum of the XRD profiles can be used to measure the grain size. Numerical results and experimental measurements of grain refinement are in good agreement and show the capability of the established FE model to predict the gradient microstructure in SMA treatment.
Keywords: dislocation density, grain refinement, severe plastic deformation, simulation, surface mechanical attrition treatment
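The Avrami coverage law commonly used in peening, against which the numerical coverage parameter is compared, can be sketched as follows; the area ratio below is a hypothetical value, not one fitted in the study:

```python
import math

def avrami_coverage(n_impacts, area_ratio):
    """Avrami-type coverage law often used for shot/SMAT peening:
    C = 100 * (1 - exp(-Ar * n)) percent, where Ar is the ratio of a single
    indentation's area to the treated surface area and n the impact count."""
    return 100.0 * (1.0 - math.exp(-area_ratio * n_impacts))

# Coverage saturates asymptotically toward 100%; a nominal "4500% coverage"
# therefore means 45x the impact count defining 100% nominal coverage,
# not a physical fraction above one.
for n in (10, 50, 100, 500):
    print(n, round(avrami_coverage(n, 0.01), 1))
```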
Procedia PDF Downloads 136
737 A Study on Inverse Determination of Impact Force on a Honeycomb Composite Panel
Authors: Hamed Kalhori, Lin Ye
Abstract:
In this study, an inverse method was developed to reconstruct the magnitude and duration of impact forces applied to a rectangular carbon fibre-epoxy composite honeycomb sandwich panel. The dynamic signals captured by piezoelectric (PZT) sensors installed on the panel, remote from the impact locations, were used to reconstruct the impact force generated by an instrumented hammer through an extended deconvolution approach. Two discretized forms of the convolution integral are considered: the traditional one with an explicit transfer function and a modified one without an explicit transfer function. Deconvolution, usually applied to reconstruct the time history (e.g., magnitude) of a stochastic force at a defined location, is extended here to identify both the location and the magnitude of the impact force among a number of potential impact locations. It is assumed that impact forces are simultaneously exerted at all potential locations but that the magnitude of all forces except one is zero, implying that the impact occurs at only one location. The extended deconvolution is then applied to determine the magnitude as well as the location (among the potential ones), incorporating the linear superposition of the responses resulting from an impact at each potential location. The problem can be categorized as under-determined (fewer sensors than impact locations), even-determined (as many sensors as impact locations), or over-determined (more sensors than impact locations). The under-determined case studied here comprises three potential impact locations and one PZT sensor on the rectangular carbon fibre-epoxy composite honeycomb sandwich panel. Assessments are conducted to evaluate the factors affecting the precision of the reconstructed force.
Truncated singular value decomposition (TSVD) and Tikhonov regularization are independently applied to regularize the problem, to find the most suitable method for this system. The selection of the optimal regularization parameter is investigated through the L-curve and generalized cross-validation (GCV) methods. In addition, the effect of different signal window widths on the reconstructed force is examined. It is observed that the impact force generated by the instrumented hammer is sensitive to the impact location on the structure, with a shape ranging from a simple half-sine to a complicated one. The accuracy of the reconstructed impact force is evaluated using the correlation coefficient between the reconstructed and actual forces. By this criterion, the forces reconstructed using the extended deconvolution without an explicit transfer function, together with Tikhonov regularization, match the actual forces well in terms of magnitude and duration.
Keywords: honeycomb composite panel, deconvolution, impact localization, force reconstruction
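As a hedged illustration of the Tikhonov branch of such a method (the impulse response and half-sine force below are invented; the actual panel responses come from the PZT sensors), the discrete convolution can be inverted via x = (AᵀA + λI)⁻¹Aᵀy:

```python
import numpy as np

def tikhonov_deconvolve(h, y, lam):
    """Reconstruct an input force x from a measured response y = h * x
    (discrete convolution) via Tikhonov regularization:
    x = argmin ||A x - y||^2 + lam*||x||^2 = (A^T A + lam*I)^-1 A^T y,
    where A is the lower-triangular Toeplitz matrix of impulse response h."""
    n = len(y)
    A = np.zeros((n, n))
    for i in range(n):
        for j in range(i + 1):
            if i - j < len(h):
                A[i, j] = h[i - j]
    return np.linalg.solve(A.T @ A + lam * np.eye(n), A.T @ y)

# Toy example: a half-sine impact convolved with a decaying impulse response
t = np.arange(64)
x_true = np.where(t < 8, np.sin(np.pi * t / 8), 0.0)  # half-sine force
h = np.exp(-0.3 * t[:20])                             # assumed impulse response
y = np.convolve(h, x_true)[:64]
x_rec = tikhonov_deconvolve(h, y, 1e-6)
print(float(np.abs(x_rec - x_true).max()) < 1e-3)  # True: small lam, clean data
```

With noisy data, λ must be increased, and this is exactly the trade-off the L-curve and GCV criteria above are used to resolve.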
Procedia PDF Downloads 535
736 Simulation Study on Polymer Flooding with Thermal Degradation in Elevated-Temperature Reservoirs
Authors: Lin Zhao, Hanqiao Jiang, Junjian Li
Abstract:
Polymers injected into elevated-temperature reservoirs inevitably suffer thermal degradation, resulting in severe viscosity loss and poor flooding performance. However, for polymer flooding in such reservoirs, present simulators fail to provide accurate results because they lack a description of thermal degradation. In light of this, the objectives of this paper are to provide a simulation model for polymer flooding with thermal degradation and to study the effect of thermal degradation on polymer flooding in elevated-temperature reservoirs. First, a thermal degradation experiment was conducted to obtain the degradation laws of polymer concentration and viscosity: different types of polymers were degraded in a thermostatic tank at elevated temperatures. Afterward, based on the obtained laws, a streamline-assisted model was proposed to simulate the degradation process under in-situ flow conditions. Model validation was performed with field data from a well group in an offshore oilfield. Finally, the effect of thermal degradation on polymer flooding was studied using the proposed model. Experimental results showed that the polymer concentration remained unchanged, while the viscosity degraded exponentially with time. The polymer viscosity is a function of the polymer degradation time (PDT), the elapsed time since the injection of the polymer particle, so tracing the real flow path of each polymer particle is required; the presented simulation model is therefore streamline-assisted. An equation relating PDT to the time of flight (TOF) along a streamline was built from the law of polymer particle transport. Against field polymer samples and dynamic data, the new model proved its accuracy.
The study of the degradation effect on polymer flooding indicated that: (1) the viscosity loss increased exponentially with TOF in the main body of the polymer slug and remained constant at the slug front; (2) the response time of polymer flooding was delayed, but the effective time was prolonged; (3) the breakthrough of subsequent water was eased; (4) the capacity of the polymer to adjust the injection profile was diminished; (5) the incremental recovery was reduced significantly. In general, the effect of thermal degradation on polymer flooding performance is decidedly negative. This paper provides a more comprehensive insight into polymer thermal degradation in both the physical process and field application. The proposed simulation model offers an effective means of simulating the polymer flooding process with thermal degradation. The negative effect of thermal degradation suggests that polymer thermal stability should be given full consideration when designing a polymer flooding project in elevated-temperature reservoirs.
Keywords: polymer flooding, elevated-temperature reservoir, thermal degradation, numerical simulation
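The exponential viscosity law and its mapping onto streamline time of flight can be sketched as follows; the initial viscosity and degradation rate are hypothetical values, not the experimental fit:

```python
import math

def degraded_viscosity(mu0, k_deg, pdt):
    """Exponential degradation law consistent with the reported experiment:
    concentration is unchanged while viscosity decays with polymer
    degradation time (PDT), mu(PDT) = mu0 * exp(-k_deg * PDT).
    mu0 (mPa.s) and k_deg (1/day) are hypothetical values."""
    return mu0 * math.exp(-k_deg * pdt)

# Under steady flow, a particle currently at a point with time of flight
# (TOF) tau from the injector was injected tau days ago, so PDT = TOF and
# the viscosity profile can be mapped directly onto TOF along a streamline.
mu0, k = 40.0, 0.02           # hypothetical: 40 mPa.s initial, 0.02 1/day
for tof in (0, 30, 90, 180):  # days
    print(tof, round(degraded_viscosity(mu0, k, tof), 1))
```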
Procedia PDF Downloads 143
735 Assessment of the Effect of Ethanolic Leaf Extract of Annona squamosa L. on DEN-Induced Hepatocellular Carcinoma in Experimental Animals
Authors: Vanitha Varadharaj, Vijalakshmi Krishnamurthy
Abstract:
Annona squamosa Linn., commonly known as sugar apple and belonging to the family Annonaceae, is said to show varied medicinal effects, including insecticidal, antiovulatory, and abortifacient activity. The alkaloids and flavonoids present in Annona squamosa leaf have proven antioxidant activity. The present work was planned to investigate the effect of an ethanolic extract of Annona squamosa leaf on DEN-induced Wistar albino rats. The study analyzed biochemical parameters such as total proteins, bilirubin, enzymatic and non-enzymatic antioxidants, marker enzymes, and tumor markers in serum, along with histopathological studies of the liver, in control and DEN-induced rats. Supplementation with ELAS (ethanolic leaf extract of Annona squamosa) reduced the liver weight and also reduced tumor incidence. The chemoprevention group showed near-normal bilirubin values compared with the control rats. Total protein was decreased in the cancer-bearing group, and on treatment with the extract the protein levels were restored. In both the pre- and post-treatment groups, the activities of enzymatic antioxidants such as superoxide dismutase, catalase, and glutathione peroxidase were increased, with pre-treatment more effective than post-treatment. The non-enzymatic antioxidants vitamin C and vitamin E were brought back to normal levels significantly in post- and pre-treated animals. The activities of marker enzymes such as SGOT, SGPT, ALP, and γ-GT were significantly elevated in the serum of cancer-bearing animals, and the values returned to normal after treatment with the extract, suggesting the hepatoprotective effect of the extract. Lipid peroxides were elevated in the cancer-induced group; this condition was brought back to normal in the pre- and post-treated animals given ELAS.
Histological examination also confirmed the anticarcinogenic potential of ELAS. Cancer-induced groups had a threefold increase in their AFP values compared with the other groups: DEN treatment increased the level of AFP expression, while ELAS partially counteracted this effect. The scientific validation obtained from this study may thus pave the way for finding new drugs from Annona squamosa for various ailments.
Keywords: annona squamosa, biochemical parameters, cancer, leaf extract
Procedia PDF Downloads 331
734 Global Learning Supports Global Readiness with Projects with Purpose
Authors: Brian Bilich
Abstract:
A typical global learning program is a two-week, project-based, culturally immersive, and academically relevant experience built around a project with purpose and catered to student and business groups. Global Learning in Continuing Education at Austin Community College promotes global readiness through projects with purpose, with special attention given to balancing learning, hospitality, and travel. A recent project involved Community First! Village, a 51-acre planned community that provides affordable, permanent housing for men and women coming out of chronic homelessness. Global learning students collaborated with residents and staff at the Community First! Village on a project to produce two-dimensional remodeling plans of residents' tiny homes, with a focus on (but not limited to) design improvements related to accessibility, increased usability of living and storage space, and aesthetic upgrades to boost psychological and emotional appeal. The goal of project-based learning in the context of global learning in Continuing Education at Austin Community College is twofold. First, in rapid fashion we develop a project that gives the learner a hands-on opportunity to exercise soft and technical skills, such as creativity, communication, and analytical thinking. Second, by basing projects on global social conflict issues, the project with purpose promotes the development of empathy for other people and fosters a sense of corporate social responsibility in future generations of business leadership. In the example provided above, the project informed the student group on the topic of chronic homelessness and promoted awareness of and empathy for this underserved segment of the community.
Project-based global learning built on projects with purpose has the potential to cultivate global readiness by developing empathy and strengthening emotional intelligence in future generations.
Keywords: project-based learning, global learning, global readiness, globalization, international exchange, collaboration
Procedia PDF Downloads 64
733 The Use of TRIZ to Map the Evolutive Pattern of Products
Authors: Fernando C. Labouriau, Ricardo M. Naveiro
Abstract:
This paper presents a model for mapping the evolutive pattern of products in order to generate new ideas, perceive emerging technologies, and manage product portfolios in new product development (NPD). According to the proposed model, information extracted from the patent system is filtered and analyzed with TRIZ tools to produce the input information for the NPD process. The authors acknowledge that the NPD process is well integrated within enterprises' strategic business planning and that new products are vital in today's competitive market. On the other hand, the proactive use of patent information has been observed in several methodologies for selecting projects, mapping technological change, and generating product concepts. One of these methodologies is TRIZ, a theory created to foster innovation and improve product design, which provided the analytical framework for the model. Initially, an introduction to TRIZ is presented, focused mainly on the patterns of evolution of technical systems and their strategic uses; it is a brief and deliberately non-comprehensive description, as the theory has several other tools widely employed in technical and business applications. Then the model for mapping a product's evolutive pattern is introduced, with its three basic pillars, namely patent information, TRIZ, and NPD, and the methodology for implementation. Following this, a case study of a Brazilian bicycle manufacturer is presented, mapping a product's evolutive pattern by decomposing and analyzing one of its assemblies along ten evolution lines in order to envision opportunities for further product development. Some of these lines are illustrated in more detail to evaluate the product's features against the TRIZ concepts, using a comparison with state-of-the-art patents to validate the product's evolutionary potential.
As a result, the case study provided several opportunities for a product improvement program in different project categories, identifying technical and business impacts as well as indicating the lines of evolution that can benefit most from each opportunity.
Keywords: product development, patents, product strategy, systems evolution
Procedia PDF Downloads 501
732 Micromechanical Modelling of Ductile Damage with a Cohesive-Volumetric Approach
Authors: Noe Brice Nkoumbou Kaptchouang, Pierre-Guy Vincent, Yann Monerie
Abstract:
The present work addresses the modelling and simulation of crack initiation and propagation in ductile materials that fail by void nucleation, growth, and coalescence. One current research framework for crack propagation is the cohesive-volumetric approach, in which crack growth is modelled as the decohesion of two surfaces in a continuum material. In this framework, the material behavior is characterized by two constitutive relations: a volumetric constitutive law relating stress and strain, and a traction-separation law across a two-dimensional surface embedded in the three-dimensional continuum. Several cohesive models have been proposed for the simulation of crack growth in brittle materials; the application of cohesive models to crack growth in ductile materials, on the other hand, is still a relatively open field. One idea developed in the literature is to identify the traction-separation law for ductile materials from the behavior of a continuously-deforming unit cell failing by void growth and coalescence. Following this method, the present study proposes a semi-analytical cohesive model for ductile materials based on a micromechanical approach. The strain localization band preceding ductile failure is modelled as a cohesive band, and the Gurson-Tvergaard-Needleman (GTN) plasticity model is used to describe the behavior of the cohesive band and to derive a corresponding traction-separation law. The numerical implementation of the model uses the non-smooth contact dynamics method (NSCD), in which cohesive models are introduced as mixed boundary conditions between volumetric finite elements. The approach is applied to the simulation of crack growth in nuclear ferritic steel. 
The model provides an alternative way to simulate crack propagation, combining the numerical efficiency of cohesive models with a traction-separation law derived directly from a porous continuum model.
Keywords: ductile failure, cohesive model, GTN model, numerical simulation
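The GTN yield criterion named in the abstract can be sketched numerically. The following is an illustrative evaluation only; the q-parameters are standard literature values, not taken from the paper.

```python
import numpy as np

# Minimal sketch (not the authors' code) of the Gurson-Tvergaard-Needleman
# (GTN) yield function used to model the cohesive band:
# Phi = (sigma_eq/sigma_y)^2 + 2*q1*f*cosh(1.5*q2*sigma_m/sigma_y) - (1 + q3*f^2)
# Yielding occurs when Phi >= 0; q1, q2, q3 are Tvergaard's fitting parameters.

def gtn_yield(sigma_eq, sigma_m, sigma_y, f, q1=1.5, q2=1.0, q3=2.25):
    """Return the GTN yield function value (<0 elastic, >=0 plastic)."""
    return ((sigma_eq / sigma_y) ** 2
            + 2.0 * q1 * f * np.cosh(1.5 * q2 * sigma_m / sigma_y)
            - (1.0 + q3 * f ** 2))

# With zero porosity (f = 0) the criterion reduces to von Mises:
print(gtn_yield(sigma_eq=250.0, sigma_m=100.0, sigma_y=250.0, f=0.0))  # -> 0.0
# A porous material yields earlier under the same stress state:
print(gtn_yield(sigma_eq=250.0, sigma_m=100.0, sigma_y=250.0, f=0.05) > 0.0)  # -> True
```

The growing porosity f is what degrades the traction carried by the cohesive band until decohesion.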
Procedia PDF Downloads 149
731 A Robust Optimization of Chassis Durability/Comfort Compromise Using Chebyshev Polynomial Chaos Expansion Method
Authors: Hanwei Gao, Louis Jezequel, Eric Cabrol, Bernard Vitry
Abstract:
The chassis system is composed of complex elements that take up all the loads from the tire-ground contact area, and thus it plays an important role in numerous specifications such as durability, comfort, crash, etc. During the development of new vehicle projects at Renault, durability validation is always the main focus, while comfort is addressed later in the project. Therefore, design choices sometimes have to be reconsidered because of the natural incompatibility between these two specifications. Robustness is also an important concern, as it is related to manufacturing costs as well as to performance after the ageing of components such as shock absorbers. This paper proposes an approach for a multi-objective optimization between chassis endurance and comfort that takes random factors into consideration. The adaptive-sparse polynomial chaos expansion (PCE) method with Chebyshev polynomial series is applied to predict the uncertainty intervals of a system's responses according to its uncertain-but-bounded parameters. The approach can be divided into three steps. First, an initial design of experiments is carried out to build the response surfaces that statistically represent a black-box system. Secondly, over several iterations, an optimum set is proposed and validated, forming a Pareto front; at the same time, the robustness of each response, serving as an additional objective, is calculated from the pre-defined parameter intervals and the response surfaces obtained in the first step. Finally, an inverse strategy determines the parameter tolerance combination with a maximally acceptable degradation of the responses in terms of manufacturing costs. A quarter-car model has been tested as an example, applying road excitations from actual road measurements for both endurance and comfort calculations. 
One indicator based on Basquin's law is defined to compare the global chassis durability of different parameter settings. Another indicator, related to comfort, is obtained from the vertical acceleration of the sprung mass. An optimum set with the best robustness has finally been obtained, and reference tests confirm the good robustness prediction of the Chebyshev PCE method. This example demonstrates the effectiveness and reliability of the approach, in particular its ability to save computational cost for a complex system.
Keywords: chassis durability, Chebyshev polynomials, multi-objective optimization, polynomial chaos expansion, ride comfort, robust design
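The basic building block of such a response surface, a Chebyshev polynomial fit to a black-box response, can be sketched as follows. The test function and sampling are illustrative only; the paper's adaptive-sparse multivariate scheme is more involved.

```python
import numpy as np
from numpy.polynomial import chebyshev as C

def black_box(x):
    # Stand-in for an expensive simulation response (hypothetical).
    return np.exp(-x**2) * np.cos(3 * x)

# Design of experiments: Chebyshev nodes minimize interpolation error on [-1, 1].
n_samples, degree = 20, 10
x_doe = np.cos(np.pi * (np.arange(n_samples) + 0.5) / n_samples)
y_doe = black_box(x_doe)

coeffs = C.chebfit(x_doe, y_doe, degree)   # least-squares Chebyshev fit
x_test = np.linspace(-1.0, 1.0, 200)
err = np.max(np.abs(C.chebval(x_test, coeffs) - black_box(x_test)))
print(f"max surrogate error on [-1, 1]: {err:.2e}")
```

Once fitted, the cheap surrogate can be swept over the uncertain-but-bounded parameter intervals to bound each response, which is how the robustness objectives described above avoid repeated full simulations.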
Procedia PDF Downloads 152
730 Spectrogram Pre-Processing to Improve Isotopic Identification to Discriminate Gamma and Neutrons Sources
Authors: Mustafa Alhamdi
Abstract:
The industrial application of deep machine learning to the classification of gamma-ray and neutron events is investigated in this study. Identification using a convolutional neural network and a recursive neural network has shown a significant improvement in prediction accuracy in a variety of applications. The ability to identify the isotope type and activity from spectral information depends on feature extraction methods, followed by classification. The features extracted from the spectrum profiles aim to capture patterns and relationships that represent the actual spectrum energy in a low-dimensional space; increasing the separation between classes in feature space improves the achievable classification accuracy. Neural networks extract features through a variety of nonlinear transformations and mathematical optimization, whereas principal component analysis relies on linear transformations to extract features and thereby improve classification accuracy. In this paper, the isotope spectrum information has been pre-processed by computing its frequency components as a function of time and using them as the training dataset. The Fourier transform used to extract the frequency components has been optimized with a suitable windowing function. Training and validation samples of different isotope profiles interacting with a CdTe crystal have been simulated using Geant4, and the readout electronic noise has been simulated by optimizing the mean and variance of a normal distribution. Ensemble learning, combining the votes of many models, improved the classification accuracy of the neural networks, and discriminating gamma and neutron events in a single prediction step has shown high accuracy. The findings show that classification accuracy can be improved by applying the spectrogram pre-processing stage to the gamma and neutron spectra of different isotopes. 
Tuning the deep learning models by hyperparameter optimization enhanced the separation in the latent space and made it possible to extend the number of detected isotopes in the training database. Ensemble learning contributed significantly to improving the final prediction.
Keywords: machine learning, nuclear physics, Monte Carlo simulation, noise estimation, feature extraction, classification
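The spectrogram pre-processing step described above can be sketched as a windowed short-time Fourier transform. The signal shape, window choice and parameters below are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def spectrogram(signal, frame_len=128, hop=64):
    """Return |STFT|^2 frames (time x frequency) using a Hann window."""
    window = np.hanning(frame_len)  # a suitable window reduces spectral leakage
    n_frames = 1 + (len(signal) - frame_len) // hop
    frames = np.stack([signal[i * hop: i * hop + frame_len] * window
                       for i in range(n_frames)])
    return np.abs(np.fft.rfft(frames, axis=1)) ** 2

# Synthetic detector pulse with Gaussian readout noise; the noise mean and
# variance are free parameters, mirroring how the paper simulates electronics.
rng = np.random.default_rng(0)
t = np.arange(4096)
pulse = np.exp(-t / 300.0) * np.sin(2 * np.pi * 0.05 * t)
noisy = pulse + rng.normal(loc=0.0, scale=0.01, size=t.size)

spec = spectrogram(noisy)
print(spec.shape)  # (n_frames, frame_len // 2 + 1) -> (63, 65)
```

Each row of `spec` is a frequency snapshot at one time step; the stacked rows form the time-frequency image fed to the classifier.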
Procedia PDF Downloads 150
729 Main Tendencies of Youth Unemployment and the Regulation Mechanisms for Decreasing Its Rate in Georgia
Authors: Nino Paresashvili, Nino Abesadze
Abstract:
The modern world faces huge challenges. Globalization has changed the socio-economic conditions of many countries, and the current processes in the global environment affect countries with different cultures differently. However, alleviating poverty and improving living conditions remain the basic challenge for the majority of countries, because much of their population still lives below the official poverty threshold. It is very important to stimulate youth employment. In order to prepare young people for the labour market, it is essential to provide them with the appropriate professional skills and knowledge. It is necessary to plan efficient activities for decreasing the unemployment rate and to develop effective mechanisms for regulating the labour market. Such planning requires a thorough study and analysis of the existing situation, as well as the development of the corresponding mechanisms. Statistical analysis of unemployment is one of the main platforms for regulating the labour market's key mechanisms, and the corresponding statistical methods should be used in the study: observation, data gathering, grouping, and calculation of generalized indicators. Unemployment is one of the most severe socio-economic problems in Georgia; according to past as well as current statistics, unemployment rates have always been the most problematic issue for policy makers to resolve, and analytical work on this problem will be the basis for the next sustainable steps towards solving it. The results of the study showed that young people's choices are often driven neither by their inclinations and interests nor by labour market demand; this wrong professional orientation of young people in most cases leads to their unemployment. At the same time, it was shown that a number of professions in the labour market are in high demand because of a deficit of the appropriate specialists. 
To achieve healthy competitiveness in youth employment, it is necessary to formulate regional employment programs that take regional infrastructure specifications into account.
Keywords: unemployment, analysis, methods, tendencies, regulation mechanisms
Procedia PDF Downloads 378
728 Investigating the Relationship Between Alexithymia and Mobile Phone Addiction Along with the Mediating Role of Anxiety, Stress and Depression: A Path Analysis Study and Structural Model Testing
Authors: Pouriya Darabiyan, Hadis Nazari, Kourosh Zarea, Saeed Ghanbari, Zeinab Raiesifar, Morteza Khafaie, Hanna Tuvesson
Abstract:
Introduction: Alexithymia, depression, anxiety and stress have been reported as risk factors for mobile phone and Internet addiction since research on mobile phone addiction began, so this study investigated the relationship between alexithymia and mobile phone addiction along with the mediating role of anxiety, stress and depression. Materials and methods: In this descriptive-analytical, cross-sectional study conducted in 2022, 412 students of the School of Nursing & Midwifery of Ahvaz Jundishapur University of Medical Sciences were recruited using convenience sampling. Data collection tools were the Demographic Information Questionnaire, the Toronto Alexithymia Scale (TAS-20), the Depression, Anxiety, Stress Scale (DASS-21) and the Mobile Phone Addiction Index (MPAI). Frequencies, Pearson's correlation coefficient and linear regression were used to describe and analyze the data. Structural equation models and path analysis were used to investigate the direct and indirect effects, as well as the total effect, of each dimension of alexithymia on mobile phone addiction with stress, depression and anxiety as mediators. Statistical analysis was done with SPSS version 22 and Amos version 16. Results: Alexithymia was a predictive factor for mobile phone addiction and had a positive, significant effect on depression, anxiety and stress. Depression, anxiety and stress had a positive, significant effect on mobile phone addiction and played a partial mediating role between alexithymia and mobile phone addiction; through them, alexithymia also has an indirect effect on mobile phone addiction. 
Conclusion: Alexithymia is a predictive factor for mobile phone addiction, and depression, anxiety and stress act as partial mediators between alexithymia and mobile phone addiction.
Keywords: alexithymia, mobile phone, depression, anxiety, stress
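The simplest building block of the path analysis described above, a single-mediator model X -> M -> Y estimated by two regressions with the indirect effect as the product of coefficients, can be sketched on synthetic data. The coefficients and single mediator below are hypothetical; the actual study used SEM in Amos with three mediators.

```python
import numpy as np

rng = np.random.default_rng(42)
n = 412  # sample size matching the study
x = rng.normal(size=n)                      # predictor, e.g. alexithymia score
m = 0.5 * x + rng.normal(size=n)            # mediator, e.g. anxiety (assumed effect)
y = 0.4 * m + 0.3 * x + rng.normal(size=n)  # outcome, e.g. mobile phone addiction

def ols_slope(design, target):
    """OLS slope coefficients (intercept excluded) via least squares."""
    X = np.column_stack([np.ones(len(design)), design])
    beta, *_ = np.linalg.lstsq(X, target, rcond=None)
    return beta[1:]

a = ols_slope(x, m)[0]                               # path X -> M
b, c_prime = ols_slope(np.column_stack([m, x]), y)   # path M -> Y, direct X -> Y
print(f"indirect effect a*b = {a * b:.2f}, direct effect c' = {c_prime:.2f}")
```

The indirect effect a*b being nonzero while c' remains nonzero is what the abstract calls a relative (partial) mediation.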
Procedia PDF Downloads 99
727 Impact of Combined Heat and Power (CHP) Generation Technology on Distribution Network Development
Authors: Sreto Boljevic
Abstract:
In the absence of considerable investment in electricity generation, transmission and distribution network (DN) capacity, the demand for electrical energy will quickly strain the capacity of the existing electrical power network. The anticipated growth and proliferation of electric vehicles (EVs) and heat pumps (HPs) make it likely that the additional load from EV charging and HP operation will require capital investment in the DN. While area-wide implementation of EVs and HPs will contribute to the decarbonization of the energy system, they represent new challenges for the existing low-voltage (LV) network. Distributed energy resources (DER), operating both as part of the DN and in off-network mode, have been offered as a means to meet growing electricity demand while maintaining and ever improving DN reliability, resiliency and power quality. DN planning has traditionally been done by forecasting future growth in demand and estimating the peak load that the network should meet. However, new problems are arising, associated with the high proliferation of EVs and HPs as loads on the DN and, in addition, with the promotion of electricity generation from renewable energy sources (RES). High distributed generation (DG) penetration and a large increase in load at low-voltage DNs may have numerous impacts, creating issues that include energy losses, voltage control, fault levels, reliability, resiliency and power quality. To mitigate the negative impacts and at the same time enhance the positive ones in this new operational state of the DN, CHP system integration can be seen as the best action to postpone or reduce the capital investment needed to facilitate, and to maximize the benefits of, EV, HP and RES integration in the low-voltage DN. The aim of this paper is to derive an algorithm using an analytical approach. 
Implementing the algorithm provides a way to place CHP systems optimally in the DN so as to maximize the integration of RES and support the growing proliferation of EVs and HPs.
Keywords: combined heat & power (CHP), distribution networks, EVs, HPs, RES
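The shape of such a placement algorithm can be sketched with a toy search. Everything here is hypothetical (a 6-bus feeder, a load-weighted-distance cost); the paper's analytical approach would instead evaluate power-flow losses and voltage profiles per candidate placement.

```python
# Illustrative sketch, not the paper's algorithm: exhaustive search for the
# CHP placement that minimizes a network cost metric. The toy cost penalizes
# the electrical distance between the CHP bus and buses hosting new EV/HP load.

N_BUS = 6
distance = [[abs(i - j) for j in range(N_BUS)] for i in range(N_BUS)]
ev_hp_load = [0.0, 0.4, 0.9, 0.3, 1.2, 0.6]  # new EV/HP load per bus (MW, assumed)

def placement_cost(chp_bus):
    """Load-weighted distance from the CHP bus to every load bus."""
    return sum(load * distance[chp_bus][bus]
               for bus, load in enumerate(ev_hp_load))

best = min(range(N_BUS), key=placement_cost)
print(f"best bus: {best} (cost {placement_cost(best):.1f})")  # -> best bus: 4 (cost 3.9)
```

For a realistic feeder the exhaustive loop stays tractable because the candidate set is just the set of buses, even when the per-candidate evaluation is a full power-flow run.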
Procedia PDF Downloads 202
726 The Structural Behavior of Fiber Reinforced Lightweight Concrete Beams: An Analytical Approach
Authors: Jubee Varghese, Pouria Hafiz
Abstract:
Increased use of lightweight concrete in the construction industry is mainly due to the reduction in the weight of structural elements, which in turn reduces production and transportation costs and the overall project cost. However, the structural application of lightweight concrete is limited by its reduced density, and investigations are in progress to study the effect of fiber inclusion on improving its mechanical properties. Incorporating structural steel fibers generally enhances the performance of concrete and increases its durability by minimizing its potential for cracking and providing a crack-arresting mechanism. In this research, Geometric and Materially Non-linear Analysis (GMNA) was conducted for finite element modelling in the software ABAQUS to investigate the structural behavior of lightweight concrete with and without the addition of steel fibers and shear reinforcement. 21 finite element models of beams were created to study the effect of steel fibers based on three main parameters: fiber volume fraction (Vf = 0, 0.5 and 0.75%), shear span-to-depth ratio (a/d = 2, 3 and 4) and the ratio of the area of shear stirrups to their spacing (As/s = 0.7, 1 and 1.6). The models were validated against the experiment conducted by H.K. Kang et al. in 2011. It was seen that fiber-reinforced lightweight concrete can replace fiber-reinforced normal-weight concrete in structural elements, and that the effect of an increase in steel fiber volume fraction is more pronounced for beams with a higher shear span-to-depth ratio than for lower ratios. 
The effect of stirrups in the presence of fibers was very small; however, the stirrups provided extra confinement, reducing crack propagation, and extra shear resistance compared with beams without stirrups.
Keywords: ABAQUS, beams, fiber-reinforced concrete, finite element, lightweight, shear span-depth ratio, steel fibers, steel-fiber volume fraction
Procedia PDF Downloads 107
725 Optimization of Traffic Agent Allocation for Minimizing Bus Rapid Transit Cost on Simplified Jakarta Network
Authors: Gloria Patricia Manurung
Abstract:
Jakarta's Bus Rapid Transit (BRT) system, established in 2009 to reduce private vehicle usage and ease rush-hour gridlock throughout the Greater Jakarta area, has failed to achieve its purpose. With private vehicle ownership gradually increasing and road space reduced by BRT lane construction, private vehicle users invade the exclusive BRT lane, creating local traffic along the BRT network. The cost of invaded BRT lanes becomes the same as that of the general road network, making the BRT, which is supposed to be the city's main public transportation, unreliable. Efforts to guard critical lanes against invasion by allocating traffic agents at several intersections have improved congestion levels along the lane. Given a fixed number of traffic agents, this study uses an analytical approach to find the best deployment strategy for traffic agents on a simplified Jakarta road network, minimizing the BRT link cost, which is expected to improve the time reliability of the BRT system. A user-equilibrium traffic assignment model is used to reproduce origin-destination demand flows on the network, and the optimum solution can conventionally be obtained with a brute-force algorithm. The main constraint of that method is that traffic assignment simulation time escalates exponentially with the number of agents and the network size. Our proposed metaheuristic and heuristic algorithms exhibit only linear growth in simulation time and result in a minimized BRT cost approaching the brute-force optimum. Further analysis of the overall network link cost should be performed to see the impact of traffic agent deployment on the network system.
Keywords: traffic assignment, user equilibrium, greedy algorithm, optimization
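A greedy heuristic of the kind the keywords point to can be sketched as follows. The intersections, penalties and cost function are toy stand-ins; a real evaluation would run a full user-equilibrium traffic assignment per candidate placement.

```python
# Hedged sketch of a greedy agent-allocation heuristic: given a budget of K
# agents and a function estimating BRT link cost for a set of guarded
# intersections, repeatedly place the agent that yields the largest reduction.

# Hypothetical per-intersection invasion penalties on the BRT corridor.
PENALTY = {"A": 120.0, "B": 75.0, "C": 200.0, "D": 40.0, "E": 160.0}

def brt_cost(guarded):
    """Toy BRT link cost: base travel time plus penalties at unguarded nodes."""
    base = 300.0
    return base + sum(p for node, p in PENALTY.items() if node not in guarded)

def greedy_allocation(n_agents):
    guarded = set()
    for _ in range(n_agents):
        # Place the next agent where it reduces the cost the most.
        best = min(PENALTY.keys() - guarded,
                   key=lambda node: brt_cost(guarded | {node}))
        guarded.add(best)
    return guarded

print(sorted(greedy_allocation(2)))  # -> ['C', 'E'] (two largest penalties removed)
```

Unlike brute force, which evaluates every K-subset, the greedy loop evaluates only O(K * n) candidate sets, which is the linear simulation-time growth the abstract claims.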
Procedia PDF Downloads 229
724 Antibacterial Effects of Garcinia mangostana on Canine Superficial Pyoderma Pathogen, Staphylococcus pseudintermedius
Authors: Sineenat Kempubpha, Phornpa-Ngan Muadmuang, Putthamas Phetmuangprab, Surin Promphet, Sopita Bandit
Abstract:
Introduction: The discarded pericarp of mangosteen (Garcinia mangostana) is worth developing into veterinary phytopharmaceutical products, since it is made up of abundant pharmacologically active compounds. The active compounds of mangosteen pericarp act not only as antihistamine, anti-inflammatory, heart disease and HIV therapeutic substances but also as antibacterial and antifungal agents. Aim: This in vitro study aimed to determine the antibacterial effects of a 95% ethanol extract of mangosteen pericarp on the main causative pathogen of canine superficial pyoderma, Staphylococcus pseudintermedius. Methods: S. pseudintermedius isolates were collected from various skin sites of dogs with canine superficial pyoderma, revived and lawn-cultured. Growth inhibition of S. pseudintermedius was determined by the disc diffusion technique, with the mangosteen pericarp crude extract dissolved in three solvents (95% ethanol, 2% DMSO and distilled water). The micro broth dilution technique was used to determine the minimum inhibitory concentration (MIC) and minimum bactericidal concentration (MBC). Statistical analysis was done by calculating the mean zones of inhibition for the tested microorganisms. Results: The growth inhibition study showed that the inhibition efficacy in 95% ethanol was greater than in 2% DMSO or distilled water (9.10±0.18 mm, 6.95±0.60 mm and 6.80±0.18 mm, respectively). Conclusion: Mangosteen pericarp extract dissolved in 95% ethanol showed the largest zone of inhibition against the tested microorganism, and the MIC of 125 µg/ml and MBC of 1 mg/ml suggest potent antibacterial action against S. pseudintermedius. 
However, further analytical studies are needed to isolate the key molecules of mangosteen pericarp for more effective therapeutic products against canine superficial pyoderma microorganisms.
Keywords: mangosteen, Garcinia mangostana, Staphylococcus pseudintermedius, canine superficial pyoderma, in vitro study
Procedia PDF Downloads 280
723 Systematic Evaluation of Convolutional Neural Network on Land Cover Classification from Remotely Sensed Images
Authors: Eiman Kattan, Hong Wei
Abstract:
When using a Convolutional Neural Network (CNN) for classification, a set of hyperparameters is available for configuration. This study evaluates the impact of a range of parameters in a CNN architecture, i.e. AlexNet, on land cover classification based on four remotely sensed datasets. The evaluation tests the influence of a set of hyperparameters on classification performance: the number of epochs, the batch size, and the convolutional filter size relative to the input image size. A set of experiments was conducted to quantify the effect of the selected parameters under two implementation approaches, pretrained and fine-tuned. We first explore the number of epochs under several selected batch sizes (32, 64, 128 and 200). The impact of the kernel size of convolutional filters (1, 3, 5, 7, 10, 15, 20, 25 and 30) was then evaluated against the image sizes under test (64, 96, 128, 180 and 224), which gave insight into the relationship between convolutional filter size and image size. To generalize the validation, four publicly available remote sensing datasets with different land covers, AID, RSD, UCMerced and RSCCN, were used in the experiments. These datasets offer wide diversity in input data, such as the number of classes, the amount of labelled data, and texture patterns. A specifically designed interactive deep learning GPU training platform for image classification (Nvidia DIGITS) was employed in the experiments and showed efficiency in both training and testing. The results show that increasing the number of epochs leads to a higher accuracy rate, as expected; however, the convergence state is highly dataset-dependent. The batch size evaluation shows that a larger batch size slightly decreases classification accuracy compared to a small batch size. 
For example, a batch size of 32 on the RSCCN dataset achieves an accuracy of 90.34% at the 11th epoch, while reducing training to a single epoch makes the accuracy drop to 74%. At the other extreme, increasing the batch size to 200 decreases the accuracy at the 11th epoch to 86.5%, and to 63% when using only one epoch. On the other hand, the choice of kernel size is only loosely related to the dataset; from a practical point of view, a filter size of 20 produces 70.4286%. The final experiment, on image size, shows that accuracy improves with larger images, although at a considerable computational cost. These conclusions open opportunities toward better classification performance in various applications such as planetary remote sensing.
Keywords: CNNs, hyperparameters, remote sensing, land cover, land use
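The experimental design above is a grid sweep over four hyperparameters. A minimal sketch of the enumeration, with a placeholder in place of the actual CNN training run, looks like this (the placeholder function and the two epoch checkpoints are assumptions for illustration):

```python
from itertools import product

# Hyperparameter ranges taken from the abstract; epoch checkpoints assumed.
EPOCHS = [1, 11]
BATCH_SIZES = [32, 64, 128, 200]
KERNEL_SIZES = [1, 3, 5, 7, 10, 15, 20, 25, 30]
IMAGE_SIZES = [64, 96, 128, 180, 224]

def train_and_evaluate(epochs, batch_size, kernel_size, image_size):
    """Placeholder: a real run would train AlexNet and return validation accuracy."""
    return {"epochs": epochs, "batch_size": batch_size,
            "kernel_size": kernel_size, "image_size": image_size}

grid = [train_and_evaluate(*config)
        for config in product(EPOCHS, BATCH_SIZES, KERNEL_SIZES, IMAGE_SIZES)]
print(len(grid))  # 2 * 4 * 9 * 5 = 360 configurations per dataset
```

The multiplicative growth of the grid is why the study fixes most parameters while varying one or two at a time.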
Procedia PDF Downloads 169
722 Radiomics: Approach to Enable Early Diagnosis of Non-Specific Breast Nodules in Contrast-Enhanced Magnetic Resonance Imaging
Authors: N. D'Amico, E. Grossi, B. Colombo, F. Rigiroli, M. Buscema, D. Fazzini, G. Cornalba, S. Papa
Abstract:
Purpose: To characterize, through a radiomic approach, the nature of nodules considered non-specific by expert radiologists and recognized in magnetic resonance mammography (MRm) with T1-weighted (T1w) sequences with paramagnetic contrast. Material and Methods: 47 of 1200 cases undergoing MRm in which the assessment gave an uncertain classification (non-specific nodules) were admitted to the study. The clinical outcome of the non-specific nodules was later established through follow-up or further exams (biopsy): 35 were benign and 12 malignant. All MR images were acquired at 1.5 T, with a baseline T1w sequence followed by four T1w acquisitions after paramagnetic contrast injection. After manual segmentation of the lesions by a radiologist and the extraction of 150 radiomic features (30 features at each of 5 subsequent time points), a machine learning (ML) approach was used. An evolutionary algorithm (the TWIST system, based on the KNN algorithm) was used to subdivide the dataset into training and validation sets and to select the features yielding the maximal amount of information. After this pre-processing, different machine learning systems were applied to develop a predictive model based on a training-testing crossover procedure. 10 cases with a benign nodule (follow-up older than 5 years) and 18 with an evident malignant tumor (clearly malignant histological exam) were added to the dataset to allow the ML system to learn better from the data. Results: A Naive Bayes algorithm working on 79 features selected by the TWIST system was the best-performing ML system, with a sensitivity of 96%, a specificity of 78% and a global accuracy of 87% (average values of two training-testing procedures, ab-ba). In the subset of 47 non-specific nodules, the algorithm correctly predicted the outcome of 45 nodules that an expert radiologist could not classify. 
Conclusion: In this pilot study we identified a radiomic approach that allows ML systems to perform well in the diagnosis of non-specific nodules at MR mammography. The algorithm could be a great support for the early diagnosis of malignant breast tumors when the radiologist cannot identify the kind of lesion, reducing the need for long follow-up. Clinical Relevance: This machine learning algorithm could be essential in supporting the radiologist in the early diagnosis of non-specific nodules, avoiding strenuous follow-up and painful biopsy for the patient.
Keywords: breast, machine learning, MRI, radiomics
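The reported metrics can be illustrated by computing them from confusion-matrix counts. The counts below are hypothetical, chosen only so the formulas reproduce the abstract's figures (sensitivity 96%, specificity 78%, accuracy 87%); the study's actual confusion matrix is not given.

```python
def diagnostic_metrics(tp, fn, tn, fp):
    """Sensitivity, specificity and global accuracy from confusion counts."""
    sensitivity = tp / (tp + fn)          # true positive rate
    specificity = tn / (tn + fp)          # true negative rate
    accuracy = (tp + tn) / (tp + fn + tn + fp)
    return sensitivity, specificity, accuracy

# Hypothetical counts (balanced classes assumed) matching the reported values:
sens, spec, acc = diagnostic_metrics(tp=48, fn=2, tn=39, fp=11)
print(f"sensitivity={sens:.2f} specificity={spec:.2f} accuracy={acc:.2f}")
# -> sensitivity=0.96 specificity=0.78 accuracy=0.87
```

In a screening setting, sensitivity is weighted more heavily than specificity because a missed malignancy (false negative) is costlier than an unnecessary follow-up.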
Procedia PDF Downloads 267
721 A Geophysical Study for Delineating the Subsurface Minerals at El Qusier Area, Central Eastern Desert, Egypt
Authors: Ahmed Khalil, Elhamy Tarabees, Svetlana Kovacikova
Abstract:
The Red Sea Mountains have been famous for their ore deposits since ancient times, and petrographic analysis and previous potential field surveys indicated large unexplored accumulations of ore minerals in the area. The main goal of the present study is therefore to contribute to the discovery of hitherto unknown ore mineral deposits in the Red Sea region. To achieve this goal, we used two geophysical techniques: a land magnetic survey and magnetotelluric measurements. The high-resolution land magnetic survey was acquired using two proton magnetometers: one instrument served as a base station for the diurnal correction and the other measured the magnetic field across the study area. Two hundred eighty land magnetic stations were measured over a mesh-like grid with a 500 m spacing interval. The necessary reductions for daily variation, regional gradient and observation time were applied, and the total intensity anomaly map was constructed and reduced to the magnetic pole (RTP). The magnetic interpretation was carried out using the analytical signal, and regional-residual separation was performed using the power spectrum. The tilt derivative (TDR) technique was also applied to delineate structures and hidden anomalies, and the data analysis was complemented by trend analysis and Euler deconvolution. The results indicate that magnetic contacts are not the dominant geological feature of the study area. The magnetotelluric survey consisted of two profiles with a total of 8 broadband measurement points, each recorded for about 24 hours, crossing Wadi Um Gheig approximately 50 km south of El Quseir. The collected data were inverted to an electrical resistivity model using the modular 3D inversion code ModEM. 
The model revealed a non-conductive body in its central part, probably corresponding to a dolerite dyke, to which possible ore mineralization could be related.
Keywords: magnetic survey, magnetotelluric, mineralization, 3D modeling
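The tilt derivative mentioned above is the arctangent of the vertical derivative of the anomaly over its total horizontal derivative. A minimal sketch on a synthetic grid follows; the anomaly, grid and the wavenumber-domain approximation of the vertical derivative are illustrative, not the survey data.

```python
import numpy as np

def tilt_derivative(field, dx=500.0):
    """TDR = arctan(dT/dz / sqrt((dT/dx)^2 + (dT/dy)^2)) on a regular grid."""
    dT_dy, dT_dx = np.gradient(field, dx)              # horizontal derivatives
    ky = np.fft.fftfreq(field.shape[0], dx) * 2 * np.pi
    kx = np.fft.fftfreq(field.shape[1], dx) * 2 * np.pi
    k = np.sqrt(kx[None, :] ** 2 + ky[:, None] ** 2)
    # Vertical derivative via multiplication by |k| in the wavenumber domain,
    # a standard potential-field result.
    dT_dz = np.real(np.fft.ifft2(np.fft.fft2(field) * k))
    h = np.hypot(dT_dx, dT_dy)
    return np.arctan2(dT_dz, h)

# Synthetic anomaly on a 500 m grid, mirroring the survey's station spacing:
y, x = np.mgrid[-20:20, -20:20] * 500.0
anomaly = 100.0 * np.exp(-((x - 2000.0) ** 2 + y ** 2) / 3000.0 ** 2)
tdr = tilt_derivative(anomaly)
# TDR is bounded in [-pi/2, pi/2], positive over the source, ~0 near its edges.
print(bool(np.all(np.abs(tdr) <= np.pi / 2)))  # True
```

The bounded range is what makes the TDR useful for delineating hidden anomalies: shallow and deep sources produce comparably scaled edges.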
Procedia PDF Downloads 27
720 Optimizing Oil Production through 30-Inch Pipeline in Abu-Attifel Field
Authors: Ahmed Belgasem, Walid Ben Hussin, Emad Krekshi, Jamal Hashad
Abstract:
Waxy crude oil, characterized by its high paraffin wax content, poses significant challenges in the oil & gas industry due to its increased viscosity and semi-solid state at reduced temperatures. The wax formation process, which includes precipitation, crystallization, and deposition, becomes problematic when the crude oil temperature falls below the wax appearance temperature (WAT), or cloud point. Addressing these issues, this paper introduces a technical solution designed to mitigate wax appearance and enhance the oil production process in the Abu-Attifel Field via a 30-inch crude oil pipeline. A comprehensive flow assurance study validates the feasibility and performance of this solution across various production rates, temperatures, and operational scenarios; it includes crude oil analysis to determine the WAT, as well as the evaluation and comparison of operating options for the heating stations. The study's findings indicate that maintaining the crude oil's temperature above a minimum threshold of 63°C is achievable through the strategic placement of two heating stations along the pipeline route. This approach effectively prevents wax deposition, gelling, and the subsequent mobility complications, thereby bolstering the overall efficiency, reliability, safety, and economic viability of the production process. Moreover, it significantly curtails the environmental repercussions traditionally associated with wax deposition, which can accumulate up to 7,500 kg. 
By maintaining the crude oil's temperature above the specified threshold, the solution improves the overall efficiency, reliability, safety, and economic viability of the oil production process, and it contributes to reducing the environmental repercussions associated with wax deposition. In conclusion, the paper presents a technical solution that optimizes oil production in the Abu-Attifel Field by addressing wax formation through the strategic placement of two heating stations, preventing wax deposition, improving overall operational efficiency and contributing to environmental sustainability. Further research is suggested to validate the solution against field data and to explore a cost-benefit analysis.
Keywords: oil production, wax deposition, solar cells, heating stations
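The heating-station spacing problem behind this solution can be sketched with the classical steady-state pipeline cooling relation (Shukhov), T(x) = T_amb + (T_in - T_amb) * exp(-U*pi*D*x / (m_dot*c_p)). All parameter values below except the 63°C threshold and the 30-inch diameter are assumptions for illustration, not figures from the paper.

```python
import math

T_AMB = 25.0    # ambient/soil temperature, degC (assumed)
T_IN = 90.0     # temperature at pipeline inlet / heater outlet, degC (assumed)
T_MIN = 63.0    # minimum allowed crude temperature, degC (from the abstract)
D = 0.762       # 30-inch pipeline diameter, m
U = 2.0         # overall heat-transfer coefficient, W/(m2.K) (assumed)
M_DOT = 250.0   # mass flow rate, kg/s (assumed)
CP = 2100.0     # crude specific heat, J/(kg.K) (assumed)

def temperature(x_m):
    """Crude temperature (degC) a distance x_m metres downstream of a heater."""
    return T_AMB + (T_IN - T_AMB) * math.exp(-U * math.pi * D * x_m / (M_DOT * CP))

# Maximum spacing between heaters before the crude cools to T_MIN:
max_spacing = (-(M_DOT * CP) / (U * math.pi * D)
               * math.log((T_MIN - T_AMB) / (T_IN - T_AMB)))
print(f"max heater spacing: {max_spacing / 1000:.0f} km")
```

Dividing the pipeline length by this maximum spacing is the first-order estimate of how many heating stations are needed; the flow assurance study then refines the placement against the WAT and the operating scenarios.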
Procedia PDF Downloads 737
719 Corruption, Institutional Quality and Economic Growth in Nigeria
Authors: Ogunlana Olarewaju Fatai, Kelani Fatai Adeshina
Abstract:
The interplay of corruption and institutional quality determines how effectively and efficiently an economy progresses, and sound institutional quality is a key requirement for economic stability. Institutional quality is often used interchangeably with governance, which has allowed governance indicators to serve as proxies for institutional quality. Defective, poorly tailored institutions penalize economic growth and breed corruption. Corruption is a hydra-headed phenomenon that manifests in different forms. Its most celebrated definition is "the use or abuse of public office for private benefits or gains"; it also denotes an arrangement between two parties over the determination and allocation of state resources for pecuniary benefit, circumventing state efficiency. This study employed a Barro (1990)-type augmented growth model to analyze the nexus among corruption, institutional quality, and economic growth in Nigeria, using annual time-series data spanning 1996-2019. Within the analytical framework of the Johansen cointegration technique, the Error Correction Mechanism (ECM), and Granger causality tests, the findings revealed a long-run relationship between economic growth, corruption, and the selected measures of institutional quality. The long-run results suggested that all measures of institutional quality except voice & accountability and regulatory quality are positively disposed to economic growth. The short-run estimates reconciled the divergent "sand the wheels" and "grease the wheels" views of corruption's effect on growth. In addition, regulatory quality and the rule of law indicated a negative influence on economic growth in Nigeria, whereas government effectiveness and voice & accountability indicated a positive influence.
The Granger causality test results suggested one-way causality between GDP and corruption, and also between corruption and institutional quality. The policy implications of this study point to checking corruption and streamlining the institutional quality framework for better and sustained economic development.
Keywords: institutional quality, corruption, economic growth, public policy
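The Granger causality logic used in the study can be sketched in a few lines of NumPy: regress the current value of a series on its own lag (restricted model), then add the lag of the other series (unrestricted model) and form an F statistic from the drop in residual sum of squares. The data below are synthetic, generated so that x drives y but not the reverse; they are not the authors' Nigerian series, and a one-lag specification is an illustrative simplification.

```python
import numpy as np

def ols_rss(y, X):
    """Residual sum of squares from an OLS fit of y on X."""
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta
    return float(resid @ resid)

def granger_f(x, y, lag=1):
    """F statistic for the null 'x does not Granger-cause y' with one lag."""
    y_t, y_lag, x_lag = y[lag:], y[:-lag], x[:-lag]
    n = len(y_t)
    const = np.ones(n)
    rss_r = ols_rss(y_t, np.column_stack([const, y_lag]))          # restricted
    rss_u = ols_rss(y_t, np.column_stack([const, y_lag, x_lag]))   # unrestricted
    q, k = 1, 3   # one restriction; three parameters in the unrestricted model
    return ((rss_r - rss_u) / q) / (rss_u / (n - k))

# Synthetic illustration: x Granger-causes y, but not the other way round.
rng = np.random.default_rng(0)
n = 500
x, y = np.zeros(n), np.zeros(n)
for t in range(1, n):
    x[t] = 0.5 * x[t - 1] + rng.normal()
    y[t] = 0.5 * y[t - 1] + 0.8 * x[t - 1] + rng.normal()
```

A large F for the x-to-y direction and a small F for y-to-x is the one-way pattern the abstract reports between corruption and growth.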
Procedia PDF Downloads 170
718 Evaluation of the Anti-Ulcer Activity of Ethyl Acetate Fraction of Methanol Leaf Extract of Clerodendrum Capitatum
Authors: M. N. Ofokansi, Onyemelukwe Chisom, Amauche Chukwuemeka, Ezema Onyinye
Abstract:
The leaves of Clerodendrum capitatum (Lamiaceae) are widely used in Nigerian folk medicine to treat gastric ulcer. The aim of this study was to evaluate the anti-ulcer activity of the crude methanol leaf extract and its ethyl acetate fraction in white albino rats. The effect of the crude methanol leaf extract and its ethyl acetate fraction (250 mg/kg and 500 mg/kg) was evaluated using an absolute ethanol-induced ulcer model; the extract and the fraction were prepared in distilled water and 6% Tween 80, respectively. The crude methanol leaf extract was further investigated using a pylorus ligation-induced ulcer model, with omeprazole as the standard treatment. Four groups of five albino rats of either sex were used. Mean ulcer index and percentage ulcer protection were assessed in the ethanol-induced model, while gastric volume, pH, and total acidity were assessed in the pylorus ligation model. The crude methanol leaf extract (500 mg/kg) produced a very highly significant reduction in mean ulcer index (p < 0.001) in the absolute ethanol-induced model. The ethyl acetate fraction (250 mg/kg and 500 mg/kg) likewise produced very highly significant reductions in mean ulcer index (p < 0.001) in the same model: 1.6 and 2.2, respectively, corresponding to ulcer protection of 82.85% and 76.42% relative to the control group. In the pylorus ligation-induced model, animals treated with the crude methanol leaf extract (250 mg/kg and 500 mg/kg) showed a highly significant dose-dependent reduction in mean ulcer index (p < 0.01), with ulcer protection of 56.77% and 63.22%, respectively.
Gastric parameters such as volume of gastric juice, pH, and total acidity did not differ significantly between the extract-treated groups and the control group. Phytochemical investigation showed that the crude methanol leaf extract contains saponins and flavonoids, while the ethyl acetate fraction contains only flavonoids. The results indicate that the crude methanol leaf extract and its ethyl acetate fraction are effective, with gastroprotective and ulcer-healing capacity, and that the ethyl acetate fraction is more potent than the crude extract against ethanol-induced ulceration. These results provide scientific validation for the plant's folkloric use in the treatment of gastric ulcer.
Keywords: gastroprotective, herbal medicine, anti-ulcer, pharmacology
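The percentage ulcer protection figures quoted above follow the standard formula (control index minus treated index, over control index). The abstract does not report the control group's mean ulcer index directly; the value of about 9.33 used below is back-computed from the two reported pairs (1.6 → 82.85% and 2.2 → 76.42%), which are mutually consistent, and is an inference rather than a stated result.

```python
def ulcer_protection(control_index, treated_index):
    """Percentage ulcer protection relative to the control group:
    (control - treated) / control * 100."""
    return (control_index - treated_index) / control_index * 100.0

# Control mean ulcer index of ~9.33 is back-computed from the abstract's
# reported pairs, not stated in the abstract itself.
control = 9.33
print(round(ulcer_protection(control, 1.6), 2))  # → 82.85
print(round(ulcer_protection(control, 2.2), 2))  # → 76.42
```

Both treated doses reproduce the reported protection percentages to two decimals, which is a quick sanity check on the abstract's numbers.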
Procedia PDF Downloads 162
717 A Perspective on Teaching Mathematical Concepts to Freshman Economics Students Using 3D-Visualisations
Authors: Muhammad Saqib Manzoor, Camille Dickson-Deane, Prashan Karunaratne
Abstract:
The Cobb-Douglas production (utility) function is a fundamental function widely used in economics teaching and research, chiefly because of its ability to describe actual production in terms of inputs such as labour and capital. Its characteristics, such as returns to scale and marginal and diminishing marginal productivities, are covered in the introductory units of both microeconomics and macroeconomics with a 2-dimensional static visualisation of the function. However, less insight is provided into the three-dimensional surface, the changes in curvature due to returns to scale, the linkage of the short-run production function with its long-run counterpart and marginal productivities, the level curves, and constrained optimisation. Since freshman learners have diverse prior knowledge and cognitive skills, the existing "one size fits all" approach is not very helpful. The aim of this study is to bridge this gap through a technological intervention: interactive animations of the three-dimensional surface, with sequential unveiling of the characteristics mentioned above, built in Python. A small classroom intervention has helped students enhance their analytical and visualisation skills towards active and authentic learning of this topic. To authenticate the strength of the approach, a quasi-Delphi study will ask domain-specific experts, "What value does a 2-dimensional static visualisation add to the learning process in economics, compared with a 3-dimensional dynamic visualisation?" Three perspectives on the intervention were reviewed by a panel comprising novice students, experienced students, novice instructors, and experienced instructors, in an effort to determine the learnings from each type of visualisation within a specific domain of knowledge.
The value of this approach is key to suggesting different pedagogical methods which can enhance learning outcomes.
Keywords: cobb-douglas production function, quasi-Delphi method, effective teaching and learning, 3D-visualisations
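The surface data behind the 3D visualisations described above can be generated in a few lines. This is a minimal static sketch, not the authors' interactive animation code: it builds the Cobb-Douglas surface Q = A·K^α·L^β on a grid and numerically checks two of the characteristics the abstract names, constant returns to scale (here α + β = 1 by assumption) and positive but diminishing marginal productivity of capital. Plotting (e.g. a matplotlib 3D surface of K, L, Q) is omitted.

```python
import numpy as np

def cobb_douglas(K, L, A=1.0, alpha=0.3, beta=0.7):
    """Q = A * K**alpha * L**beta; with alpha + beta = 1 the function
    exhibits constant returns to scale."""
    return A * K**alpha * L**beta

# Grid for the 3D surface (the data a surface plot would render).
K, L = np.meshgrid(np.linspace(0.1, 10, 50), np.linspace(0.1, 10, 50))
Q = cobb_douglas(K, L)

# Short-run slice: hold L fixed and vary K; the marginal product of
# capital (numerical dQ/dK) should be positive but diminishing.
k = np.linspace(0.1, 10, 200)
mpk = np.gradient(cobb_douglas(k, 5.0), k)
```

Doubling both inputs doubles output on the whole grid (constant returns to scale), and `mpk` is everywhere positive yet strictly decreasing, which is exactly the curvature behaviour the 2D textbook diagrams flatten away.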
Procedia PDF Downloads 145
716 Identification of Deposition Sequences of the Organic Content of Lower Albian-Cenomanian Age in Northern Tunisia: Correlation between Molecular and Stratigraphic Fossils
Authors: Tahani Hallek, Dhaou Akrout, Riadh Ahmadi, Mabrouk Montacer
Abstract:
The present work is an organic geochemical study of the Fahdene Formation outcrops in the Mahjouba region, part of the eastern Kalaat Senan structure in northwestern Tunisia (the Kef-Tedjerouine area). Analysis of the organic content of the collected samples shows that the formation has average-to-good oil potential. This fossilized organic matter has a mixed origin (types II and III), as indicated by the relatively high hydrogen index values. This origin is confirmed by the abundance of C29 steranes and by the tricyclic terpane C19/(C19+C23) and tetracyclic terpane C24/(C24+C23) ratios, which suggest a marine depositional environment with a contribution from higher plants. We have demonstrated that this heterogeneity of the organic matter, between the marine character confirmed by the presence of foraminifera and the continental contribution, results from an episodic anomaly in the sequence stratigraphy. Given that the study area is an outer platform forming a transition zone between a stable continental domain to the south and a deep basin to the north, we attribute the continental contribution to successive forced regressions that interrupted the Albian transgression and allowed the installation of lowstand systems tracts. This aspect is represented by incised-valley fills in direct contact with pelagic, deep-sea facies. Consequently, the Fahdene Formation in the Kef-Tedjerouine area consists of transgressive systems tracts (TST) abruptly truncated by episodes of continental progradation, resulting in mixed-influence deposition that retained heterogeneous organic material.
Keywords: molecular geochemistry, biomarkers, forced regression, deposit environment, mixed origin, Northern Tunisia
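The two terpane ratios named in the abstract are simple peak-area ratios from the chromatogram. The sketch below computes them from hypothetical peak areas (illustrative only, not the study's data); the reading of a higher C19/(C19+C23) value as stronger higher-plant (terrigenous) input is the general biomarker convention the abstract relies on, and no interpretive cut-off values from the study are implied here.

```python
def tricyclic_ratio(c19, c23):
    """C19/(C19 + C23) tricyclic terpane ratio; higher values are
    conventionally read as stronger higher-plant (terrigenous) input."""
    return c19 / (c19 + c23)

def tetracyclic_ratio(c24, c23):
    """C24/(C24 + C23) tetracyclic terpane ratio, used alongside the
    tricyclic ratio to gauge terrigenous versus marine contribution."""
    return c24 / (c24 + c23)

# Hypothetical chromatogram peak areas -- illustrative only, not study data.
peaks = {"C19": 420.0, "C23": 610.0, "C24": 350.0}
r_tri = tricyclic_ratio(peaks["C19"], peaks["C23"])
r_tet = tetracyclic_ratio(peaks["C24"], peaks["C23"])
```

Both ratios are bounded between 0 and 1 by construction, which makes them convenient for comparing samples from the marine-dominated and continentally influenced intervals.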
Procedia PDF Downloads 249