Search results for: modeling technique
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 10195

7585 Raman Tweezers Spectroscopy Study of Size Dependent Silver Nanoparticles Toxicity on Erythrocytes

Authors: Surekha Barkur, Aseefhali Bankapur, Santhosh Chidangil

Abstract:

The Raman Tweezers technique has become prevalent in single-cell studies. It combines Raman spectroscopy, which gives information about molecular vibrations, with optical tweezers, which use a tightly focused laser beam to trap single cells. Raman Tweezers has thus enabled researchers to analyze single cells and explore different applications, including studying blood cells, monitoring blood-related disorders, and probing silver nanoparticle-induced stress. Interest in the toxic effects of nanoparticles has grown alongside their expanding range of applications, and the interaction of these nanoparticles with cells may vary with their size. We have studied the effect of silver nanoparticles of sizes 10 nm, 40 nm, and 100 nm on erythrocytes using the Raman Tweezers technique, with the aim of investigating the size dependence of the nanoparticle effect on RBCs. A 785 nm laser (Starbright Diode Laser, Torsana Laser Tech, Denmark) was used for both trapping and Raman spectroscopic studies. A 100x oil-immersion objective with high numerical aperture (NA 1.3) focused the laser beam into the sample cell. The back-scattered light was collected using the same microscope objective and focused into the spectrometer (Horiba Jobin Yvon iHR320 with a 1200 grooves/mm grating blazed at 750 nm). A liquid-nitrogen-cooled CCD (Symphony CCD-1024x256-OPEN-1LS) was used for signal detection. Blood was drawn from healthy volunteers in vacutainer tubes and centrifuged to separate the blood components. A 1.5 ml aliquot of silver nanoparticle suspension was washed twice with distilled water, leaving 0.1 ml of silver nanoparticles at the bottom of the vial. Since the stock concentration was 0.02 mg/ml, 0.03 mg of nanoparticles was present in the 0.1 ml obtained. Then 25 ul of RBCs was diluted in 2 ml of PBS solution, treated with 50 ul (0.015 mg) of nanoparticles, and incubated in a CO2 incubator.
Raman spectroscopic measurements were made after 24 hours and 48 hours of incubation. All spectra were recorded with 10 mW laser power (785 nm diode laser), 60 s accumulation time, and 2 accumulations. Major changes were observed in the peaks at 565 cm-1, 1211 cm-1, 1224 cm-1, 1371 cm-1, and 1638 cm-1. A decrease in intensity at 565 cm-1, an increase at 1211 cm-1 with a reduction at 1224 cm-1, an increase in intensity at 1371 cm-1, and the disappearance of the peak at 1635 cm-1 indicate deoxygenation of hemoglobin. The larger nanoparticles produced the greatest spectral changes, while the smallest changes were observed in the spectra of erythrocytes treated with 10 nm nanoparticles.
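The dosing arithmetic in the protocol above can be checked in a few lines; the sketch below uses only the volumes and concentrations quoted in the abstract (variable names are ours):

```python
# Nanoparticle stock: 1.5 ml at 0.02 mg/ml, concentrated down to a 0.1 ml pellet
stock_volume_ml = 1.5
stock_conc_mg_per_ml = 0.02
total_mass_mg = stock_volume_ml * stock_conc_mg_per_ml  # 0.03 mg in the pellet

pellet_volume_ml = 0.1
pellet_conc_mg_per_ml = total_mass_mg / pellet_volume_ml  # 0.3 mg/ml

# 50 ul of the pellet is added to the diluted RBC suspension
dose_volume_ml = 0.050
dose_mass_mg = dose_volume_ml * pellet_conc_mg_per_ml  # 0.015 mg, as stated
```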

Keywords: erythrocytes, nanoparticle-induced toxicity, Raman tweezers, silver nanoparticles

Procedia PDF Downloads 297
7584 Determination of Optimum Parameters for Thermal Stress Distribution in Composite Plate Containing a Triangular Cutout by Optimization Method

Authors: Mohammad Hossein Bayati Chaleshtari, Hadi Khoramishad

Abstract:

Minimizing the stress concentration around a triangular cutout in infinite perforated plates subjected to a uniform heat flux, which induces thermal stresses, is an important consideration in engineering design. Furthermore, understanding the parameters that affect stress concentration, and selecting them properly, enables the designer to achieve a reliable design. In thermal stress analysis, the parameters affecting the stress distribution around the cutout in orthotropic materials include the fiber angle, flux angle, bluntness, and rotation angle of the cutout. This paper examines the effect of these parameters on the thermal stress analysis of infinite perforated plates with a central triangular cutout. The least thermal stress around the triangular cutout was sought using a novel swarm-intelligence optimization technique, the dragonfly optimizer, inspired by the swarming and hunting behavior of dragonflies in nature. In this study, using two-dimensional thermoelastic theory and Likhnitskii's complex variable technique, the stress analysis of an orthotropic infinite plate with a circular cutout under a uniform heat flux was extended to a plate containing a quasi-triangular cutout in the thermal steady state. To this end, a conformal mapping function was used to map an infinite plate containing a quasi-triangular cutout onto the outside of a unit circle. The plate is under uniform heat flux at infinity; Neumann boundary conditions and a thermally insulated condition at the edge of the cutout were considered.
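The conformal mapping step can be illustrated numerically. A standard mapping of the exterior of the unit circle onto the exterior of a quasi-triangular cutout is z = R(zeta + c/zeta^2), where c controls the bluntness (c = 0 gives a circle, larger c gives sharper corners). The sketch below, with assumed values of R and c, traces the mapped boundary and checks its three-fold symmetry; it is a generic mapping of the type described, not necessarily the paper's exact function.

```python
import numpy as np

R, c = 1.0, 0.35           # assumed scale and bluntness parameters
theta = np.linspace(0, 2 * np.pi, 361)
zeta = np.exp(1j * theta)  # unit circle in the mapped plane

# Exterior of the unit circle -> exterior of a quasi-triangular boundary
z = R * (zeta + c / zeta**2)

# Three-fold symmetry: rotating theta by 120 deg rotates z by 120 deg
zeta_rot = np.exp(1j * (theta + 2 * np.pi / 3))
z_rot = R * (zeta_rot + c / zeta_rot**2)
symmetry_error = np.max(np.abs(z_rot - np.exp(2j * np.pi / 3) * z))
```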

Keywords: infinite perforated plate, complex variable method, thermal stress, optimization method

Procedia PDF Downloads 155
7583 Enhancing Traditional Saudi Designs Pattern Cutting to Integrate Them Into Current Clothing Offers

Authors: Faizah Almalki, Simeon Gill, Steve G. Hayes, Lisa Taylor

Abstract:

A core element of cultural identity is the traditional costume, which provides insight into heritage acquired over time. This heritage is apparent in the use of colour and in the styles and functions of the clothing, and it also reflects the skills of those who created the items and the time taken to produce them. Modern flat pattern drafting methods for making garment patterns are simple in comparison with the relatively laborious traditional approaches, which required personal interaction with the wearer throughout the production process. The current study reflects on the main elements of the pattern cutting system and how it has evolved in Saudi Arabia to affect the design of the Sawan garment. The traditional methods for constructing Sawan garments were analysed through observation of the practice and the garments and by consulting documented guidance. This provided a foundation from which to explore how modern technology can be applied to improve the process. In this research, modern methods are proposed for producing traditional Saudi garments more efficiently while retaining elements of the conventional style and design. The study documents the vital aspects of the Sawan garment style. The results showed that the method used to take body measurements and make patterns was elementary, yielding simple geometric shapes, and that the Sawan garment is composed of four pieces. Consequently, this research allows classical pattern shapes to be embedded in garments now worn in Saudi Arabia and supports the continuation of cultural heritage.

Keywords: traditional Sawan garment technique, modern pattern cutting technique, the shape of the garment and software, Lectra Modaris

Procedia PDF Downloads 137
7582 Measurement and Analysis of Human Hand Kinematics

Authors: Tamara Grujic, Mirjana Bonkovic

Abstract:

Measurements and quantitative analysis of the kinematic parameters of human hand movements play an important role in areas such as hand function rehabilitation, the modeling of multi-digit robotic hands, and the development of man-machine interfaces. In this paper, the assessment and evaluation of the reach-to-grasp movement using a computerized, robot-assisted method is described. The experiment involved measuring the hand positions of seven healthy subjects while they grasped three objects of different shapes and sizes. The results showed that three dominant phases of the reach-to-grasp movement could be clearly identified.
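Reach-to-grasp movements are commonly segmented from the hand or wrist speed profile (an acceleration phase up to peak velocity, a deceleration phase, and the final grasp). The abstract does not give its segmentation algorithm, so the sketch below is only an illustrative threshold-based segmentation on a synthetic bell-shaped speed profile:

```python
import numpy as np

def segment_reach(speed, threshold_ratio=0.1):
    """Split a reach speed profile into phases using a fraction of peak
    speed as the onset/offset threshold. Returns (onset, peak, offset)."""
    peak_idx = int(np.argmax(speed))
    thresh = threshold_ratio * speed[peak_idx]
    onset = int(np.argmax(speed > thresh))           # first sample above threshold
    after_peak = speed[peak_idx:]
    offset = peak_idx + int(np.argmax(after_peak < thresh))  # back below it
    return onset, peak_idx, offset

# Synthetic bell-shaped speed profile, typical of point-to-point reaching
t = np.linspace(0, 1, 200)
speed = np.exp(-((t - 0.5) ** 2) / (2 * 0.1 ** 2))
onset, peak, offset = segment_reach(speed)
```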

Keywords: human hand, kinematics, measurement and analysis, reach-to-grasp movement

Procedia PDF Downloads 468
7581 Digital Holographic Interferometric Microscopy for the Testing of Micro-Optics

Authors: Varun Kumar, Chandra Shakher

Abstract:

Micro-optical components such as microlenses and microlens arrays have numerous engineering and industrial applications: collimation of laser diodes, imaging devices for sensor systems (CCD/CMOS, document copier machines, etc.), beam homogenization for high-power lasers, the critical component in Shack-Hartmann sensors, and fiber-optic coupling and optical switching in communication technology. Micro-optical components have also become an alternative for applications where miniaturization and reduction of alignment and packaging costs are necessary. Compliance with high quality standards in the manufacturing of micro-optical components is a precondition for competitiveness in worldwide markets, so high demands are placed on quality assurance. An economical measurement technique is therefore needed for these lenses. For cost and time reasons, the technique should be fast, simple (for production reasons), and robust, with high resolution. It should provide non-contact, non-invasive, full-field information about the shape of the micro-optical component under test. Interferometric techniques are non-contact and non-invasive and provide full-field information about the shape of optical components. Conventional interferometric techniques such as holographic interferometry or Mach-Zehnder interferometry are available for the characterization of microlenses; however, they require more experimental effort and are time-consuming. Digital holography (DH) overcomes these problems. Digital holographic microscopy (DHM) allows one to extract both the amplitude and the phase of a wavefront transmitted through a transparent object (a microlens or microlens array) from a single recorded digital hologram using numerical methods. One can also reconstruct the complex object wavefront at different depths thanks to numerical reconstruction.
Digital holography provides axial resolution in the nanometer range, while lateral resolution is limited by diffraction and the size of the sensor. In this paper, a Mach-Zehnder-based digital holographic interferometric microscope (DHIM) is used for the testing of transparent microlenses. The advantage of the DHIM is that distortions due to aberrations in the optical system are avoided by interferometric comparison of the reconstructed phase with and without the object (the microlens array). In the experiment, a first digital hologram is recorded in the absence of the sample (microlens array) as a reference hologram, and a second hologram is recorded in the presence of the microlens array. The transparent microlens array induces a phase change in the transmitted laser light. The complex amplitude of the object wavefront in the presence and absence of the microlens array is reconstructed using the Fresnel reconstruction method, from which the phase of the object wave in each state can be evaluated. The phase difference between the two states of the object wave provides the optical path length change due to the shape of the microlens. Knowing the refractive indices of the microlens array material and of air, the surface profile of the microlens array is evaluated. The sag and radius of curvature of the microlenses are evaluated and reported; the sag agrees, within experimental limits, with the manufacturer's specification.
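The phase-to-profile conversion described above follows from the optical path difference: an unwrapped phase difference of delta_phi corresponds to a local thickness (sag) of delta_phi * lambda / (2*pi*(n_lens - n_medium)). A minimal sketch, with assumed values (632.8 nm wavelength, refractive index 1.5; neither is taken from the abstract):

```python
import numpy as np

def sag_from_phase(delta_phi, wavelength_m, n_lens, n_medium=1.0):
    """Convert an unwrapped phase difference (radians) between the two
    reconstructed states into local thickness (sag) of a transparent lens."""
    return delta_phi * wavelength_m / (2 * np.pi * (n_lens - n_medium))

# Example: a full 2*pi phase step at 632.8 nm for an assumed index of 1.5
# gives one wavelength of optical path difference divided by (n - 1).
sag = sag_from_phase(2 * np.pi, 632.8e-9, 1.5)
```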

Keywords: micro-optics, microlens array, phase map, digital holographic interferometric microscopy

Procedia PDF Downloads 505
7580 Development of Medical Intelligent Process Model Using Ontology Based Technique

Authors: Emmanuel Chibuogu Asogwa, Tochukwu Sunday Belonwu

Abstract:

An urgent demand for creative solutions has been created by the rapid expansion of medical knowledge, the complexity of patient care, and the requirement for more precise decision-making. The creation of a Medical Intelligent Process Model (MIPM) utilizing ontology-based techniques appears to be a promising way to overcome this obstacle and unleash the full potential of healthcare systems. The development of the MIPM is motivated by the lack of quick access to relevant medical information and of advanced tools for treatment planning and clinical decision-making, both of which ontology-based techniques can provide. The aim of this work is to develop a structured, knowledge-driven framework that leverages an ontology, a formal representation of domain knowledge, to enhance various aspects of healthcare. The Object-Oriented Analysis and Design Methodology (OOADM) was adopted in the design of the system, as we desired to build a usable and evolvable application. The medical dataset used to test our model was obtained from Kaggle. The ontology-based technique was used together with a confusion matrix, MySQL, Python, Hypertext Markup Language (HTML), Hypertext Preprocessor (PHP), Cascading Style Sheets (CSS), JavaScript, Dreamweaver, and Fireworks. According to test results evaluated with the confusion matrix, both the accuracy and the overall effectiveness of the medical intelligent process improved significantly, by 20%, compared with the previous system. The model is therefore recommended for healthcare professionals.
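The abstract evaluates its model with a confusion matrix; as a reminder of how accuracy is derived from one, here is a minimal sketch (the counts are illustrative, not the paper's data):

```python
def accuracy_from_confusion(matrix):
    """Accuracy = sum of the diagonal (correct predictions) / all predictions."""
    correct = sum(matrix[i][i] for i in range(len(matrix)))
    total = sum(sum(row) for row in matrix)
    return correct / total

# Illustrative binary confusion matrix: rows = actual, columns = predicted
cm = [[85, 15],   # actual negative: 85 true negatives, 15 false positives
      [10, 90]]   # actual positive: 10 false negatives, 90 true positives
acc = accuracy_from_confusion(cm)  # (85 + 90) / 200 = 0.875
```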

Keywords: ontology-based, model, database, OOADM, healthcare

Procedia PDF Downloads 82
7579 Kou Jump Diffusion Model: An Application to the S&P 500, Nasdaq 100, and Russell 2000 Index Options

Authors: Wajih Abbassi, Zouhaier Ben Khelifa

Abstract:

The present research aims at the empirical validation of three option valuation models: the ad-hoc Black-Scholes model as proposed by Berkowitz (2001), the constant elasticity of variance model of Cox and Ross (1976), and the Kou jump-diffusion model (2002). Our empirical analysis was conducted on a sample of 26,974 options written on three indexes, the S&P 500, the Nasdaq 100, and the Russell 2000, traded during 2007, just before the sub-prime crisis. We first present the theoretical foundations of the models of interest. We then use a trust-region-reflective algorithm to estimate the structural parameters of these models from a cross-section of option prices. The empirical analysis shows the superiority of the Kou jump-diffusion model, which arises from its ability to portray the behavior of market participants and to come closest to the true distribution characterizing the evolution of these indices. Indeed, the double-exponential distribution has three interesting properties: the leptokurtic feature, the memoryless property, and the psychological aspect of market participants. Numerous empirical studies have shown that markets tend to show overreaction and underreaction to good and bad news, respectively. Despite these advantages, there are few empirical studies based on this model, partly because its probability distribution and option valuation formula are rather complicated. This paper is the first to use nonlinear curve-fitting through the trust-region-reflective algorithm on cross-section option data to estimate the structural parameters of the Kou jump-diffusion model.
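The calibration step, nonlinear least squares between model and market prices via a trust-region-reflective solver, can be sketched with SciPy's `least_squares(method='trf')`. The pricing function below is a simple exponential stand-in, not the Kou (2002) formula, which is too long to reproduce here; the mechanics of the fit are the same:

```python
import numpy as np
from scipy.optimize import least_squares

def model_price(params, strikes):
    """Stand-in pricing function; in a real calibration this would be the
    Kou double-exponential jump-diffusion option price."""
    a, b = params
    return a * np.exp(-b * strikes)

strikes = np.linspace(0.8, 1.2, 20)
market = model_price([0.3, 2.0], strikes)  # synthetic "market" quotes

def residuals(params):
    return model_price(params, strikes) - market

# Trust-region-reflective solver with parameter bounds, as in the paper
fit = least_squares(residuals, x0=[0.1, 1.0], method='trf',
                    bounds=([0.0, 0.0], [np.inf, np.inf]))
```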

Keywords: jump-diffusion process, Kou model, Leptokurtic feature, trust-region-reflective algorithm, US index options

Procedia PDF Downloads 431
7578 Enhancement of Light Extraction of Luminescent Coating by Nanostructuring

Authors: Aubry Martin, Nehed Amara, Jeff Nyalosaso, Audrey Potdevin, François Réveret, Michel Langlet, Genevieve Chadeyron

Abstract:

Energy-saving lighting devices based on Light-Emitting Diodes (LEDs) combine a semiconductor chip emitting in the ultraviolet or blue wavelength region with one or more phosphor(s) deposited in the form of coatings. The most common devices combine a blue LED with the yellow phosphor Y₃Al₅O₁₂:Ce³⁺ (YAG:Ce) and a red phosphor. Even though these devices achieve satisfying photometric parameters (Color Rendering Index, Color Temperature) and good luminous efficiencies, further improvements can be made to enhance the light extraction efficiency (i.e., to increase the phosphor's forward emission). One possible strategy is to pattern the phosphor coatings. Here, we have worked on different ways to nanostructure the coating surface. On the one hand, we used colloidal lithography combined with the Langmuir-Blodgett technique to directly pattern the surface of YAG:Tb³⁺ sol-gel-derived coatings, YAG:Tb³⁺ being used as a model phosphor. On the other hand, we achieved composite architectures combining YAG:Ce coatings with ZnO nanowires. The structural, morphological, and optical properties of both systems have been studied and compared with flat YAG coatings. In both cases, nanostructuring brought a significant enhancement of the photoluminescence properties under UV or blue radiation. In particular, angle-resolved photoluminescence measurements have shown that nanostructuring modifies the photon path within the coatings, with better extraction of the guided modes. These two strategies have the advantage of being versatile and applicable to any phosphor synthesizable by the sol-gel technique. They therefore appear as promising ways to enhance the luminescence efficiencies of both phosphor coatings and the optical devices into which they are incorporated, such as LED-based lighting or safety devices.

Keywords: phosphor coatings, nanostructuring, light extraction, ZnO nanowires, colloidal lithography, LED devices

Procedia PDF Downloads 179
7577 An Analysis of the Causes of SME Failure in Developing Countries: The Case of South Africa

Authors: Paul Saah, Charles Mbohwa, Nelson Sizwe Madonsela

Abstract:

In the context of developing countries, this study examines a crucial component of economic development: the reasons behind the failure of small and medium-sized enterprises (SMEs). SMEs are acknowledged as essential drivers of economic expansion, job creation, and poverty alleviation in emerging countries. This research uses South Africa as a case study to evaluate why SMEs fail in developing nations. A quantitative research methodology was employed to investigate the complex causes of SME failure using statistical tools and reliability tests. To ensure the viability of data collection, a sample of 400 small business owners was chosen using a non-probability selection technique, and a closed-ended questionnaire was the primary instrument used to obtain detailed information from the participants. Data were analysed and interpreted using the Statistical Package for the Social Sciences (SPSS). According to the findings, the main reasons why SMEs fail in developing nations are a lack of strategic business planning, a lack of funding, poor management, a lack of innovation, a lack of business research, and low levels of education and training. The results show that SMEs can be sustainable and successful as long as they understand and incorporate the identified success-determining variables into their daily operations: the more SMEs in developing countries implement these factors in their business operations, the more likely the businesses are to succeed, and vice versa.
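The reliability tests mentioned above are typically internal-consistency checks on the questionnaire; Cronbach's alpha is the usual statistic. The sketch below computes it on made-up Likert-style responses (the study's data are not published here):

```python
import numpy as np

def cronbach_alpha(items):
    """Cronbach's alpha. items: 2-D array, rows = respondents, cols = items."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)
    total_var = items.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

# Illustrative 5-point Likert responses (6 respondents, 4 questionnaire items)
responses = [[4, 5, 4, 4],
             [3, 3, 4, 3],
             [5, 5, 5, 4],
             [2, 2, 3, 2],
             [4, 4, 4, 5],
             [3, 2, 3, 3]]
alpha = cronbach_alpha(responses)  # values above ~0.7 are usually acceptable
```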

Keywords: failure, developing countries, SMEs, economic development, South Africa

Procedia PDF Downloads 81
7576 Explosion Mechanics of Aluminum Plates Subjected to the Combined Effect of Blast Wave and Fragment Impact Loading: A Multicase Computational Modeling Study

Authors: Atoui Oussama, Maazoun Azer, Belkassem Bachir, Pyl Lincy, Lecompte David

Abstract:

For many decades, researchers have focused on understanding the dynamic behavior of different structures and materials subjected to fragment impact or blast loads separately. Explosion mechanics and impact physics studies dealing with the numerical modeling of the response of protective structures under the synergistic effect of a blast wave and the impact of fragments are quite limited in the literature. This article numerically evaluates the nonlinear dynamic behavior and damage mechanisms of aluminum plates (EN AW-1050A-H24) under different combined loading scenarios, varied by the sequence of the applied loads, using the commercial software LS-DYNA. On the one hand, with respect to the terminal ballistics investigations, a Lagrangian (LAG) formulation is used to evaluate the different failure modes of the target material in the case of a fragment impact. On the other hand, with respect to the blast analysis, an Arbitrary Lagrangian-Eulerian (ALE) formulation is used to study the fluid-structure interaction (FSI) of the shock wave and the plate in the case of blast loading. Four loading scenarios are considered: (1) blast loading only, (2) fragment impact only, (3) blast loading followed by a fragment impact, and (4) a fragment impact followed by blast loading. The numerical results show that when the impact load is applied to the plate before the blast load, the plate suffers more severe damage, due to the hole enlargement phenomenon and the effects of crack propagation around the circumference of the damaged zone. Moreover, the hole from the fragment impact was enlarged to about three times the diameter of the projectile. The validation of the proposed computational model is based in part on previous experimental data obtained by the authors and in part on experimental data from the literature.
A good correspondence between the numerical and experimental results is found.

Keywords: computational analysis, combined loading, explosion mechanics, hole enlargement phenomenon, impact physics, synergistic effect, terminal ballistic

Procedia PDF Downloads 189
7575 Modeling and Prediction of Zinc Extraction Efficiency from Concentrate by Operating Condition and Using Artificial Neural Networks

Authors: S. Mousavian, D. Ashouri, F. Mousavian, V. Nikkhah Rashidabad, N. Ghazinia

Abstract:

The pH, temperature, and extraction time of each stage, the agitation speed, and the delay time between stages affect the efficiency of zinc extraction from concentrate. In this research, the efficiency of zinc extraction was predicted as a function of the mentioned variables by artificial neural networks (ANN). ANNs with different layer sizes were employed, and the results show that the network with 8 neurons in the hidden layer agrees well with the experimental data.

Keywords: zinc extraction, efficiency, neural networks, operating condition

Procedia PDF Downloads 550
7574 Transcranial and Sacral Magnetic Stimulation as a Therapeutic Resource for Urinary Incontinence – A Brief Bibliographic Review

Authors: Ana Lucia Molina

Abstract:

Transcranial magnetic stimulation (TMS) is a non-invasive neuromodulation technique for the investigation and modulation of cortical excitability in humans. Modulating the processing of different cortical areas can open several avenues for rehabilitation, showing great potential in the treatment of motor disorders. In the human brain, the supplementary motor area (SMA) is involved in the control of the pelvic floor muscles, whose dysfunction can lead to urinary incontinence. Peripheral magnetic stimulation, specifically sacral magnetic stimulation, has been used as a safe and effective treatment option for patients with lower urinary tract dysfunction. A systematic literature review was carried out (PubMed, Medline, and Google Scholar databases) without a time limit using the keywords "transcranial magnetic stimulation", "sacral neuromodulation", and "urinary incontinence", and 11 articles met the inclusion criteria. Results: Thirteen articles were selected. Magnetic stimulation is a non-invasive neuromodulation technique widely used in the evaluation of cortical areas and their respective peripheral areas, as well as in the treatment of lesions of cerebral origin. With regard to pelvic-perineal disorders, repetitive transcranial stimulation showed significant effects in controlling urinary incontinence, as did sacral peripheral magnetic stimulation, which also has the potential to restore bladder sphincter function. Conclusion: Data from the literature suggest that both transcranial and peripheral stimulation are non-invasive approaches that can be promising and effective treatments for pelvic and perineal disorders. Larger prospective and randomized studies are needed to establish the most appropriate and effective parameters.

Keywords: urinary incontinence, non-invasive neuromodulation, sacral neuromodulation, transcranial magnetic stimulation

Procedia PDF Downloads 105
7573 Predictive Analytics for Theory Building

Authors: Ho-Won Jung, Donghun Lee, Hyung-Jin Kim

Abstract:

Predictive analytics (data analysis) uses a subset of measurements (the features, predictors, or independent variables) to predict another measurement (the outcome, target, or dependent variable) on a single person or unit. It applies empirical methods from statistics, operations research, and machine learning to predict future or otherwise unknown events or outcomes for a single person or unit, based on patterns in data. Most analyses of metabolic syndrome are not predictive analytics but statistical explanatory studies that build a proposed model (theory building) and then validate the hypothesized metabolic syndrome predictors (theory testing). A proposed theoretical model is formed from causal hypotheses that specify how and why certain empirical phenomena occur. Predictive analytics and explanatory modeling have their own territories in analysis. However, predictive analytics can play vital roles in explanatory studies, i.e., in scientific activities such as theory building, theory testing, and relevance assessment. In this context, this study demonstrates how to use predictive analytics to support theory building (i.e., hypothesis generation). For this purpose, the study utilized a big data predictive analytics platform TM based on a co-occurrence graph. The co-occurrence graph is depicted with nodes (e.g., items in a basket) and arcs (direct connections between two nodes), where the items in a basket are fully connected. A cluster is a collection of fully connected items, i.e., a specific group of items that has co-occurred in several rows of a data set. Clusters can be ranked using importance metrics such as node size (number of items), frequency, and surprise (observed frequency vs. expected), among others. The size of a graph can be represented by the numbers of nodes and arcs. Since the size of a co-occurrence graph does not depend directly on the number of observations (transactions), huge numbers of transactions can be represented and processed efficiently.
For the demonstration, a total of 13,254 metabolic syndrome training observations are fed into the analytics platform to generate rules (potential hypotheses). Each observation includes 31 predictors, associated, for example, with sociodemographics, habits, and activities. Some, such as cancer examination, house type, and vaccination, are intentionally included to obtain predictive analytics insights on variable selection. The platform automatically generates plausible hypotheses (rules) without statistical modeling. The rules are then validated with an external testing dataset of 4,090 observations. The results, as a kind of inductive reasoning, show potential hypotheses extracted as a set of association rules. Most statistical models generate just one estimated equation; a set of rules (many estimated equations from a statistical perspective), in contrast, may imply heterogeneity in a population (i.e., different subpopulations with unique features are aggregated). The next step of theory development, theory testing, statistically tests whether a proposed theoretical model is a plausible explanation of the phenomenon of interest. If the generated hypotheses are tested statistically with several thousand observations, most of the variables will become significant as the p-values approach zero. Thus, theory validation needs statistical methods that utilize a subset of the observations, such as bootstrap resampling with an appropriate sample size.
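The "surprise" metric mentioned above (observed co-occurrence frequency relative to the frequency expected under independence) can be sketched over toy baskets. The platform itself is proprietary, so this only illustrates the idea; the basket contents are invented:

```python
from itertools import combinations
from collections import Counter

baskets = [{"smoking", "hypertension", "obesity"},
           {"smoking", "hypertension"},
           {"obesity", "hypertension"},
           {"smoking", "obesity", "hypertension"},
           {"exercise"}]

n = len(baskets)
item_freq = Counter(i for b in baskets for i in b)
pair_freq = Counter(frozenset(p) for b in baskets
                    for p in combinations(sorted(b), 2))

def surprise(a, b):
    """Observed co-occurrence rate over the rate expected under independence."""
    observed = pair_freq[frozenset((a, b))] / n
    expected = (item_freq[a] / n) * (item_freq[b] / n)
    return observed / expected

# smoking and hypertension co-occur more often than independence predicts
s = surprise("smoking", "hypertension")  # (3/5) / ((3/5) * (4/5)) = 1.25
```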

Keywords: explanatory modeling, metabolic syndrome, predictive analytics, theory building

Procedia PDF Downloads 283
7572 Steady State Analysis of Distribution System with Wind Generation Uncertainty

Authors: Zakir Husain, Neem Sagar, Neeraj Gupta

Abstract:

Due to the increased penetration of renewable energy resources in the distribution system, the system is no longer passive in nature. In this paper, a steady-state analysis of the distribution system is carried out with the inclusion of wind generation. The wind turbine generator system is modeled to obtain the average active and reactive power injections into the system. The study has been conducted on an IEEE 33-bus system with two wind generators. The present research work is useful not only to utilities but also to customers.
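The average active power injection of a wind generator is typically obtained by evaluating the turbine power curve over the wind-speed distribution at the bus. A minimal numerical sketch, with an assumed Weibull wind regime and a generic piecewise power curve (the paper's specific turbine parameters are not given here):

```python
import numpy as np

def power_curve(v, v_cut_in=3.0, v_rated=12.0, v_cut_out=25.0, p_rated=1.0):
    """Piecewise turbine power curve (per-unit), cubic between cut-in and rated."""
    v = np.asarray(v, dtype=float)
    p = np.zeros_like(v)
    ramp = (v >= v_cut_in) & (v < v_rated)
    p[ramp] = p_rated * ((v[ramp] - v_cut_in) / (v_rated - v_cut_in)) ** 3
    p[(v >= v_rated) & (v < v_cut_out)] = p_rated
    return p

# Average injection under an assumed Weibull(k=2, scale=8 m/s) wind regime,
# estimated by Monte Carlo sampling of wind speeds
rng = np.random.default_rng(1)
speeds = 8.0 * rng.weibull(2.0, size=100_000)
p_avg = power_curve(speeds).mean()  # expected per-unit active power injection
```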

Keywords: distributed generation, distribution network, radial network, wind turbine generating system

Procedia PDF Downloads 411
7571 Comparison of Hydrogen and Electrification Perspectives in Decarbonizing the Transport Sector

Authors: Matteo Nicoli, Gianvito Colucci, Valeria Di Cosmo, Daniele Lerede, Laura Savoldi

Abstract:

The transport sector is currently responsible for approximately one third of greenhouse gas emissions in Europe. In the wider context of achieving carbon neutrality of the global energy system, different alternatives are available to decarbonize the transport sector. While electricity is already the most consumed energy commodity in rail transport, battery electric vehicles are one of the zero-emission options on the market for road transportation; hydrogen-based fuel cell vehicles, on the other hand, are available for both road and non-road vehicles. The European Commission is strongly pushing toward the integration of hydrogen in the energy systems of European countries and its widespread adoption as an energy vector to achieve the Green Deal targets. Furthermore, the Italian government is defining hydrogen-related objectives with the publication of a dedicated Hydrogen Strategy. The adoption of energy system optimization models to study the possible penetration of alternative zero-emission transport technologies makes it possible to analyze the effects that the development of innovative technologies has on the entire energy system, including the supply side devoted to the production of energy carriers such as hydrogen and electricity. Using an open-source modeling framework, TEMOA, this work compares the role of hydrogen and electric vehicles in the decarbonization of the transport sector. The analysis investigates the advantages and disadvantages of the two options from the economic point of view (the costs associated with each) and the environmental one (the emission reduction prospects). Moreover, the profitability of investments in hydrogen and electric vehicles is analyzed.
The study investigates the evolution of energy consumption and greenhouse gas emissions in different transportation modes (road, rail, navigation, and aviation) through a detailed analysis of the full range of vehicles included in the techno-economic database used in the TEMOA model instance adopted for this work. The transparency of the analysis is guaranteed by the accessibility of the TEMOA models, which are based on open-source code and open-access databases.
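Energy system optimization models of this kind choose least-cost technology mixes subject to constraints such as demand satisfaction and emission caps. A toy linear program with `scipy.optimize.linprog`, choosing between hypothetical BEV and FCEV fleet shares, illustrates the mechanics; all costs, emission factors, and the cap are invented, not the study's data:

```python
from scipy.optimize import linprog

# Decision variables: share of transport demand met by BEVs (x0) and FCEVs (x1).
# Hypothetical per-unit costs and emission factors, not taken from the study.
cost = [1.0, 1.4]        # relative cost of serving one unit of demand
emissions = [0.1, 0.05]  # relative emissions per unit of demand served

# Serve all demand (x0 + x1 = 1) while keeping emissions under a cap of 0.08
res = linprog(c=cost,
              A_ub=[emissions], b_ub=[0.08],
              A_eq=[[1.0, 1.0]], b_eq=[1.0],
              bounds=[(0, 1), (0, 1)])
bev_share, fcev_share = res.x  # cheaper BEVs up to the emissions cap
```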

Keywords: battery electric vehicles, decarbonization, energy system optimization models, fuel cell vehicles, hydrogen, open-source modeling, TEMOA, transport

Procedia PDF Downloads 118
7570 The Impacts of Technology on Operations Costs: The Mediating Role of Operation Flexibility

Authors: Fazli Idris, Jihad Mohammad

Abstract:

The study aims to determine the impact of technology and of service operations flexibility, which is divided into external flexibility and internal robustness, on operations costs. A mediation model is proposed that links technology to operations costs via operations flexibility. Drawing on a sample of 475 operations managers from various service sectors in Malaysia and South Africa, Structural Equation Modeling (SEM) was employed to test the relationships using Smart-PLS procedures. A significant relationship between technology and operations costs was established via both operations flexibility dimensions. Theoretical and managerial implications are offered to explain the results.
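The mediation logic being tested (technology, via flexibility, to costs) can be sketched with ordinary least squares on synthetic data; this is a plain Baron-Kenny-style illustration, not the Smart-PLS procedure used in the paper:

```python
# Mediation sketch on synthetic data: x -> m -> y.
# a = slope of m ~ x; b = coefficient of m in y ~ x + m; indirect effect = a*b.

def centered(v):
    mu = sum(v) / len(v)
    return [xi - mu for xi in v]

def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

def mediation(x, m, y):
    xc, mc, yc = centered(x), centered(m), centered(y)
    a = dot(xc, mc) / dot(xc, xc)                # m ~ x
    sxx, smm, sxm = dot(xc, xc), dot(mc, mc), dot(xc, mc)
    sxy, smy = dot(xc, yc), dot(mc, yc)
    det = sxx * smm - sxm * sxm                  # normal equations for y ~ x + m
    b = (sxx * smy - sxm * sxy) / det            # coefficient of the mediator
    return a, b, a * b

# synthetic data built so the true paths are known: m = 2x + z, y = x + 3m
x = [1.0, 2.0, 3.0, 4.0]
z = [1.0, -1.0, -1.0, 1.0]                       # orthogonal to x in this sample
m = [2 * xi + zi for xi, zi in zip(x, z)]
y = [xi + 3 * mi for xi, mi in zip(x, m)]
a, b, indirect = mediation(x, m, y)              # a = 2, b = 3, indirect = 6
```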

Keywords: Operations flexibility, technology, costs, mediation

Procedia PDF Downloads 617
7569 Quality Improvement of the Sand Moulding Process in Foundries Using Six Sigma Technique

Authors: Cindy Sithole, Didier Nyembwe, Peter Olubambi

Abstract:

The sand casting process involves pattern making, mould making, metal pouring, and shake out. Every step in the sand moulding process is critical for the production of good-quality castings. However, waste generated during the sand moulding operation and a lack of quality are matters that drive performance inefficiencies and undermine competitiveness in South African foundries. Defects produced by the sand moulding process only become visible in the final product (the casting), which results in a higher scrap count, reduced sales, and increased costs in the foundry. The purpose of this research is to propose a Six Sigma (DMAIC: Define, Measure, Analyze, Improve, and Control) intervention in sand moulding foundries, in order to reduce the variation caused by deficiencies in the sand moulding process in South African foundries. Its objective is to create sustainability and enhance productivity in the South African foundry industry. Six Sigma is a data-driven method of process improvement that aims to eliminate variation in business processes using statistical control methods. Six Sigma focuses on business performance improvement through quality initiatives using Ishikawa's seven basic tools of quality. The objectives of Six Sigma are to eliminate features that undermine productivity, profit, and the ability to meet customers' demands. Six Sigma has become one of the most important tools/techniques for attaining competitive advantage. For sand casting foundries in South Africa, competitive advantage means improved plant maintenance processes, improved product quality, and proper utilization of resources, especially scarce resources. Using the Six Sigma technique, defects such as sand inclusions, flashes, and sand burn-on were identified as resulting from inefficiencies in the sand moulding process. The causes were found to be incorrect mould design, due to the pattern used, and poor ramming of the moulding sand in a foundry.
Six Sigma tools such as the voice of the customer, the fishbone diagram, the voice of the process, and process mapping were used to define the problem in the foundry and to outline the critical-to-quality elements. The SIPOC (Supplier, Input, Process, Output, Customer) diagram was also employed to ensure that the material and process parameters needed for quality improvement in a foundry were achieved. The process capability of the sand moulding process was measured to understand the current performance and enable improvement. The expected results of this research are reduced sand moulding process variation, increased productivity, and competitive advantage.
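The process capability measurement in the Measure phase is conventionally summarized by the Cp and Cpk indices; a minimal sketch with hypothetical green-sand moisture data (the spec limits and readings are invented, not the study's measurements):

```python
# Process capability indices for the DMAIC Measure phase:
# Cp  = (USL - LSL) / (6*sigma)            -- potential capability
# Cpk = min(USL - mu, mu - LSL) / (3*sigma) -- capability given centering
from statistics import mean, stdev

def capability(samples, lsl, usl):
    mu, sigma = mean(samples), stdev(samples)   # sample standard deviation
    cp = (usl - lsl) / (6 * sigma)
    cpk = min(usl - mu, mu - lsl) / (3 * sigma)
    return cp, cpk

# hypothetical green-sand moisture readings (%), spec limits 3.0-4.0 %
moisture = [3.4, 3.5, 3.6, 3.5, 3.4, 3.6, 3.5, 3.5]
cp, cpk = capability(moisture, lsl=3.0, usl=4.0)
# a centered process gives Cp == Cpk; values above ~1.33 are usually "capable"
```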

Keywords: defects, foundries, quality improvement, sand moulding, six sigma (DMAIC)

Procedia PDF Downloads 198
7568 Non-Linear Load-Deflection Response of Shape Memory Alloys-Reinforced Composite Cylindrical Shells under Uniform Radial Load

Authors: Behrang Tavousi Tehrani, Mohammad-Zaman Kabir

Abstract:

Shape memory alloys (SMA) are often implemented in smart structures as the active components. Their ability to recover large displacements has been used in many applications, including structural stability/response enhancement and active structural acoustic control. SMA wires or fibers can be embedded in composite cylinders to increase their critical buckling load, improve their load-deflection behavior, and reduce radial deflections under various thermo-mechanical loadings. This paper presents a semi-analytical investigation of the non-linear load-deflection response of SMA-reinforced composite circular cylindrical shells under uniform external pressure. Based on first-order shear deformation shell theory (FSDT), the equilibrium equations of the structure are derived. The one-dimensional simplified Brinson model is used to determine the SMA recovery force, owing to its simplicity and accuracy. The Airy stress function and the Galerkin technique are used to obtain non-linear load-deflection curves. The results are verified by comparison with those in the literature. Several parametric studies are conducted to investigate the effect of SMA volume fraction, SMA pre-strain value, and SMA activation temperature on the response of the structure. It is shown that suitable use of SMA wires results in a considerable enhancement of the load-deflection response of the shell, due to the generation of the SMA tensile recovery force.
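The Galerkin step can be illustrated on a much simpler problem than the paper's shell equations: a one-term approximation for a simply supported beam under uniform load, where forcing the residual of EI w'''' = q to be orthogonal to a sine trial function gives the midspan deflection in closed form.

```python
# One-term Galerkin sketch of the weighted-residual idea (a beam, not the
# paper's shell problem): assume w(x) = A sin(pi x / L) and require
# integral[ (EI w'''' - q) * sin(pi x / L) ] dx = 0 over (0, L).
import math

def galerkin_midspan_deflection(q, L, EI):
    # the orthogonality condition yields A = 4 q L^4 / (EI pi^5)
    return 4 * q * L**4 / (EI * math.pi**5)

def exact_midspan_deflection(q, L, EI):
    # classical closed-form result for a uniformly loaded simply supported beam
    return 5 * q * L**4 / (384 * EI)

w_g = galerkin_midspan_deflection(q=1.0, L=1.0, EI=1.0)
w_e = exact_midspan_deflection(q=1.0, L=1.0, EI=1.0)
# a single sine mode already reproduces the exact value to about 0.4%
```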

Keywords: airy stress function, cylindrical shell, Galerkin technique, load-deflection curve, recovery stress, shape memory alloy

Procedia PDF Downloads 193
7567 Microencapsulation of Phenobarbital by Ethyl Cellulose Matrix

Authors: S. Bouameur, S. Chirani

Abstract:

The aim of this study was to evaluate the potential use of ethyl cellulose in the preparation of microspheres as a drug delivery system for the sustained release of phenobarbital. The microspheres were prepared by the solvent evaporation technique, using ethyl cellulose as the polymer matrix at a 1:2 ratio, dichloromethane as the solvent, and 1% polyvinyl alcohol as the processing medium to solidify the microspheres. Size, shape, drug loading capacity, and entrapment efficiency were studied.
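The two loading metrics mentioned are conventionally defined as below; the formulas are the standard definitions, and the batch numbers are hypothetical, not values from this study:

```python
# Standard definitions of the microsphere loading metrics (illustrative batch).

def drug_loading(drug_mass_mg, microsphere_mass_mg):
    """DL% = mass of drug in microspheres / total microsphere mass * 100."""
    return 100.0 * drug_mass_mg / microsphere_mass_mg

def entrapment_efficiency(actual_drug_mg, theoretical_drug_mg):
    """EE% = drug actually entrapped / drug initially added * 100."""
    return 100.0 * actual_drug_mg / theoretical_drug_mg

# hypothetical batch: 100 mg phenobarbital + 200 mg ethyl cellulose (1:2),
# with 240 mg of microspheres recovered containing 80 mg of drug
dl = drug_loading(80, 240)            # about 33.3 %
ee = entrapment_efficiency(80, 100)   # 80.0 %
```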

Keywords: phenobarbital, microspheres, ethylcellulose, polyvinyl alcohol

Procedia PDF Downloads 363
7566 The Use of Mobile Phone as Enhancement to Mark Multiple Choice Objectives English Grammar and Literature Examination: An Exploratory Case Study of Preliminary National Diploma Students, Abdu Gusau Polytechnic, Talata Mafara, Zamfara State, Nigeria

Authors: T. Abdulkadir

Abstract:

Marking and assessing multiple-choice examinations is widely regarded in Nigeria as a cumbersome, herculean task to accomplish manually. This is largely because large numbers of candidates are known to take the same examination simultaneously. Marking such a mammoth number of booklets daunts even the fastest paid examiners, who often undertake the job with the resulting consequences of stress and boredom. This paper explores how marking Multiple Choice Objectives-type examinations can be transformed into a more creative, even relaxing activity via the use of the mobile phone. A more pragmatic method was employed to achieve this work, rather than a formal in-depth research-based approach, owing to the novelty of the mobile-smartphone e-Marking Scheme. Moreover, because the scheme is new, no recent academic work sharing the same topic, the use of the cell phone as an e-marking technique, was found online; hence the dearth of even miscellaneous citations in this work. The anticipation of further advancements steered the motive of this paper, which lays out the fundamental proposition. The paper introduces for the first time the concept of mobile-smartphone e-marking, the steps to achieve it, and the merits and demerits of the technique, all spelt out in the subsequent pages.
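The core of any such scheme is straightforward to automate; a minimal sketch of multiple-choice objective (MCO) marking with a hypothetical answer key (this is an illustration, not the author's actual application):

```python
# Minimal MCO auto-marking sketch: compare a candidate's responses to the key.

def mark(answer_key, responses, mark_per_item=1):
    """Return (score, maximum score) for one candidate."""
    score = sum(mark_per_item
                for q, correct in answer_key.items()
                if responses.get(q) == correct)   # unanswered items score zero
    return score, len(answer_key) * mark_per_item

# hypothetical 4-item key and one candidate's answers
key = {1: "B", 2: "D", 3: "A", 4: "C"}
candidate = {1: "B", 2: "D", 3: "C", 4: "C"}
score, total = mark(key, candidate)   # 3 out of 4
```

On a phone, the responses dictionary would come from a photographed answer sheet or a typed-in string; the marking logic itself is the trivial part.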

Keywords: cell phone, e-marking scheme (eMS), mobile phone, mobile-smart phone, multiple choice objectives (MCO), smartphone

Procedia PDF Downloads 264
7565 Induction Machine Bearing Failure Detection Using Advanced Signal Processing Methods

Authors: Abdelghani Chahmi

Abstract:

This article examines the detection and localization of faults in electrical systems, particularly those using asynchronous machines. First, the failure process is characterized, relevant symptoms are defined, and, based on those processes and symptoms, a model of the malfunctions is obtained. Second, the development of the machine diagnosis is shown. As studies of malfunctions in electrical systems can rely on only a small amount of experimental data, it has been essential to develop simulation tools that allow the faulty behavior to be characterized. Fault detection uses signal processing techniques in known operating phases.
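A minimal sketch of the frequency-domain side of such a diagnosis, on a synthetic signal: correlate the measurement with sinusoids at candidate fault frequencies and compare the component amplitudes (real bearing diagnosis typically adds envelope demodulation and characteristic-frequency formulas, which are omitted here):

```python
# Detect a weak "defect" tone in a synthetic machine signal by evaluating
# the amplitude of single frequency components (a Goertzel-like correlation).
import math

def tone_amplitude(signal, fs, f):
    """Amplitude of the sinusoidal component at frequency f (Hz)."""
    n = len(signal)
    re = sum(s * math.cos(2 * math.pi * f * i / fs) for i, s in enumerate(signal))
    im = sum(s * math.sin(2 * math.pi * f * i / fs) for i, s in enumerate(signal))
    return 2 * math.hypot(re, im) / n

fs = 1000                                  # sampling rate, Hz (1 s of data)
t = [i / fs for i in range(1000)]
# 50 Hz supply component plus a weaker synthetic 123 Hz "bearing defect" tone
x = [math.sin(2 * math.pi * 50 * ti) + 0.3 * math.sin(2 * math.pi * 123 * ti)
     for ti in t]
amps = {f: tone_amplitude(x, fs, f) for f in (50, 123, 200)}
# amps recovers ~1.0 at 50 Hz, ~0.3 at 123 Hz, and ~0 at the empty 200 Hz bin
```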

Keywords: induction motor, modeling, bearing damage, airgap eccentricity, torque variation

Procedia PDF Downloads 141
7564 Thermal Performance of an Air-Water Heat Exchanger (AWHE) Operating in Groundwater and Hot-Humid Climate

Authors: César Ramírez-Dolores, Jorge Wong-Loya, Jorge Andaverde, Caleb Becerra

Abstract:

Low-depth geothermal energy can exploit the subsoil as an air-conditioning technique, used either as a passive system or coupled to an active cooling and/or heating system. This source of air conditioning is possible because, at depths of less than 10 meters, the subsoil temperature is practically homogeneous and tends to be constant regardless of the climatic conditions at the surface. The effect of temperature fluctuations at the soil surface decreases as depth increases, due to the thermal inertia of the soil, causing temperature stability; this effect presents several advantages in the context of sustainable energy use. In the present work, the thermal behavior of a horizontal air-water heat exchanger (AWHE) is evaluated, and the thermal effectiveness and outlet air temperature of the prototype immersed in groundwater are experimentally determined. The thermohydraulic aspects of the heat exchanger were evaluated using the Number of Transfer Units-Effectiveness (NTU-ε) method under conditions of groundwater flow in a sandy-soil coastal region (southeastern Mexico), with air flow induced by a blower. The system was constructed of polyvinyl chloride (PVC), and sensors were placed in both the exchanger and the water to record temperature changes. The results of this study indicate that when the exchanger operates in groundwater, it shows high thermal gains, allowing better heat transfer; it therefore significantly reduces the air temperature at the outlet of the system, raising its thermal effectiveness above 80%. This passive technique is relevant for building cooling applications and could represent a significant development in terms of thermal comfort for hot locations in emerging-economy countries.
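For a buried pipe whose wall is held near the (nearly constant) ground temperature, the single-stream ε-NTU result ε = 1 − exp(−NTU) applies; a sketch with illustrative hot-humid-climate numbers, not the paper's measurements:

```python
# epsilon-NTU sketch for an air stream exchanging heat with a constant-
# temperature boundary (groundwater-saturated soil). Numbers are illustrative.
import math

def effectiveness(ntu):
    """Single-stream limit (C_r -> 0): eps = 1 - exp(-NTU)."""
    return 1.0 - math.exp(-ntu)

def outlet_temperature(t_in, t_ground, ntu):
    eps = effectiveness(ntu)
    return t_in - eps * (t_in - t_ground), eps

# hot-humid inlet air at 33 C, groundwater-saturated soil at 26 C, NTU = 2.0
t_out, eps = outlet_temperature(33.0, 26.0, 2.0)
# eps is about 0.86 (consistent with the >80% reported), t_out about 26.9 C
```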

Keywords: convection, earth, geothermal energy, thermal comfort

Procedia PDF Downloads 78
7563 Slope Stability and Landslides Hazard Analysis, Limitations of Existing Approaches, and a New Direction

Authors: Alisawi Alaa T., Collins P. E. F.

Abstract:

The analysis and evaluation of slope stability and landslide hazards are critically important in civil engineering projects and broader considerations of safety. The level of slope stability risk should be identified because of its significant and direct financial and safety effects. Slope stability hazard analysis is performed considering static and/or dynamic loading circumstances. To reduce and/or prevent the failure hazard caused by landslides, a sophisticated and practical hazard analysis method using advanced constitutive modeling should be developed and linked to an effective solution that corresponds to the specific type of slope stability and landslide failure risk. Previous studies on slope stability analysis methods identify the failure mechanism and its corresponding solution. The commonly used approaches include limit equilibrium methods, empirical approaches for rock slopes (e.g., slope mass rating and Q-slope), finite element or finite difference methods, and distinct element codes. This study presents an overview and evaluation of these analysis techniques. Contemporary source materials are used to examine these various methods on the basis of hypotheses, factor of safety estimation, soil types, load conditions, and analysis conditions and limitations. Limit equilibrium methods play a key role in assessing the level of slope stability hazard. The slope stability safety level can be defined by identifying the equilibrium of the shear stress and shear strength. The slope is considered stable when the movement resistance forces are greater than those that drive the movement, with a factor of safety (the ratio of the resisting to the driving forces) that is greater than 1.00. 
However, popular and practical methods, including limit equilibrium approaches, are not effective when the slope experiences complex failure mechanisms, such as progressive failure, liquefaction, internal deformation, or creep. The present study represents the first episode of an ongoing project that involves identifying the types of landslide hazards, assessing the level of slope stability hazard, developing a sophisticated and practical hazard analysis method, linking the failure type of specific landslide conditions to the appropriate solution, and applying an advanced computational method for mapping slope stability properties in the United Kingdom and elsewhere through a geographical information system (GIS) and the inverse distance weighted (IDW) spatial interpolation technique. This study investigates and assesses the different analysis and solution techniques to enhance knowledge of the mechanisms of slope stability and landslide hazard analysis and to determine the available solutions for each potential landslide failure risk.
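The limit-equilibrium definition of the factor of safety can be made concrete with the classic infinite-slope model under drained, Mohr-Coulomb conditions (all soil parameters below are illustrative, not from the study):

```python
# Infinite-slope limit equilibrium: FoS = resisting / driving shear stress.
import math

def infinite_slope_fos(c, phi_deg, gamma, depth, beta_deg, gamma_w=9.81, m=0.0):
    """c: cohesion (kPa); phi: friction angle (deg); gamma: unit weight (kN/m3);
    depth: failure-surface depth (m); beta: slope angle (deg);
    m: fraction of the failure depth below the water table (0 dry, 1 saturated)."""
    beta, phi = math.radians(beta_deg), math.radians(phi_deg)
    sigma_n = (gamma - m * gamma_w) * depth * math.cos(beta) ** 2  # effective normal stress
    tau = gamma * depth * math.sin(beta) * math.cos(beta)          # driving shear stress
    return (c + sigma_n * math.tan(phi)) / tau                     # Mohr-Coulomb strength / demand

fos_dry = infinite_slope_fos(c=5.0, phi_deg=30.0, gamma=18.0, depth=3.0, beta_deg=25.0)
fos_wet = infinite_slope_fos(c=5.0, phi_deg=30.0, gamma=18.0, depth=3.0, beta_deg=25.0, m=1.0)
# saturation drops FoS below 1.0 here: the "stable" dry slope becomes unstable
```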

Keywords: slope stability, finite element analysis, hazard analysis, landslides hazard

Procedia PDF Downloads 104
7562 Undoped and Fluorine Doped Zinc Oxide (ZnO:F) Thin Films Deposited by Ultrasonic Chemical Spray: Effect of the Solution on the Electrical and Optical Properties

Authors: E. Chávez-Vargas, M. de la L. Olvera-Amador, A. Jimenez-Gonzalez, A. Maldonado

Abstract:

Undoped and fluorine-doped zinc oxide (ZnO) thin films were deposited on sodocalcic glass substrates by the ultrasonic chemical spray technique. As the main goal is the manufacture of transparent electrodes, the effects of both the solution composition and the substrate temperature on the electrical and optical properties of the ZnO thin films were studied. Regarding the solution composition, the fluorine concentration ([F]/[F+Zn] at.%), the solvent composition (acetic acid, water, and methanol ratios), and the ageing time were varied; regarding the chemical spray technique, the substrate temperature and the deposition time were also varied. Structural studies confirm the deposition of polycrystalline, hexagonal, wurtzite-type ZnO. The results show that increasing the [F]/[F+Zn] ratio in the solution decreases the sheet resistance, Rs, of the ZnO:F films, reaching a minimum, on the order of 1.6 Ωcm, at 60 at.%; further increases in the [F]/[F+Zn] ratio increase the Rs of the films. The same trend occurs with the variation in substrate temperature, as a minimum Rs of the ZnO:F thin films was found when deposited at Ts = 450 °C. ZnO:F thin films deposited with aged solution show a significant decrease in Rs, on the order of 100 Ω/sq. The transmittance of the films was also favorably affected by the solvent ratio and, more significantly, by the ageing of the solution. The overall evaluation of the optical and electrical characteristics of the ZnO:F thin films deposited under different conditions was done using Haacke's figure of merit, in order to obtain a clear and quantitative trend for transparent-conductor applications.
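Haacke's figure of merit referred to above is Phi_TC = T^10 / Rs, with T the optical transmittance and Rs the sheet resistance; a sketch with invented film values showing how it trades transparency against conductivity:

```python
# Haacke's figure of merit for ranking transparent conducting films.
# The film values below are illustrative, not the paper's measurements.

def haacke_fom(transmittance, sheet_resistance):
    """Phi_TC = T^10 / Rs (T as a fraction, Rs in ohm/sq)."""
    return transmittance ** 10 / sheet_resistance

film_a = haacke_fom(0.85, 100.0)   # more transparent, more resistive
film_b = haacke_fom(0.80, 60.0)    # less transparent, more conductive
best = "A" if film_a > film_b else "B"
# the T^10 weighting strongly rewards transmittance, so film A wins here
```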

Keywords: zinc oxide, ZnO:F, TCO, Haacke's figure of merit

Procedia PDF Downloads 317
7561 Modeling and Optimization of Micro-Grid Using Genetic Algorithm

Authors: Mehrdad Rezaei, Reza Haghmaram, Nima Amjadi

Abstract:

This paper proposes an operating and cost optimization model for a micro-grid (MG). The model takes into account the emission costs of NOx, SO2, and CO2, together with operation and maintenance costs. Wind turbines (WT), photovoltaic (PV) arrays, micro turbines (MT), fuel cells (FC), and diesel engine generators (DEG) with different capacities are considered in this model. The aim of the optimization is to minimize operation cost subject to constraints on supply-demand balance and system safety. The proposed genetic algorithm (GA), with the ability to fine-tune its own settings, is used to optimize micro-grid operation.
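A stripped-down version of the GA idea, dispatching two units to meet demand at minimum cost with a penalty for supply-demand mismatch (the cost coefficients are invented, and this sketch has none of the paper's self-tuning or emission terms):

```python
# Toy GA sketch (not the paper's implementation): choose two unit outputs
# that meet a 100 kW demand at minimum cost.
import random

random.seed(0)
DEMAND = 100.0

def cost(p1, p2):
    penalty = 1e3 * abs(p1 + p2 - DEMAND)              # supply-demand constraint
    return 0.2 * p1 + 0.0005 * p1 ** 2 + 0.4 * p2 + penalty

def evolve(pop_size=40, gens=60):
    pop = [(random.uniform(0, 100), random.uniform(0, 100)) for _ in range(pop_size)]
    for _ in range(gens):
        pop.sort(key=lambda ind: cost(*ind))
        elite = pop[: pop_size // 4]                   # keep the fittest quarter
        children = []
        while len(elite) + len(children) < pop_size:
            a, b = random.sample(elite, 2)
            child = tuple(max(0.0, min(100.0, (x + y) / 2 + random.gauss(0, 2)))
                          for x, y in zip(a, b))       # crossover + mutation
            children.append(child)
        pop = elite + children
    return min(pop, key=lambda ind: cost(*ind))

p1, p2 = evolve()
# the GA settles near p1 + p2 = 100 with the cheaper unit carrying most load
```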

Keywords: micro-grid, optimization, genetic algorithm, MG

Procedia PDF Downloads 516
7560 An Analysis of the Panel’s Perceptions on Cooking in “Metaverse Kitchen”

Authors: Minsun Kim

Abstract:

This study uses the concepts of augmented reality, virtual reality, mirror worlds, and lifelogging to describe the "Metaverse Kitchen", which can be defined as a space in the virtual world where users can cook the dishes they want, using a meal kit, regardless of location or time. The study examined experts' perceptions of cooking and food delivery services using the "Metaverse Kitchen". A consensus opinion on the concept and the potential pros and cons of the "Metaverse Kitchen" was derived from 20 culinary experts through the Delphi technique. Three Delphi rounds were conducted over one month, from December 2022 to January 2023. The results are as follows. First, users select and cook food after visiting the "Metaverse Kitchen" in the virtual space. Second, when a user cooks in the "Metaverse Kitchen" in AR or VR, the information is transmitted to nearby restaurants. Third, the platform operating the "Metaverse Kitchen" assigns the order to whichever of these restaurants can first provide the meal kit cooked by the user in the virtual space. Fourth, the user pays the "Metaverse Kitchen", and the restaurant delivers the cooked meal kit to the user and then receives payment for the user's meal and delivery fee from the platform. Fifth, the platform company that operates the mirror-world "Metaverse Kitchen" uses lifelogging to manage customers; it receives commissions from users and affiliated restaurants and operates virtual restaurant businesses using meal kits. Among the selection attributes for the meal kits provided in the "Metaverse Kitchen", the panelists suggested convenience, quality, and reliability as advantages and predicted a relatively high price as a disadvantage. The "Metaverse Kitchen" using meal kits is expected to form a new food supply system in the future society. In follow-up studies, an empirical analysis targeting producers and consumers is required.

Keywords: metaverse, meal kits, Delphi technique, Metaverse Kitchen

Procedia PDF Downloads 225
7559 Cognitive Models of Future in Political Texts

Authors: Solopova Olga

Abstract:

The present paper briefly recalls the theoretical preconditions for investigating cognitive-discursive models of the future in political discourse. The author reviews the theories and methods used to strengthen a future focus in this discourse, working out two main tools: a model of the future and a metaphorical scenario. The paper examines the implications of metaphorical analogies for modeling the future in mass media. It argues that metaphor is not merely a rhetorical ornament in the political discourse of media regulation but a conceptual model that legislates and regulates our understanding of the future.

Keywords: cognitive approach, future research, political discourse, model, scenario, metaphor

Procedia PDF Downloads 400
7558 Robust Method for Evaluation of Catchment Response to Rainfall Variations Using Vegetation Indices and Surface Temperature

Authors: Revalin Herdianto

Abstract:

Recent climate changes increase uncertainties in vegetation conditions, such as health and biomass, globally and locally. Detection is, however, difficult due to the spatial and temporal scale of vegetation coverage. Because vegetation responds uniquely to its environmental conditions, such as water availability, the interplay between vegetation dynamics and hydrologic conditions leaves a signature in their feedback relationship. Vegetation indices (VI) depict vegetation biomass and photosynthetic capacity, indicating vegetation dynamics as a response to variables including hydrologic conditions and microclimate factors such as rainfall characteristics and land surface temperature (LST). It is hypothesized that this signature may be depicted by VI in its relationship with other variables. To study this signature, several catchments in Asia, Australia, and Indonesia were analysed to assess the variations in hydrologic characteristics with vegetation types. Methods used in this study include geographic identification and pixel marking for the studied catchments, analysis of the time series of VI and LST of the marked pixels, and smoothing using the Savitzky-Golay filter, which is effective for large areas and extensive data. Time series of VI, LST, and rainfall from satellites and ground stations, coupled with digital elevation models, were analysed and presented. This study found that the hydrologic response of vegetation to rainfall variations may appear within one hydrologic year, in which a drought event can be detected a year later as suppressed growth. However, annual rainfall above average does not promote above-average growth as shown by VI. This technique is found to be a robust and tractable approach for assessing catchment dynamics in changing climates.
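The Savitzky-Golay smoothing step can be sketched directly, since the window-5 quadratic coefficients (-3, 12, 17, 12, -3)/35 are the classic published values; by construction, a filter of polynomial order 2 passes a quadratic trend through unchanged (the NDVI series below is synthetic):

```python
# 5-point quadratic Savitzky-Golay smoothing: convolution with fixed
# least-squares coefficients (the classic published window-5, order-2 set).

SG5_QUAD = [-3 / 35, 12 / 35, 17 / 35, 12 / 35, -3 / 35]

def savgol5(series):
    """Smooth the interior points; end points are kept unchanged here."""
    half = 2
    out = list(series)
    for i in range(half, len(series) - half):
        out[i] = sum(c * series[i + j - half] for j, c in enumerate(SG5_QUAD))
    return out

# a purely quadratic "NDVI" trend is reproduced exactly by the order-2 filter
ndvi = [0.01 * i * i for i in range(8)]
smoothed = savgol5(ndvi)
```

On real VI time series the same convolution suppresses cloud- and noise-induced spikes while preserving the seasonal (low-order polynomial) shape, which is why it suits large-area satellite records.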

Keywords: vegetation indices, land surface temperature, vegetation dynamics, catchment

Procedia PDF Downloads 289
7557 Enhanced Optical Nonlinearity in Bismuth Borate Glass: Effect of Size of Nanoparticles

Authors: Shivani Singla, Om Prakash Pandey, Gopi Sharma

Abstract:

Metallic-nanoparticle-doped glasses have led to rapid development in the field of optics. Large third-order non-linearity, ultrafast time response, and a wide range of resonant absorption frequencies make these metallic nanoparticles more important than their bulk material. All these properties depend strongly on the size, shape, and surrounding environment of the nanoparticles. In a quest to find a suitable material for optical applications, several efforts have been devoted to improving the properties of such glasses in the past. In the present study, bismuth borate glass doped with gold nanoparticles (AuNPs) of different sizes has been prepared using the conventional melt-quench technique. The synthesized glasses are characterized by X-ray diffraction (XRD) and Fourier transform infrared (FTIR) spectroscopy to observe the structural modification in the glassy matrix with the variation in the size of the AuNPs. The glasses remain purely amorphous in nature even after the addition of AuNPs, whereas FTIR suggests that the main structure contains BO₃ and BO₄ units. Field emission scanning electron microscopy (FESEM) confirms the existence and variation in the size of the AuNPs. Differential thermal analysis (DTA) shows that the prepared glasses are thermally stable and highly suitable for the fabrication of optical fibers. The nonlinear optical parameters (nonlinear absorption coefficient and nonlinear refractive index) are determined using the Z-scan technique with a Ti:sapphire laser at 800 nm. It is concluded that the size of the nanoparticles strongly influences the structural, thermal, and optical properties of the system.
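The standard closed-aperture Z-scan relations (Sheik-Bahae analysis) commonly used to extract the nonlinear refractive index are sketched below with illustrative numbers, not this study's data:

```python
# Closed-aperture Z-scan analysis sketch (Sheik-Bahae relations):
# DeltaT_pv = 0.406 * (1 - S)^0.25 * |DeltaPhi0|, and
# DeltaPhi0 = (2*pi/lambda) * n2 * I0 * L_eff  =>  solve for n2.
import math

def on_axis_phase_shift(delta_t_pv, s):
    """|DeltaPhi0| from the peak-valley transmittance change (aperture S)."""
    return delta_t_pv / (0.406 * (1 - s) ** 0.25)

def nonlinear_index(delta_phi0, wavelength_m, intensity_w_m2, l_eff_m):
    """n2 in m^2/W from the on-axis phase shift."""
    return delta_phi0 * wavelength_m / (2 * math.pi * intensity_w_m2 * l_eff_m)

# illustrative inputs: 10% peak-valley change, 40% aperture, 800 nm laser,
# 1e13 W/m^2 on-axis intensity, 1 mm effective sample length
dphi = on_axis_phase_shift(delta_t_pv=0.10, s=0.4)
n2 = nonlinear_index(dphi, 800e-9, 1e13, 1e-3)
# dphi is about 0.28 rad; n2 lands near 3.6e-18 m^2/W for these inputs
```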

Keywords: bismuth borate glass, different size, gold nanoparticles, nonlinearity

Procedia PDF Downloads 127
7556 Online Monitoring of Airborne Bioaerosols Released from a Composting, Green Waste Site

Authors: John Sodeau, David O'Connor, Shane Daly, Stig Hellebust

Abstract:

This study is the first to employ the online WIBS (Wideband Integrated Bioaerosol Sensor) technique for monitoring the bioaerosol emissions and non-fluorescing “dust” released from a composting/green waste site. The purpose of the research was to provide a proof of principle for using WIBS to monitor such a location continually, over days and nights, in order to construct comparative “bioaerosol site profiles”. Current impaction/culturing methods take many days to achieve results that are available by the WIBS technique in seconds. The real-time data obtained were then used to assess variations in the bioaerosol counts as a function of size, “shape”, site location, working activity levels, time of day, relative humidity, wind speed, and wind direction. Three short campaigns were undertaken: one classified as a “light” workload period, another as a “heavy” workload period, and finally a weekend when the site was closed. One main bioaerosol size regime was found to predominate, 0.5 micron to 3 micron, with morphologies ranging from elongated to ellipsoidal/spherical. The real-time number-concentration data were consistent with an Andersen sampling protocol that was employed at the site. The number concentrations of fluorescent particles as a proportion of total particles counted amounted, on average, to ~1% for the “light” workday period, ~7% for the “heavy” workday period, and ~18% for the weekend. The bioaerosol release profiles at the weekend were considerably different from those monitored during the working weekdays.

Keywords: bioaerosols, composting, fluorescence, particle counting in real-time

Procedia PDF Downloads 359