Search results for: multiple scales method
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 22991


18851 The Electric Car Wheel Hub Motor Work Analysis with the Use of 2D FEM Electromagnetic Method and 3D CFD Thermal Simulations

Authors: Piotr Dukalski, Bartlomiej Bedkowski, Tomasz Jarek, Tomasz Wolnik

Abstract:

The article is concerned with the design of an electric in-wheel hub motor installed in an electric car with two-wheel drive. It presents the construction of the motor on a 3D cross-section model. The authors consider a work simulation of the motor (applied to a Fiat Panda car) under selected driving conditions, such as driving on a road with a slope of 20%, driving at maximum speed, and maximum acceleration of the car from 0 to 100 km/h. The demand for drive power, taking into account the resistance to movement, was determined for the selected driving conditions. The parameters of motor operation and the power losses in its individual elements, calculated using the 2D FEM method, are presented for the selected driving parameters. The calculated power losses are then used in 3D models for thermal calculations using the CFD method. The detailed construction of the thermal models, with material data, boundary conditions, and the losses calculated using the 2D FEM method, is presented in the article. The article presents and describes the calculated temperature distributions in individual motor components such as the winding, permanent magnets, magnetic core, body, and cooling system components. The losses generated in individual motor components and their impact on the limitation of its operating parameters are described by the authors. Particular attention is paid to the losses generated in the permanent magnets, which are a source of heat that is difficult to remove from inside the motor. The presented calculation results show how the individual motor power losses, generated under different load conditions while driving, affect its thermal state.

Keywords: electric car, electric drive, electric motor, thermal calculations, wheel hub motor

Procedia PDF Downloads 169
18850 The Martingale Options Price Valuation for European Puts Using Stochastic Differential Equation Models

Authors: H. C. Chinwenyi, H. D. Ibrahim, F. A. Ahmed

Abstract:

In modern financial mathematics, valuing derivatives such as options is often a tedious task, simply because their fair and correct future prices are probabilistic. This paper examines three different Stochastic Differential Equation (SDE) models in finance: the Constant Elasticity of Variance (CEV) model, the Black-Karasinski model, and the Heston model. The martingale option price valuation formulas for these three models were obtained using the replicating portfolio method. The numerical solution of the derived martingale option price valuation equations for the SDE models was then carried out using the Monte Carlo method, implemented in MATLAB. Furthermore, results from numerical examples using published all-share index data from the Nigeria Stock Exchange (NSE) show the effect of an increase in the underlying asset value (stock price) on the value of the European put option for these models. From the results obtained, we see that an increase in the stock price yields a decrease in the European put option price. This guides the option holder in making a sound decision not to exercise the right on the option.
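As an illustration of the Monte Carlo valuation step described above, the sketch below prices a European put under a CEV-type diffusion by simulating risk-neutral paths in Python rather than MATLAB. The parameter values and the Euler-Maruyama discretisation are illustrative assumptions, not the authors' exact formulas.

```python
import numpy as np

def mc_european_put_cev(s0, strike, r, sigma, gamma, T,
                        n_paths=50_000, n_steps=200, seed=0):
    """Monte Carlo price of a European put under a CEV diffusion
    dS = r*S dt + sigma*S**gamma dW, via Euler-Maruyama time stepping."""
    rng = np.random.default_rng(seed)
    dt = T / n_steps
    s = np.full(n_paths, s0, dtype=float)
    for _ in range(n_steps):
        dw = rng.standard_normal(n_paths) * np.sqrt(dt)
        s += r * s * dt + sigma * np.abs(s) ** gamma * dw
        s = np.maximum(s, 0.0)            # absorb paths at zero
    payoff = np.maximum(strike - s, 0.0)  # put payoff at maturity
    return np.exp(-r * T) * payoff.mean() # discounted expectation
```

With `gamma = 1` the dynamics reduce to geometric Brownian motion, so the estimate can be sanity-checked against the Black-Scholes put price; and, as the abstract notes, raising the initial stock price lowers the put value.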

Keywords: equivalent martingale measure, European put option, girsanov theorem, martingales, monte carlo method, option price valuation formula

Procedia PDF Downloads 127
18849 Clinical Nursing Experience in Managing a Uterine Cancer Patient with Cardiogenic Shock During the Extracorporeal Membrane Oxygenation Weaning Process

Authors: Syue-Wen Lin

Abstract:

Objective: This article discusses the nursing experience of caring for a uterine cancer patient who experienced cardiogenic shock and was weaned off extracorporeal membrane oxygenation (ECMO). The patient was placed on ECMO due to cardiogenic shock and initially struggled with anxiety caused by the physical discomfort of the disease and multiple medical devices, as well as isolation in the ICU and restrictions on physical activity. Over time, the patient was weaned off ECMO and was able to perform daily activities and rehabilitation independently. Methods: The nursing period was from January 6 to January 9. Through observation, direct care, interviews, physical assessments, and case reviews, the intensive care team and bypass personnel conducted a comprehensive assessment using Gordon's 11 functional health patterns. The assessment identified three main nursing health problems: pain, anxiety, and decreased cardiac tissue perfusion. Results: The author consulted a psychologist in order to employ open communication techniques and empathetic care to build a trusting nurse-patient relationship, and a patient-centered intensive cancer care plan was developed. Pain was assessed using a pain scale, and pain medications were adjusted in consultation with a pharmacist; lavender essential oil therapy, light music, and pillows were used to distract the patient and alleviate pain. The patient was encouraged to express feelings, and family members were invited to visit more often and provide companionship to reduce the uncertainty caused by cancer and illness. Vital signs were closely monitored, and nursing interventions were provided to maintain adequate myocardial perfusion. Post-ECMO, the patient was encouraged to engage in rehabilitation and cardiopulmonary training. Conclusion: A key takeaway from the care process is the importance of observing not only the patient's vital signs but also their psychological state, especially when caring for cancer patients on ECMO. The patient's greatest source of comfort was the presence of family, which helped alleviate anxiety. Healthcare providers play multiple critical roles as advocates, coordinators, educators, and counselors, listening to and accepting the patient's emotional responses. The report aims to provide clinical cancer nurses with a reference to improve the quality of care and alleviate cancer-related discomfort.

Keywords: ECMO, uterine cancer, palliative care, Gordon's 11 functional health patterns

Procedia PDF Downloads 16
18848 Modelling Dengue Disease With Climate Variables Using Geospatial Data For Mekong River Delta Region of Vietnam

Authors: Thi Thanh Nga Pham, Damien Philippon, Alexis Drogoul, Thi Thu Thuy Nguyen, Tien Cong Nguyen

Abstract:

The Mekong River Delta region of Vietnam is recognized as one of the regions most vulnerable to climate change, due to flooding and sea-level rise, and therefore faces an increased burden of climate-change-related diseases. Changes in temperature and precipitation are likely to alter the incidence and distribution of vector-borne diseases such as dengue fever. In this region, the peak of the dengue epidemic period is around July to September, during the rainy season, and climate is believed to be an important factor in dengue transmission. This study aims to enhance the capacity for dengue prediction through the relationship of dengue incidence with climate and environmental variables for the Mekong River Delta of Vietnam during 2005-2015. Mathematical models for vector-host infectious disease, including larvae, mosquitoes, and humans, were used to calculate the impacts of climate on dengue transmission, incorporating geospatial data as model input. Monthly dengue incidence data were collected at the provincial level. Precipitation data were extracted from GSMaP (Global Satellite Mapping of Precipitation) satellite observations, while land surface temperature and land cover data were taken from MODIS. The seasonal reproduction number was estimated to evaluate the potential, severity, and persistence of dengue infection, while the final infected number was derived to check for dengue outbreaks. The results show that dengue infection depends on the seasonal variation of climate variables, with a peak during the rainy season, and the predicted dengue incidence follows this dynamic well for the whole studied region. However, the highest outbreak, in 2007, was not captured by the model, reflecting nonlinear dependences of transmission on climate. Other possible effects will be discussed to address the limitations of the model. This suggests the need to consider both climate variables and other sources of variability across temporal and spatial scales.
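The vector-host structure described above can be sketched as a minimal Ross-Macdonald-type ODE system. The compartments and rate constants below are illustrative stand-ins for the authors' full larva-mosquito-human model; climate would enter through time-varying transmission and mortality rates.

```python
import numpy as np
from scipy.integrate import solve_ivp

def dengue_vector_host(t, y, beta_h, beta_v, gamma, mu_v, b):
    """Minimal vector-host model: humans S_h, I_h, R_h and mosquitoes S_v, I_v
    (populations normalised). Climate would enter via time-varying beta_h,
    beta_v (precipitation) and mu_v (temperature); constants are used here."""
    S_h, I_h, R_h, S_v, I_v = y
    new_h = beta_h * S_h * I_v            # mosquito-to-human transmission
    new_v = beta_v * S_v * I_h            # human-to-mosquito transmission
    return [-new_h,                       # dS_h/dt
            new_h - gamma * I_h,          # dI_h/dt (recovery at rate gamma)
            gamma * I_h,                  # dR_h/dt
            b - new_v - mu_v * S_v,       # dS_v/dt: recruitment and mortality
            new_v - mu_v * I_v]           # dI_v/dt

# One year of simulation from a small seed of infected humans.
sol = solve_ivp(dengue_vector_host, (0, 365), [0.99, 0.01, 0.0, 1.0, 0.0],
                args=(0.4, 0.4, 0.1, 0.1, 0.1), dense_output=True)
```

With these (hypothetical) rates the basic reproduction number beta_h * beta_v / (gamma * mu_v) exceeds one, so the seed grows into an outbreak and most of the human population ends up in the recovered compartment.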

Keywords: infectious disease, dengue, geospatial data, climate

Procedia PDF Downloads 378
18847 Enhancing Large Language Models' Data Analysis Capability with Planning-and-Execution and Code Generation Agents: A Use Case for Southeast Asia Real Estate Market Analytics

Authors: Kien Vu, Jien Min Soh, Mohamed Jahangir Abubacker, Piyawut Pattamanon, Soojin Lee, Suvro Banerjee

Abstract:

Recent advances in Generative Artificial Intelligence (GenAI), in particular Large Language Models (LLMs), have shown promise to disrupt multiple industries at scale. However, LLMs also present unique challenges, notably so-called "hallucination", the generation of outputs that are not grounded in the input data, which hinders their adoption in production. A common practice to mitigate the hallucination problem is to use a Retrieval Augmented Generation (RAG) system to ground LLMs' responses in ground truth. RAG converts the grounding documents into embeddings, retrieves the relevant parts using vector similarity between the user's query and the documents, and then generates a response that is based not only on the model's pre-trained knowledge but also on the specific information from the retrieved documents. However, a RAG system is not well suited to tabular data and subsequent data analysis tasks, for multiple reasons such as information loss, data format, and the retrieval mechanism. In this study, we explored a novel methodology that combines planning-and-execution and code generation agents to enhance LLMs' data analysis capabilities. The approach enables LLMs to autonomously dissect a complex analytical task into simpler sub-tasks and requirements, and then convert them into executable segments of code. In the final step, it generates the complete response from the output of the executed code. When deployed as a beta version on DataSense, the property insight tool of PropertyGuru, the approach yielded promising results: it was able to serve market insight and data visualization needs with high accuracy and extensive coverage, abstracting away the complexity for real estate agents and developers from non-programming backgrounds. In essence, the methodology not only refines the analytical process but also serves as a strategic tool for real estate professionals, aiding in market understanding and enhancement without the need for programming skills. The implications extend beyond immediate analytics, paving the way for a new era in the real estate industry characterized by efficiency and advanced data utilization.
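A minimal sketch of the planning-and-execution loop described above, with the LLM calls stubbed out: a planner decomposes the analytical question into sub-tasks, a code-generation step emits an executable segment per sub-task, and the segments run in a shared namespace whose final variable feeds the response. The sub-tasks, snippets, and the `price` field are hypothetical illustrations, not PropertyGuru's actual pipeline.

```python
def plan(task: str) -> list[str]:
    """Planning agent (stubbed): decompose the task into ordered sub-tasks.
    A production system would prompt an LLM for this step."""
    return ["filter_rows", "aggregate"]

def generate_code(subtask: str) -> str:
    """Code-generation agent (stubbed): emit an executable snippet per
    sub-task. A production system would have an LLM write these."""
    snippets = {
        "filter_rows": "rows = [r for r in table if r['price'] > 0]",
        "aggregate": "result = sum(r['price'] for r in rows) / len(rows)",
    }
    return snippets[subtask]

def run_pipeline(task: str, table: list[dict]) -> float:
    """Execute each generated segment in a shared namespace (sandboxing
    omitted for brevity), then return the value the final segment produced."""
    ns = {"table": table}
    for subtask in plan(task):
        exec(generate_code(subtask), ns)
    return ns["result"]
```

The shared namespace is what lets later segments build on the variables earlier segments created, mirroring how the agents chain sub-task outputs into one answer.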

Keywords: large language model, reasoning, planning and execution, code generation, natural language processing, prompt engineering, data analysis, real estate, data sense, PropertyGuru

Procedia PDF Downloads 82
18846 Comparison of the Amount of Microplastics in Plant- And Animal-Based Milks

Authors: Meli̇sa Aşci, Berk Kiliç, Emine Ulusoy

Abstract:

Human ingestion of microplastics has been increasing rapidly, as such hazardous materials are abundant in multiple food products, notably milks. With increasing consumption rates, humans ingest microplastics on a daily basis, which can lead to toxicity, disruption of intracellular pathways, liver cell damage, and eventually tissue and organ damage. In this experiment, different milk types (animal-based and plant-based) were tested for microplastics. Results showed that animal-based milks contained a higher concentration of microplastics than plant-based milks. Research has shown that, in addition to causing health issues in humans, microplastics can also affect livestock animals and plants.

Keywords: microplastics, plant-based milks, animal-based milks, preventive nutrition

Procedia PDF Downloads 20
18845 Compartmental Model Approach for Dosimetric Calculations of ¹⁷⁷Lu-DOTATOC in Adenocarcinoma Breast Cancer Based on Animal Data

Authors: M. S. Mousavi-Daramoroudi, H. Yousefnia, S. Zolghadri, F. Abbasi-Davani

Abstract:

Dosimetry is an indispensable and valuable factor in patient treatment planning, used to minimize the absorbed dose in vital tissues. In this study, in accordance with the suitable characteristics of DOTATOC and ¹⁷⁷Lu, and after preparing ¹⁷⁷Lu-DOTATOC under optimal conditions for the first time in Iran, the radionuclidic and radiochemical purity of the solution was investigated using an HPGe spectrometer and the ITLC method, respectively. The biodistribution of the compound was assayed for the treatment of adenocarcinoma breast cancer in tumor-bearing BALB/c mice. The results demonstrated that ¹⁷⁷Lu-DOTATOC is a promising choice for therapy of these tumors. Because of the vital role of internal dosimetry before and during therapy, efforts to improve the accuracy and speed of dosimetric calculations are necessary. For this reason, a new method was developed to calculate the absorbed dose by combining a compartmental model, animal dosimetry, data extrapolated from animals to humans, and the MIRD method. Because it builds the compartmental model on experimental data, this approach may increase the accuracy of the dosimetric results.
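The MIRD-style calculation underlying the approach can be illustrated for a single source region with mono-exponential clearance: the absorbed dose is the time-integrated activity multiplied by an S-factor. The ¹⁷⁷Lu half-life is physical data; the biological clearance rate and S-value used in the test are hypothetical placeholders, not fitted compartmental-model outputs.

```python
import math

LU177_HALF_LIFE_S = 6.647 * 86400          # physical half-life of Lu-177 (s)

def absorbed_dose_gy(a0_bq, lam_bio_per_s, s_gy_per_bq_s):
    """MIRD formalism for one source region with A(t) = A0 * exp(-lam_eff*t):
    D = A_tilde * S, where A_tilde = A0 / lam_eff is the time-integrated
    activity and lam_eff combines physical decay and biological clearance."""
    lam_phys = math.log(2) / LU177_HALF_LIFE_S
    lam_eff = lam_phys + lam_bio_per_s
    a_tilde = a0_bq / lam_eff              # integral of activity, Bq*s
    return a_tilde * s_gy_per_bq_s
```

A compartmental model refines this picture by replacing the single exponential with the sum of exponentials produced by inter-compartment transfer rates, but the dose still follows from time-integrated activity times S.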

Keywords: ¹⁷⁷Lu-DOTATOC, biodistribution modeling, compartmental model, internal dosimetry

Procedia PDF Downloads 214
18844 Aerogel Fabrication Via Modified Rapid Supercritical Extraction (RSCE) Process - Needle Valve Pressure Release

Authors: Haibo Zhao, Thomas Andre, Katherine Avery, Alper Kiziltas, Deborah Mielewski

Abstract:

Silica aerogels were fabricated through a modified rapid supercritical extraction (RSCE) process. The silica aerogels were made using a tetramethyl orthosilicate precursor, then placed in a hot press and brought to the supercritical point of the solvent, ethanol. In order to control the pressure release without a pressure controller, a needle valve was used. The resulting aerogels were then characterized for their physical and chemical properties and compared to silica aerogels created using similar methods. The aerogels fabricated using this modified RSCE method were found to have properties similar to those reported in other papers using the unmodified RSCE method. A silica aerogel-infused glass blanket composite and a graphene-reinforced silica aerogel composite were also successfully fabricated by this new method. The modified RSCE process and system is a prototype for better gas outflow control at a lower equipment cost. Potentially, this process could evolve into a continuous, low-cost, high-volume production process that meets automotive requirements.

Keywords: aerogel, automotive, rapid supercritical extraction process, low cost production

Procedia PDF Downloads 178
18843 Use of the Gas Chromatography Method for Hydrocarbons' Quality Evaluation in the Offshore Fields of the Baltic Sea

Authors: Pavel Shcherban, Vlad Golovanov

Abstract:

Currently, there is active geological exploration and development of the subsoil of the Kaliningrad region shelf. To carry out a comprehensive and accurate assessment of the volumes and degree of extraction of hydrocarbons from open deposits, it is necessary not only to establish a number of geological and lithological characteristics of the structures under study, but also to determine the oil quality, including its viscosity, density, and fractional composition, as accurately as possible. For the work considered here, gas chromatography is one of the most informative methods, allowing the rapid generation of a significant amount of initial data. The article examines aspects of applying the gas chromatography method to determine the chemical characteristics of hydrocarbons from the Kaliningrad shelf fields, as well as a correlation-regression analysis of these parameters in comparison with previously obtained chemical characteristics of hydrocarbon deposits located onshore in the region. In the course of the research, a number of methods of mathematical statistics and computer processing of large data sets were applied, which makes it possible to evaluate the identity of the deposits, refine the estimates of reserves, and make a number of assumptions about the genesis of the hydrocarbons under analysis.
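The correlation-regression step can be sketched as follows. The per-sample density and viscosity values are purely illustrative placeholders, not measured Baltic shelf data.

```python
import numpy as np

# Hypothetical oil-quality characteristics for five shelf samples
# (illustrative values, not measurements).
density = np.array([0.81, 0.83, 0.84, 0.86, 0.88])      # g/cm^3
viscosity = np.array([2.1, 2.6, 2.9, 3.5, 4.2])         # mm^2/s

r = np.corrcoef(density, viscosity)[0, 1]               # Pearson correlation
slope, intercept = np.polyfit(density, viscosity, 1)    # least-squares line
```

A high positive correlation between such parameters across offshore and onshore samples would support treating the deposits as genetically related, and the fitted line can predict one quality parameter from another where only partial measurements exist.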

Keywords: computer processing of large databases, correlation-regression analysis, hydrocarbon deposits, method of gas chromatography

Procedia PDF Downloads 152
18842 Virtual Player for Learning by Observation to Assist Karate Training

Authors: Kazumoto Tanaka

Abstract:

It is well known that sport skill learning is facilitated by video observation of players' actions. The optimal viewpoint for observing an action depends on the sports scene; however, it is generally impossible to change the viewpoint, because most videos are filmed from fixed points. This study tackles the problem, focusing on karate matches as a first step. We developed a method for observing a karate player's actions from any point of view by using a 3D-CG model (i.e., a virtual player) obtained from video images, and verified the effectiveness of the method on karate matches.

Keywords: computer graphics, karate training, learning by observation, motion capture, virtual player

Procedia PDF Downloads 270
18841 Energy Efficiency Approach to Reduce Costs of Ownership of Air Jet Weaving

Authors: Corrado Grassi, Achim Schröter, Yves Gloy, Thomas Gries

Abstract:

Air jet weaving is the most productive, but also the most energy-consuming, weaving method. Increasing energy costs and environmental impact are a constant challenge for the manufacturers of weaving machines. Current technological developments aim at low energy costs, low environmental impact, high productivity, and constant product quality. The high energy consumption of the method can be ascribed to its high demand for compressed air. An energy efficiency method is applied to air jet weaving technology. The method identifies and classifies the main relevant energy consumers and processes from the exergy point of view, leading to the identification of energy efficiency potentials in the weft insertion process. Starting from the design phase, energy efficiency is considered the central requirement to be satisfied. The initial phase of the method consists of an analysis of the state of the art of the main weft insertion components, in order to prioritize the components and processes with high energy demand. The identified major components are then investigated in order to reduce the high energy demand of the weft insertion process. During the interaction of the flow field coming from the relay nozzles within the profiled reed, only a minor part of the stream actually accelerates the weft yarn, resulting in large energy inefficiency. Different tools, such as FEM analysis, CFD simulation models, and experimental analysis, are used to produce a more energy-efficient design of the components involved in filling insertion. A new concept for the metal strip of the profiled reed is developed. The developed metal strip allows a reduction in the machine's energy consumption: based on a parametric and aerodynamic study, the designed reed transmits higher values of flow power to the filling yarn. The innovative reed fulfills both the requirement of raising energy efficiency and compliance with the weaving constraints.

Keywords: air jet weaving, aerodynamic simulation, energy efficiency, experimental validation, weft insertion

Procedia PDF Downloads 190
18840 An Optimization Tool-Based Design Strategy Applied to Divide-by-2 Circuits with Unbalanced Loads

Authors: Agord M. Pinto Jr., Yuzo Iano, Leandro T. Manera, Raphael R. N. Souza

Abstract:

This paper describes an optimization tool-based design strategy for a Current Mode Logic (CML) divide-by-2 circuit. Representing a building block for output frequency generation in an RFID-protocol-based frequency synthesizer, the circuit was designed to minimize the power consumption for driving multiple unbalanced loads (at the transceiver level). Implemented in XFAB XC08 180 nm technology, the circuit was optimized with the MunEDA WiCkeD tool in the Cadence Virtuoso Analog Design Environment (ADE).

Keywords: divide-by-2 circuit, CMOS technology, PLL phase locked-loop, optimization tool, CML current mode logic, RF transceiver

Procedia PDF Downloads 458
18839 Optimisation of the Input Layer Structure for Feedforward NARX Neural Networks

Authors: Zongyan Li, Matt Best

Abstract:

This paper presents an optimization method for reducing the number of input channels and the complexity of a feed-forward NARX neural network (NN) without compromising the accuracy of the NN model. Using correlation analysis, the most significant regressors are selected to form the input layer of the NN structure. An application to vehicle dynamics model identification is also presented to demonstrate the optimization technique, and the optimal input layer structure and number of neurons for the neural network are investigated.
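The correlation-based selection step can be sketched for a single-input system as follows. The lag range, the synthetic system in the test, and the ranking by plain Pearson correlation (rather than the paper's F-ratio refinement) are simplifying assumptions.

```python
import numpy as np

def select_narx_regressors(u, y, max_lag=5, top_k=3):
    """Rank candidate NARX regressors (lagged inputs u and outputs y) by
    absolute Pearson correlation with the current output y(t), and keep the
    top_k most significant ones to form the network's input layer."""
    n = len(y)
    cands, names = [], []
    for lag in range(1, max_lag + 1):
        for series, label in ((y, "y"), (u, "u")):
            cands.append(series[max_lag - lag : n - lag])  # shifted by lag
            names.append(f"{label}(t-{lag})")
    target = y[max_lag:]
    scores = [abs(np.corrcoef(c, target)[0, 1]) for c in cands]
    keep = np.argsort(scores)[::-1][:top_k]
    return [names[i] for i in keep]
```

Feeding only the selected regressors into the NARX input layer shrinks the network without discarding the dynamics that actually explain the output.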

Keywords: correlation analysis, F-ratio, levenberg-marquardt, MSE, NARX, neural network, optimisation

Procedia PDF Downloads 367
18838 The Studies of the Impact of Biomimicry and Sustainability on Urban Design

Authors: Nourhane Mohamed El Haridi, Mostafa El Arabi, Zeyad El Sayad

Abstract:

Biomimicry is defined by the natural sciences writer Benyus as imitating or taking inspiration from nature's forms and processes to solve human problems: the conscious emulation of life's genius. As the design community realizes the tremendous impact human constructions have on the world, environmental designers look to new approaches like biomimicry to advance sustainable design. Building on the declaration made by biomimicry scientists that a full imitation of nature engages form, ecosystem, and process, this paper uses a logical approach to interpret human and environmental wholeness. Designers would benefit both from integrating social theory with environmental thinking and from combining their substantive skills with techniques for achieving sustainable, biomimetic urban design. Integrating biomimicry's "Life's Principles" into a built-environment process model will make biomimicry more accessible, and thus more widely accepted throughout the industry, to the benefit of the sustainability of all species. The Biomimicry Guild hypothesizes that incorporating these principles, called Life's Principles, increases the likelihood of sustainability for a given design and makes it more likely that the design will have a greater impact on sustainability for future generations of all species, as mentioned by Benyus in her book. This work utilizes Life's Principles as the foundation for a design process model intended for application to built-environment projects at various scales. The paper examines the importance of integrating biomimicry into urban design to achieve more sustainable cities and a better life, by analyzing the principles of both sustainability and biomimicry and applying these ideas to futuristic or existing cities in order to make a biomimetic, sustainable city that is healthier and more conducive to life. A group of experts, architects, biologists, scientists, economists, and ecologists should work together to face the financial and design difficulties and to develop better solutions and innovative ideas for biomimetic, sustainable urban design; it is not the only solution, but it is one of the most promising paths to a better future.

Keywords: biomimicry, built environment, sustainability, urban design

Procedia PDF Downloads 518
18837 Bayesian Value at Risk Forecast Using the Realized Conditional Autoregressive Expectile Model with an Application to Cryptocurrency

Authors: Niya Chen, Jennifer Chan

Abstract:

In financial markets, risk management helps to minimize potential loss and maximize profit. There are two ways to assess risk. The first is to calculate the risk directly from the volatility; the most common such risk measurements are Value at Risk (VaR), the Sharpe ratio, and beta. Alternatively, we can look at quantiles of the return distribution to assess risk. Popular return models such as GARCH and stochastic volatility (SV) focus on modeling the return distribution by capturing the volatility dynamics; however, the quantile/expectile approach gives an idea of the distribution at extreme return values and allows us to forecast VaR using returns, which are direct information. The advantage of these non-parametric methods is that they are not bound by the distributional assumptions of parametric methods. The difference between them is that expectile estimation uses a second-order (asymmetric squared) loss function, while quantile regression uses a first-order (asymmetric absolute) loss function. We consider several quantile functions, different volatility measures, and estimates from several volatility models. To estimate the expectiles of the model, we use the Realized Conditional Autoregressive Expectile (CARE) model with a Bayesian method. We would like to see whether our proposed models outperform existing models for cryptocurrency, and we test this mainly on Bitcoin, as well as Ethereum.
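The expectile at the heart of the CARE model minimises an asymmetric squared loss. A minimal sample-expectile computation via the standard asymmetric-least-squares fixed point is sketched below; the full realized CARE model adds autoregressive dynamics and Bayesian estimation on top of this building block.

```python
import numpy as np

def expectile(x, tau, tol=1e-10, max_iter=1000):
    """tau-expectile of a sample: the value e minimising
    sum_i w_i * (x_i - e)^2 with w_i = tau if x_i > e else (1 - tau).
    Solved by iterating the weighted-mean fixed point."""
    x = np.asarray(x, dtype=float)
    e = x.mean()                         # tau = 0.5 gives the mean exactly
    for _ in range(max_iter):
        w = np.where(x > e, tau, 1.0 - tau)
        e_new = np.sum(w * x) / np.sum(w)
        if abs(e_new - e) < tol:
            break
        e = e_new
    return e
```

Expectile-based VaR then reads off a tail expectile (e.g. tau near 0 for long positions) instead of a tail quantile, retaining sensitivity to the magnitude of extreme losses rather than only their frequency.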

Keywords: expectile, CARE Model, CARR Model, quantile, cryptocurrency, Value at Risk

Procedia PDF Downloads 105
18836 Deformulation and Comparative Analysis of Apparently Similar Polymers Using Multiple Modes of Pyrolysis-GC/MS

Authors: Athena Nguyen, Rojin Belganeh

Abstract:

Detecting and identifying differences between similar polymer materials are key to deformulation, comparative analysis, and reverse engineering. Pyrolysis-GC/MS is a straightforward solid-sample introduction technique which expands the application areas of gas chromatography and mass spectrometry. The micro-furnace pyrolyzer is directly interfaced with the GC injector, preventing any potential cold spots, carryover, or cross-contamination. This presentation demonstrates the study of two similar polymers by performing different modes of operation in the same system: evolved gas analysis (EGA), flash pyrolysis, thermal desorption analysis, and heart-cutting analysis. Unknown polymer materials and their chemical compositions are identified.

Keywords: gas chromatography/mass spectrometry, pyrolysis, pyrolyzer, thermal desorption-GC/MS

Procedia PDF Downloads 257
18835 Big Data Analysis with RHadoop

Authors: Ji Eun Shin, Byung Ho Jung, Dong Hoon Lim

Abstract:

It is almost impossible to store or analyze exponentially growing big data with traditional technologies; Hadoop is a new technology that makes this possible. The R programming language is by far the most popular statistical tool for big data analysis based on distributed processing with Hadoop. With RHadoop, which integrates the R and Hadoop environments, we implemented parallel multiple regression analysis on actual data of different sizes. Experimental results showed that our RHadoop system became much faster as the number of data nodes increased. We also compared the performance of RHadoop with the lm function and the biglm package based on bigmemory. The results showed that RHadoop was faster than the other packages, owing to parallel processing in which the number of map tasks increases with the size of the data.
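The parallel regression pattern this relies on can be expressed as map-reduce over sufficient statistics: each mapper computes XᵀX and Xᵀy on its data block, and the reducer sums them and solves the normal equations. The Python sketch below mimics that structure in-process (RHadoop itself distributes the map step over Hadoop data nodes from R).

```python
import numpy as np

def map_stats(x_block, y_block):
    """Map step: sufficient statistics (X'X, X'y) for one data block,
    here for simple regression with an intercept."""
    X = np.column_stack([np.ones(len(x_block)), x_block])
    return X.T @ X, X.T @ y_block

def reduce_solve(block_stats):
    """Reduce step: sum the block statistics, then solve the normal
    equations for the regression coefficients."""
    xtx = sum(s[0] for s in block_stats)
    xty = sum(s[1] for s in block_stats)
    return np.linalg.solve(xtx, xty)
```

Because XᵀX and Xᵀy add exactly across blocks, the chunked solution equals the full-data least-squares fit, which is what makes the computation embarrassingly parallel and lets speed scale with the number of map tasks.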

Keywords: big data, Hadoop, parallel regression analysis, R, RHadoop

Procedia PDF Downloads 430
18834 Energy Saving Study of Mass Rapid Transit by Optimal Train Coasting Operation

Authors: Artiya Sopharak, Tosaphol Ratniyomchai, Thanatchai Kulworawanichpong

Abstract:

This paper presents an energy-saving study of Mass Rapid Transit (MRT) using an optimal train coasting operation. The dynamic train movement is considered with four modes of operation: accelerating mode, constant-speed or cruising mode, coasting mode, and braking mode. The acceleration rate, the deceleration rate, and the starting coasting point are taken into account to determine the optimal train speed profile during coasting mode, considering the energy saving and acceptable travel time in comparison to a base case with no coasting operation. In this study, a Quadratic Search Method (QSM) is used to carry out the optimization. A single MRT train service between two stations, with a distance of 2 km and a maximum speed of 80 km/h, is taken as the case study. Regarding coasting mode operation, the results show that the longer the coasting distance, the lower the energy consumption in cruising mode and the lower the braking energy; conversely, the shorter the coasting distance, the higher the energy consumption in cruising mode and the higher the braking energy.
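A quadratic search of this kind fits a parabola through three candidate coasting points and jumps to its vertex, shrinking the bracket each round. The generic one-dimensional sketch below assumes a smooth, unimodal energy curve as a function of the coasting start position; the train-dynamics energy model itself is not reproduced here.

```python
def quadratic_search(f, a, b, tol=1e-5, max_iter=200):
    """Quadratic (parabolic) interpolation search for the minimiser of a
    smooth unimodal function f on [a, b]."""
    x = [a, (a + b) / 2.0, b]
    for _ in range(max_iter):
        x1, x2, x3 = x
        f1, f2, f3 = f(x1), f(x2), f(x3)
        den = (x2 - x1) * (f2 - f3) - (x2 - x3) * (f2 - f1)
        if abs(den) < 1e-15:
            break                        # points are (near) collinear
        # vertex of the parabola through (x1,f1), (x2,f2), (x3,f3)
        xv = x2 - 0.5 * ((x2 - x1) ** 2 * (f2 - f3)
                         - (x2 - x3) ** 2 * (f2 - f1)) / den
        if abs(xv - x2) < tol:
            return xv
        # keep the three best points seen so far as the new bracket
        x = sorted(sorted([x1, x2, x3, xv], key=f)[:3])
    return x[1]
```

In the coasting application, `f` would be the simulated energy consumption of one station-to-station run as a function of the coasting start position, subject to the travel-time constraint.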

Keywords: energy saving, coasting mode, mass rapid transit, quadratic search method

Procedia PDF Downloads 296
18833 Templating Copper on Polymer/DNA Hybrid Nanowires

Authors: Mahdi Almaky, Reda Hassanin, Benjamin Horrocks, Andrew Houlton

Abstract:

DNA-templated poly(N-substituted pyrrole) bipyridinium nanowires were synthesised at room temperature using a chemical oxidation method. The resulting conducting polymer (CP)/DNA hybrids were characterised using electronic and vibrational spectroscopic methods, in particular ultraviolet-visible (UV-Vis) spectroscopy and FTIR spectroscopy. The nanowire morphology was characterised using Atomic Force Microscopy (AFM). The electrical properties of the prepared nanowires were characterised using Electrostatic Force Microscopy (EFM) and measured using conductive AFM (c-AFM) and a two-terminal I/V technique, with which the temperature dependence of the conductivity was probed. The conductivities of the prepared CP/DNA nanowires are generally lower than those of PPy/DNA nanowires, showing the large effect of N-alkylation in decreasing the conductivity of the polymer, but they are higher than the conductivities of the corresponding bulk films. This enhancement could be attributed to the ordering of the polymer chains on DNA during the templating process. The prepared CP/DNA nanowires were then used as templates for the growth of copper nanowires at room temperature, using an aqueous solution of Cu(NO₃)₂ as the source of Cu²⁺ and ascorbic acid as the reducing agent. AFM images showed that these nanowires were uniform and continuous compared to copper nanowires prepared by templating directly onto DNA. Electrical characterization by c-AFM revealed a slight improvement in the conductivity of these Cu-CP/DNA nanowires compared to the CP/DNA nanowires before metallisation.

Keywords: templating, copper nanowires, polymer/DNA hybrid, chemical oxidation method

Procedia PDF Downloads 357
18832 The Roman Fora in North Africa: Towards a Supportive Protocol for the Decision on Morphological Restitution

Authors: Dhouha Laribi Galalou, Najla Allani Bouhoula, Atef Hammouda

Abstract:

This research delves into the fundamental question of the morphological restitution of built archaeology in order to place it in its paradigmatic context and to seek answers to it. Indeed, the understanding of the object of the study, its analysis, and the methodology of solving the morphological problem posed, are manageable aspects only by means of a thoughtful strategy that draws on well-defined epistemological scaffolding. In this stream, the crisis of natural reasoning in archaeology has generated multiple changes in this field, ranging from the use of new tools to the integration of an archaeological information system where urbanization involves the interplay of several disciplines. The built archaeological topic is also an architectural and morphological object. It is also a set of articulated elementary data, the understanding of which is about to be approached from a logicist point of view. Morphological restitution is no exception to the rule, and the inter-exchange between the different disciplines uses the capacity of each to frame the reflection on the incomplete elements of a given architecture or on its different phases and multiple states of existence. The logicist sequence is furnished by the set of scattered or destroyed elements found, but also by what can be called a rule base which contains the set of rules for the architectural construction of the object. The knowledge base built from the archaeological literature also provides a reference that enters into the game of searching for forms and articulations. The choice of the Roman Forum in North Africa is justified by the great urban and architectural characteristics of this entity. The research on the forum involves both a fairly large knowledge base but also provides the researcher with material to study - from a morphological and architectural point of view - starting from the scale of the city down to the architectural detail. 
The experimentation of the knowledge deduced at the paradigmatic level, together with the deduction of an analysis model, is then carried out on a well-defined context that grounds the experimentation, from the elaboration of the morphological information container attached to the rule base and the knowledge base. The use of logicist analysis and artificial intelligence allowed us first to question aspects already known in order to measure the credibility of our system, which remains above all a decision-support tool for the morphological restitution of Roman fora in North Africa. This paper presents a first experimentation of the model elaborated during this research, a model framed by a paradigmatic discussion that positions the research in relation to existing paradigmatic and experimental knowledge on the issue.
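As a schematic illustration of how a rule base and a knowledge base can be crossed to filter restitution hypotheses, the sketch below uses invented placeholder rules and entries (the column counts, the three-diameter intercolumniation rule, and all numbers are illustrative, not data from the Forum study):

```python
def consistent_restitutions(fragments, rule_base, knowledge_base):
    """Keep the restitution hypotheses from the knowledge base that are
    compatible with every construction rule, given the surviving fragments."""
    return [h for h in knowledge_base
            if all(rule(h, fragments) for rule in rule_base)]

# Toy example: restitute a portico's column count from a surviving stylobate.
fragments = {"stylobate_m": 24.0, "column_diameter_m": 0.8}

def fits_length(h, f):
    # Schematic rule: intercolumniation of roughly 3 column diameters.
    return abs(h["columns"] * 3 * f["column_diameter_m"] - f["stylobate_m"]) < 2.0

rule_base = [fits_length]
knowledge_base = [{"name": "hexastyle", "columns": 6},
                  {"name": "octastyle", "columns": 8},
                  {"name": "decastyle", "columns": 10}]
candidates = consistent_restitutions(fragments, rule_base, knowledge_base)
```

The point of the sketch is the logicist articulation (facts crossed with rules against a comparative corpus), not the particular rule, which is a hedged stand-in.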

Keywords: classical reasoning, logicist reasoning, archaeology, architecture, roman forum, morphology, calculation

Procedia PDF Downloads 141
18831 Estimating the Government Consumption and Investment Multipliers Using Local Projection Method on the US Data from 1966 to 2020

Authors: Mustofa Mahmud Al Mamun

Abstract:

Government spending, one of the major components of gross domestic product (GDP), is composed of government consumption, investment, and transfer payments. A change in government spending during recessionary periods can generate an increase in GDP greater than the increase in spending itself; this is called the "multiplier effect". Accurate estimation of the government spending multiplier is important because fiscal policy is used to stimulate a flagging economy. Many recent studies have focused on identifying the parts of the economy that respond most to a stimulus under a variety of circumstances. This paper uses US data from 1966 to 2020 and the local projection method under a standard identification strategy to estimate the multipliers. The model includes the important macroaggregates and controls for forecasted government spending, the interest rate, the consumer price index (CPI), exports, imports, and the level of public debt. Investment multipliers are found to be positive and larger than the consumption multipliers. Consumption multipliers are either negative or not significantly different from zero. Results do not vary across the business cycle; however, the consumption multiplier estimated from pre-1980 data is positive.
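The local projection estimator itself is simple to sketch: for each horizon h, the outcome led h periods is regressed on the period-t shock plus a constant and controls, and the shock coefficient traces out the impulse response. Below is a minimal NumPy version on synthetic data; the variable names and the toy data-generating process are illustrative, not the paper's specification:

```python
import numpy as np

def local_projection(y, shock, controls, horizons):
    """Jorda-style local projection: for each horizon h, regress y_{t+h}
    on the shock at t (plus a constant and controls); the shock
    coefficient is the impulse response at horizon h."""
    T = len(y)
    irf = []
    for h in range(horizons + 1):
        Y = y[h:]                                   # outcome led h periods
        X = np.column_stack([np.ones(T - h), shock[:T - h], controls[:T - h]])
        beta, *_ = np.linalg.lstsq(X, Y, rcond=None)
        irf.append(beta[1])                         # coefficient on the shock
    return np.array(irf)

# Toy data: y responds one-for-one on impact, half as much one period later.
rng = np.random.default_rng(0)
T = 500
shock = rng.standard_normal(T)
y = shock + 0.5 * np.roll(shock, 1) + 0.05 * rng.standard_normal(T)
controls = rng.standard_normal((T, 1))
irf = local_projection(y, shock, controls, horizons=2)
```

A full multiplier study would add more lags and controls and compute robust standard errors; the sketch only shows the per-horizon regression structure.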

Keywords: business cycle, consumption multipliers, forecasted government spending, investment multipliers, local projection method, zero lower bound

Procedia PDF Downloads 226
18830 Discrete Element Method Simulation of Crushable Pumice Sand

Authors: Sayed Hessam Bahmani, Rolando P. Orense

Abstract:

From an engineering point of view, pumice particles are problematic because of their crushability and compressibility, which stem from their vesicular nature. Currently, information on the geotechnical characteristics of pumice sands is limited. While extensive empirical and laboratory tests can be implemented to characterize their behavior, these are generally time-consuming and expensive. These drawbacks have motivated attempts to study the effects of particle breakage of pumice sand through the Discrete Element Method (DEM), which provides insights into the behavior of crushable granular material at both the micro- and macro-levels. In this paper, the results of single-particle crushing tests conducted in the laboratory are simulated with DEM using the open-source code YADE. This is done to better understand the parameters necessary to represent the pumice microstructure that governs its crushing features, and to examine how the resulting microstructure evolution affects a particle’s properties. The DEM particle model is then used to simulate the behavior of pumice sand during consolidated drained triaxial tests. The results indicate the importance of incorporating particle porosity and the unique surface texture in the material characterization, and show that interlocking between the crushed particles significantly influences the drained behavior of the pumice specimen.
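Outside of YADE, the particle-replacement logic that DEM crushing models of this kind typically rely on can be caricatured in a few lines: sample a Weibull crushing strength with a size effect, and split the particle into volume-conserving fragments when the induced characteristic stress exceeds it. All parameter values below are illustrative, not calibrated pumice properties:

```python
import math
import random

def crushing_strength(d, sigma0=8.0, d0=1.0, m=3.0, rng=random):
    """Sample a Weibull-distributed crushing strength (MPa) for a particle
    of diameter d (mm). sigma0, d0 and the Weibull modulus m are
    placeholder material parameters, not measured pumice values."""
    u = rng.random()
    # Size effect: smaller particles are statistically stronger
    # (characteristic strength scales as d^(-3/m) for volume statistics).
    scale = sigma0 * (d / d0) ** (-3.0 / m)
    return scale * (-math.log(1.0 - u)) ** (1.0 / m)

def maybe_crush(d, force):
    """Replace a particle by two equal-volume fragments if the induced
    characteristic stress F/d^2 exceeds its sampled strength."""
    stress = force / d ** 2
    if stress >= crushing_strength(d):
        child = d / 2.0 ** (1.0 / 3.0)  # two fragments conserving total volume
        return [child, child]
    return [d]
```

In an actual DEM run this check would be applied at each contact every timestep, with fragments re-inserted into the packing; the sketch only isolates the crushing criterion.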

Keywords: pumice sand, triaxial compression, simulation, particle breakage

Procedia PDF Downloads 240
18829 Investigating Best Practice Energy Efficiency Policies and Programs, and Their Replication Potential for Residential Sector of Saudi Arabia

Authors: Habib Alshuwaikhat, Nahid Hossain

Abstract:

The residential sector consumes more than half of the electricity produced in Saudi Arabia, and fossil fuel is the main source of energy for meeting the Kingdom's growing household electricity demand. Several studies have forecast that, unless domestic energy demand growth is controlled, it will reduce Saudi Arabia's crude oil export capacity within a decade, and the Kingdom is likely to be incapable of exporting crude oil within the next three decades. Though the Saudi government has begun to address domestic energy demand growth, its demand-side energy management policies and programs are focused on the industrial and commercial sectors. There is thus an urgent need for a comprehensive energy efficiency strategy addressing efficient energy use in the Kingdom's residential sector. As Saudi Arabia is at an early stage in addressing energy efficiency in its residential sector, the Kingdom has scope to learn from global energy efficiency practices and design its own energy efficiency policies and programs. To do so sustainably, however, it is essential to address the local contexts of energy efficiency and to identify the policies and programs that fit those contexts. The objective of this study was therefore to identify globally best practice energy efficiency policies and programs in the residential sector that have replication potential in Saudi Arabia. To this end, two sets of multi-criteria decision analysis matrices were developed: the first to evaluate global energy efficiency policies and programs, and the second to evaluate the replication potential of the global best practices for Saudi Arabia.
The Wuppertal Institute's guidelines for energy efficiency policy evaluation were used to develop the matrices, and their attributes were set through a review of the available literature. The study reveals that the best practice energy efficiency policies and programs with good replication potential for Saudi Arabia are those that address energy efficiency through multiple components and are diversified in their characteristics. It also indicates that the more diversified the components included in a policy or program, the greater its replication potential for the Kingdom. This finding is consistent with other studies, which observe that successful energy efficiency practice requires introducing multiple policy components as a cluster rather than concentrating on a single measure. The multi-criteria decision analysis matrices developed here could be used to assess the replication potential of other globally best practice energy efficiency policies and programs for the Kingdom's residential sector, and they can guide Saudi policy makers in formulating the Kingdom's own energy efficiency policies and programs.
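A weighted-sum multi-criteria matrix of the kind described can be sketched as follows; the policies, criteria, scores (on a 1-5 scale), and weights below are invented for illustration and are not the study's actual evaluation attributes:

```python
def mcda_score(matrix, weights):
    """Weighted-sum score per alternative.
    matrix: {alternative: {criterion: score}}; weights: {criterion: weight}.
    Weights are normalised so scores stay comparable across weight sets."""
    total_w = sum(weights.values())
    return {alt: sum(scores[c] * weights[c] for c in weights) / total_w
            for alt, scores in matrix.items()}

# Hypothetical policies and criteria, purely for illustration.
policies = {
    "appliance labelling":    {"savings": 3, "cost": 5, "replicability": 4},
    "building-code retrofit": {"savings": 5, "cost": 2, "replicability": 3},
}
weights = {"savings": 0.5, "cost": 0.2, "replicability": 0.3}
scores = mcda_score(policies, weights)
best = max(scores, key=scores.get)   # alternative with the highest weighted score
```

The study's two-stage evaluation corresponds to running such a scoring twice: once with global-effectiveness criteria and once with Saudi-replication criteria.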

Keywords: Saudi Arabia, residential sector, energy efficiency, policy evaluation

Procedia PDF Downloads 493
18828 Systematic Identification of Noncoding Cancer Driver Somatic Mutations

Authors: Zohar Manber, Ran Elkon

Abstract:

Accumulation of somatic mutations (SMs) in the genome is a major driving force of cancer development. Most SMs in the tumor's genome are functionally neutral; however, some cause damage to critical processes and provide the tumor with a selective growth advantage (termed cancer driver mutations). Current research on functional significance of SMs is mainly focused on finding alterations in protein coding sequences. However, the exome comprises only 3% of the human genome, and thus, SMs in the noncoding genome significantly outnumber those that map to protein-coding regions. Although our understanding of noncoding driver SMs is very rudimentary, it is likely that disruption of regulatory elements in the genome is an important, yet largely underexplored mechanism by which somatic mutations contribute to cancer development. The expression of most human genes is controlled by multiple enhancers, and therefore, it is conceivable that regulatory SMs are distributed across different enhancers of the same target gene. Yet, to date, most statistical searches for regulatory SMs have considered each regulatory element individually, which may reduce statistical power. The first challenge in considering the cumulative activity of all the enhancers of a gene as a single unit is to map enhancers to their target promoters. Such mapping defines for each gene its set of regulating enhancers (termed "set of regulatory elements" (SRE)). Considering multiple enhancers of each gene as one unit holds great promise for enhancing the identification of driver regulatory SMs. However, the success of this approach is greatly dependent on the availability of comprehensive and accurate enhancer-promoter (E-P) maps. To date, the discovery of driver regulatory SMs has been hindered by insufficient sample sizes and statistical analyses that often considered each regulatory element separately. 
In this study, we analyzed more than 2,500 whole-genome sequence (WGS) samples provided by The Cancer Genome Atlas (TCGA) and The International Cancer Genome Consortium (ICGC) in order to identify such driver regulatory SMs. Our analyses took into account the combinatorial aspect of gene regulation by considering all the enhancers that control the same target gene as one unit, based on E-P maps from three genomics resources. The identification of candidate driver noncoding SMs is based on their recurrence. We searched for SREs of genes that are "hotspots" for SMs (that is, they accumulate SMs at a significantly elevated rate). To test the statistical significance of recurrence of SMs within a gene's SRE, we used both global and local background mutation rates. Using this approach, we detected - in seven different cancer types - numerous "hotspots" for SMs. To support the functional significance of these recurrent noncoding SMs, we further examined their association with the expression level of their target gene (using gene expression data provided by the ICGC and TCGA for samples that were also analyzed by WGS).
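The recurrence test can be illustrated with a toy Poisson model: under a background per-base mutation rate, the expected number of SMs falling in a gene's SRE across the cohort is lambda = SRE length x cohort size x rate, and the hotspot p-value is the upper tail P(X >= observed). This is a simplified stand-in for the authors' statistic (which also uses local background rates), with made-up inputs:

```python
import math

def sre_recurrence_pvalue(n_observed, sre_length_bp, n_samples, bg_rate_per_bp):
    """P-value that an SRE accumulates >= n_observed somatic mutations,
    modelling the count as Poisson under a background per-base rate."""
    lam = sre_length_bp * n_samples * bg_rate_per_bp  # expected SMs in the SRE
    # P(X >= n_observed) = 1 - P(X <= n_observed - 1)
    cdf = sum(math.exp(-lam) * lam ** i / math.factorial(i)
              for i in range(n_observed))
    return 1.0 - cdf

# Illustrative numbers: a 10 kb SRE across 2,500 genomes at ~2 mutations/Mb
# expects ~50 hits; observing 80 would be a strong hotspot signal.
p = sre_recurrence_pvalue(80, 10_000, 2_500, 2e-6)
```

A genome-wide scan would additionally correct the resulting p-values for multiple testing across all genes' SREs.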

Keywords: cancer genomics, enhancers, noncoding genome, regulatory elements

Procedia PDF Downloads 100
18827 Optimization Method of the Number of Berth at Bus Rapid Transit Stations Based on Passenger Flow Demand

Authors: Wei Kunkun, Cao Wanyang, Xu Yujie, Qiao Yuzhi, Liu Yingning

Abstract:

Reasonable design of bus berths can improve station capacity and reduce traffic congestion. To reasonably determine the number of berths at BRT (Bus Rapid Transit) stops, this paper draws on observed station data, scheduling data, and passenger flow data, and optimizes the number of station berths from the perspective of balancing supply and demand at the site. Combined with the classical capacity calculation model, the paper first analyzes the important factors affecting the capacity of BRT stops using SPSS PRO and MATLAB, namely the distribution of bus arrivals at the stop and the distribution of stop (dwell) times. Secondly, the berth-number calculation of the classic Highway Capacity Manual (HCM) model is adjusted to the station's actual passenger demand, and a method applicable to the actual number of station berths is proposed. Taking Gangding Station on the Zhongshan Avenue BRT corridor in Guangzhou as an example, the proposed method yields 2 berths for each of sub-stations 1, 2, and 3, reducing the station's road space by 33.3% compared with the previous 3 berths per sub-station and returning that space to general traffic. Therefore, while the passenger demand of the BRT station is still met, road space is released to general vehicles, their capacity is improved, and the capacity and efficiency of the BRT corridor system are improved as a whole.
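The loading-area capacity formula underlying HCM/TCQSM-style berth calculations can be sketched as below; the effective-berth factors and all numeric inputs are illustrative values, not the calibration used for Gangding Station:

```python
def loading_area_capacity(g_over_c, clearance_s, dwell_s, z=1.28, cv=0.6):
    """Buses per hour for one loading area (berth), in the TCQSM/HCM style:
    3600*(g/C) / (t_c + (g/C)*t_d + Z*cv*t_d), where g/C is the green ratio,
    t_c the clearance time, t_d the mean dwell time, Z the failure-rate
    z-score (1.28 ~ 10% failure) and cv the dwell-time coefficient of
    variation. Values here are assumptions for illustration."""
    return 3600.0 * g_over_c / (clearance_s + g_over_c * dwell_s + z * cv * dwell_s)

def station_capacity(n_berths, g_over_c, clearance_s, dwell_s):
    """Scale by an effective-berth factor: extra berths add less than
    proportional capacity (illustrative factors, not the paper's)."""
    eff = {1: 1.00, 2: 1.85, 3: 2.45}[n_berths]
    return eff * loading_area_capacity(g_over_c, clearance_s, dwell_s)
```

The diminishing effective-berth factors are what make it possible, as in the case study, to drop a third berth with only a modest capacity loss when passenger demand permits.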

Keywords: urban transportation, bus rapid transit station, HCM model, capacity, number of berths

Procedia PDF Downloads 93
18826 Design and Development of Motorized Placer for Balloon Uterine Stents in Gynecology

Authors: Metehan Mutlu, Meltem Elitas

Abstract:

This study aims to provide an automated method for placing balloon uterine stents after hysteroscopic adhesiolysis. Currently, there are no automated tools to place the balloon uterine stent; surgeons fit it into the endometrial cavity manually. However, it is very hard to pass the balloon stent through the cervical canal, which is roughly 10 mm after the surgery. Our method provides an effective and practical way of placing the stent by automating the procedure through our designed device. Furthermore, our device performs the required tasks quickly compared to traditional methods, reduces narcosis time, and decreases the risk of bacterial contamination.

Keywords: balloon uterine stent, endometrial cavity, hysteroscopy, motorized-tool

Procedia PDF Downloads 275
18825 Revolutionizing RNA Extraction: A Unified, Sustainable, and Rapid Protocol for High-Quality Isolation from Diverse Tissues

Authors: Ying Qi Chan, Chunyu Li, Xu Rou Yoyo Ma, Yaya Li, Saber Khederzadeh

Abstract:

In the ever-evolving landscape of genome extraction protocols, existing methodologies grapple with issues ranging from sub-optimal yields and compromised quality to time-intensive procedures and reliance on hazardous reagents, often necessitating substantial tissue quantities. This is particularly challenging for scientists in developing countries, where resources are limited. Our investigation presents a protocol for the efficient extraction of high-yield RNA from various tissues, such as muscle, insect, and plant samples. Our protocol stands out as the safest, swiftest (completed in just 38 minutes), most cost-effective (at a mere US$0.017), and highly efficient method in comparison to existing protocols. Notably, it avoids hazardous or toxic chemicals such as chloroform and phenol, as well as enzymatic agents like RNase and Proteinase K. The protocol has demonstrated clear advantages over other methods, including commercial kits, in terms of yield. This nucleic acid extraction protocol is more environmentally and research-friendly, and it suits a range of tissues, even in tiny quantities, thereby facilitating genetic diagnostics and research across the globe.

Keywords: RNA extraction, rapid protocol, universal method, diverse tissues

Procedia PDF Downloads 70
18824 Development of Alpha Spectroscopy Method with Solid State Nuclear Track Detector Using Aluminium Thin Films

Authors: Nidal Dwaikat

Abstract:

This work presents the development of an alpha spectroscopy method with solid-state nuclear track detectors using aluminum thin films. The resolution of this method is high, and it can discriminate between alpha particles of different incident energies. It can measure the exact number of alpha particles at a specific energy without needing a calibration of alpha track diameter versus alpha energy. The method was tested using a Cf-252 standard alpha source at energies of 5.11 MeV, 3.86 MeV, and 2.7 MeV, produced by varying the detector-to-source distance. On the front side, two detectors were covered with aluminum thin films, and the third detector was kept uncovered. The thicknesses of the aluminum films were selected carefully (using SRIM 2013) such that one film blocks the two lower-energy alpha particles (3.86 MeV and 2.7 MeV) while alphas at the higher energy (5.11 MeV) penetrate it and reach the detector surface. The second film blocks only the 2.7 MeV alphas and allows those at the two higher energies (5.11 MeV and 3.86 MeV) to penetrate and produce tracks. The uncovered detector records tracks from all three energies. For quality assurance and accuracy, the detectors were mounted on copper substrates thick enough to block exposure from the back side. The tracks on the first detector are due to 5.11 MeV alpha particles. The difference between the number of tracks on the second detector and on the first is due to 3.86 MeV alphas. Finally, subtracting the number of tracks on the second detector from that on the third (uncovered) detector gives the number of tracks due to 2.7 MeV alphas. Once the efficiency calibration factor is known, the activity of the standard source can be calculated exactly.
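The three-detector subtraction described above reduces to simple arithmetic once the track counts are read out. A sketch (the counts, efficiency, and exposure time below are made-up numbers, not measured data):

```python
def counts_per_energy(n_uncovered, n_film1, n_film2):
    """Decompose track counts from the three-detector stack into per-energy
    counts: the detector under the thicker film records only 5.11 MeV alphas
    (n_film1); the thinner film passes 5.11 and 3.86 MeV (n_film2); the bare
    detector records all three energies (n_uncovered)."""
    return {
        "5.11 MeV": n_film1,
        "3.86 MeV": n_film2 - n_film1,
        "2.70 MeV": n_uncovered - n_film2,
    }

def source_activity(total_tracks, efficiency, exposure_s):
    """Activity (Bq) from the total track count, given the calibrated
    detection efficiency and the exposure time."""
    return total_tracks / (efficiency * exposure_s)
```

In practice each count would carry Poisson uncertainty, so the subtracted quantities inherit combined counting errors; the sketch shows only the bookkeeping.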

Keywords: aluminium thin film, alpha particles, copper substrate, CR-39 detector

Procedia PDF Downloads 362
18823 Urdu Text Extraction Method from Images

Authors: Samabia Tehsin, Sumaira Kausar

Abstract:

Due to the vast increase in multimedia data in recent years, efficient and robust techniques are needed to retrieve and index images and videos. Text embedded in images can serve as a strong retrieval cue, which is why text extraction is a research area receiving increasing attention. English text extraction has been the focus of many researchers, but very little work has been done on other languages such as Urdu. This paper focuses on Urdu text extraction from video frames. It presents a text detection feature set that can deal with most of the problems connected with the text extraction process. To test the validity of the method, it is evaluated on an Urdu news dataset, with promising results.
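One classic ingredient of such a text-detection feature set is the horizontal projection profile, which locates candidate text bands as runs of image rows containing ink. The minimal sketch below is a generic stand-in for the paper's (unspecified) feature set, operating on a toy binarised image:

```python
def text_line_bands(binary_rows, min_ink=1):
    """Find candidate text-line bands (start_row, end_row) in a binarised
    image via its horizontal projection profile: the ink-pixel count per row."""
    profile = [sum(row) for row in binary_rows]
    bands, start = [], None
    for i, ink in enumerate(profile):
        if ink >= min_ink and start is None:
            start = i                      # entering a text band
        elif ink < min_ink and start is not None:
            bands.append((start, i - 1))   # leaving a text band
            start = None
    if start is not None:
        bands.append((start, len(profile) - 1))
    return bands

# Two "text lines" (rows containing ink) separated by a blank gap.
img = [
    [0, 0, 0, 0],
    [1, 1, 0, 1],
    [0, 1, 1, 0],
    [0, 0, 0, 0],
    [1, 0, 1, 1],
]
bands = text_line_bands(img)
```

For caption text in video frames, such bands would then be scored with further features (edge density, temporal persistence, stroke statistics) before OCR.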

Keywords: caption text, content-based image retrieval, document analysis, text extraction

Procedia PDF Downloads 509
18822 Compensation of Cable Attenuation in Step Current Generators to Enable the Convolution Method for Calibration of Current Transducers

Authors: P. Treyer, M. Kujda, H. Urs

Abstract:

The purpose of this paper is to digitally compensate for the apparent discharge time constant of the coaxial cable so that the current step response is flat and can be used to calibrate current transducers with the convolution method. For proper use of convolution, the step-response record length must be at least as long as the waveform duration to be evaluated. The current step generator based on cable discharge is compared to the Blumlein generator. Moreover, the influence of each component of the system on the performance of the step is described, which allows building an appropriate measurement set-up. Finally, the calibration of current viewing resistors dedicated to high-current impulses is computed.
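The compensation itself can be sketched as a one-pole digital correction: if the recorded step droops as exp(-t/tau), adding back a scaled running sum of the record restores a flat step exactly for a single time constant. The time constant and sampling rate below are assumed values for illustration, not the paper's cable parameters:

```python
import math

def compensate_droop(y, fs_hz, tau_s):
    """Digitally flatten the exponential droop of a cable-discharge step.
    A recorded step y[n] = exp(-n/(fs*tau)) is restored to a flat step via
    x[n] = y[n] + a * sum(y[0..n-1]) with a = 1 - exp(-1/(fs*tau)),
    the exact discrete inverse of a single-pole droop."""
    a = 1.0 - math.exp(-1.0 / (fs_hz * tau_s))
    x, acc = [], 0.0
    for sample in y:
        x.append(sample + a * acc)
        acc += sample
    return x

# Synthetic drooping step: tau = 1 ms, sampled at 1 MHz.
fs, tau = 1e6, 1e-3
y = [math.exp(-n / (fs * tau)) for n in range(5000)]
flat = compensate_droop(y, fs, tau)
```

With the step flattened this way, the record can serve directly as the step response convolved with the waveform under evaluation.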

Keywords: Blumlein generator, cable attenuation, convolution, current step generator

Procedia PDF Downloads 145