Search results for: computer based instruction
21255 IoT Based Agriculture Monitoring Framework for Sustainable Rice Production
Authors: Armanul Hoque Shaon, Md Baizid Mahmud, Askander Nobi, Md. Raju Ahmed, Md. Jiabul Hoque
Abstract:
In the Internet of Things (IoT), devices are linked to the internet through a wireless network, allowing them to collect and transmit data without the need for a human operator. Agriculture relies heavily on wireless sensors, which are a vital component of the Internet of Things (IoT). This kind of wireless sensor network monitors physical or environmental variables like temperature, sound, vibration, pressure, or motion without relying on a central location or sink and collaboratively passes its data across the network to be analyzed. As the primary source of plant nutrients, the soil is critical to the agricultural industry's continued growth. We therefore propose an Internet of Things (IoT) based solution. To organize the network, the sensor nodes report groundwater levels to the sink node, which sends them to the Gateway for centralization. The sink node gathers soil moisture data and transmits the mean to the Gateway, which then forwards it to the website for dissemination. The web server is in charge of storing and presenting the soil moisture data to the web application's users. Soil characteristics may be collected using the networked method that we developed to improve rice production. Paddy land is running out as the population of our nation grows. The success of this project will depend on the appropriate use of the existing land base.
Keywords: IoT based agriculture monitoring, intelligent irrigation, communicating network, rice production
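As a rough illustration of the data path described above, the following minimal Python sketch shows a sink node averaging soil moisture readings and pushing the mean to a gateway endpoint; the endpoint URL, payload fields, and reading source are hypothetical stand-ins, not taken from the paper.

import statistics
import requests  # third-party: pip install requests

GATEWAY_URL = "http://gateway.local/api/soil-moisture"  # hypothetical endpoint

def read_soil_moisture_sensors():
    # Placeholder for real sensor reads (e.g., ADC values from soil probes).
    return [41.2, 39.8, 43.5, 40.1]

def sink_node_cycle():
    readings = read_soil_moisture_sensors()
    mean_moisture = statistics.mean(readings)
    # The sink forwards only the mean to the Gateway, which relays it
    # to the web server for storage and display.
    requests.post(GATEWAY_URL, json={"mean_moisture": mean_moisture}, timeout=5)

sink_node_cycle()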
Procedia PDF Downloads 157
21254 Estimation of Energy Efficiency of Blue Hydrogen Production Onboard of Ships
Authors: Li Chin Law, Epaminondas Mastorakos, Mohd Roslee Othman, Antonis Trakakis
Abstract:
The paper introduces an alternative concept of carbon capture for shipping using pre-combustion carbon capture technology (Pre-CCS), which was proven to be less energy intensive than post-combustion carbon capture from the engine exhaust. An energy assessment of amine-based post-combustion CCS on LNG-fuelled ships showed that the energy efficiency of CCS ships fell from 48% to 36.6%. An energy assessment was then carried out to compare the power and heat requirements of the most used hydrogen production methods and carbon capture technologies. The steam methane reformer (SMR) was found to be 20% more energy efficient and achieved a higher methane conversion than autothermal reforming and methane decomposition. Next, the pressure swing adsorber (PSA) showed a lower energy requirement than membrane separation, cryogenic separation, and amine absorption in pre-combustion carbon capture. Hence, an integrated system combining SMR and PSA (SMR-PSA) with waste heat recovery (WHR) was proposed. This optimized SMR-based integrated system achieved a 65% CO₂ reduction with an energy penalty of less than 7 percentage points (41.7% energy efficiency). Further integration of post-combustion CCS with the SMR-PSA integrated system improved the carbon capture rate to 86.3% with a 9-percentage-point energy penalty (39% energy efficiency). The proposed system was shown to be able to meet the carbon reduction targets set by the International Maritime Organization (IMO) with certain energy penalties.
Keywords: shipping, decarbonisation, alternative fuels, low carbon, hydrogen, carbon capture
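The energy-penalty figures above follow directly from the efficiency differences; stated as a worked equation (in LaTeX), with the baseline taken as the 48% efficiency of the reference LNG-fuelled ship implied by the abstract:

\Delta\eta_{penalty} = \eta_{baseline} - \eta_{system}

\Delta\eta_{SMR\text{-}PSA} = 48\% - 41.7\% = 6.3\ \text{pp} \;(<7\ \text{pp}), \qquad \Delta\eta_{SMR\text{-}PSA+CCS} = 48\% - 39\% = 9\ \text{pp}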
Procedia PDF Downloads 84
21253 Food Design as a University-Industry Collaboration Project: An Experience Design on Controlling Chocolate Consumption and Long-Term Eating Behavior
Authors: Büşra Durmaz, Füsun Curaoğlu
Abstract:
While technology-oriented developments in the modern world change our perceptions of time and speed, they also strain our food consumption patterns, such as taking pleasure in what we eat and eating slowly. The habit of eating quickly and hastily leads not only to failing to appreciate the taste of the food but also to an inability to register satiety in time and, therefore, to many health problems. In this context, especially in the last ten years, food manufacturers aiming at healthy living and consumption have been collaborating with industrial designers on food design. The consumers of the new century, living under uncontrolled time pressure, turn to small snacks as a source of happiness and pleasure in the little time intervals they can spare. Chocolate in particular has been a source of both happiness and pleasure for its consumers for hundreds of years. However, when the portions eaten cannot be controlled, a pleasure food such as chocolate can cause both health problems and many emotional problems, especially feelings of guilt. Fast food, that is, food prepared and consumed quickly, has likewise been spreading rapidly around the world in recent years. This study covers the process and results of a chocolate design based on user experience in a university-industry cooperation project carried out within the scope of Eskişehir Technical University graduation projects. The aim of the project is a creative product design that enables the user to experience chocolate consumption with a healthy eating approach. For this, concepts such as pleasure, satiety, and taste are discussed, and a case study based on the qualitative research paradigm was structured around several research processes: a literature review covering topics such as mouth anatomy, tongue structure, taste, the functions of eating in the brain, hormones, and chocolate; a survey with 151 people; semi-structured face-to-face interviews with 7 people during the user-oriented experience design process; video analysis; and project diaries. The research found that slow melting in the mouth is the experience users prefer for spreading pleasure-based chocolate eating over time while keeping portions healthy. In this context, the study includes research on the production of sketches, mock-ups and prototypes of the product. As a result, a product packaging design was developed that engages the senses where consumption begins (sight, smell and hearing), encourages consuming the chocolate by melting, and actively stimulates the salivary glands, in order to support healthy, long-term, pleasure-based consumption.
Keywords: chocolate, eating habit, pleasure, saturation, sense of taste
Procedia PDF Downloads 85
21252 Organic Light Emitting Devices Based on Low Symmetry Coordination Structured Lanthanide Complexes
Authors: Zubair Ahmed, Andrea Barbieri
Abstract:
The need to reduce energy consumption has prompted a considerable research effort to develop alternative energy-efficient lighting systems to replace conventional light sources (i.e., incandescent and fluorescent lamps). Organic light emitting device (OLED) technology offers the distinctive possibility of fabricating large-area flat devices by vacuum or solution processing. Lanthanide β-diketonate complexes, due to the unique photophysical properties of Ln(III) ions, have been explored as emitting layers in OLED displays and in solid-state lighting (SSL) in order to achieve high efficiency and color purity. For such applications, excellent photoluminescence quantum yield (PLQY) and stability are the two key points, which can be achieved simply by selecting the proper organic ligands around the Ln ion in the coordination sphere. Regarding strategies to enhance the PLQY, the most common is the suppression of radiationless deactivation pathways due to the presence of high-frequency oscillators (e.g., O–H, C–H groups) around the Ln centre. Recently, a different approach to maximize the PLQY of Ln(β-DKs) has been proposed (named 'Escalate Coordination Anisotropy', ECA). It is based on the assumption that coordinating the Ln ion with different ligands will break the centrosymmetry of the molecule, leading to less forbidden transitions (loosening the constraints of the Laporte rule). OLEDs based on such complexes are available, but with low efficiency and stability. In order to get efficient devices, there is a need to develop new Ln complexes with enhanced PLQYs and stabilities. For this purpose, Ln complexes, both visible and NIR emitting, of various coordination structures based on fluorinated/non-fluorinated β-diketones and O/N-donor neutral ligands were synthesized using a one-step in situ method. In this method, the β-diketones, base, LnCl₃·nH₂O and neutral ligands were mixed in a 3:3:1:1 molar ratio in ethanol, which gave air- and moisture-stable complexes. Further, they were characterized by means of elemental analysis, NMR spectroscopy and single-crystal X-ray diffraction. Thereafter, their photophysical properties were studied to select the best complexes for the fabrication of stable and efficient OLEDs. Finally, the OLEDs were fabricated and investigated using these complexes as emitting layers along with other organic layers: NPB, N,N′-di(1-naphthyl)-N,N′-diphenyl-(1,1′-biphenyl)-4,4′-diamine (hole-transporting layer); BCP, 2,9-dimethyl-4,7-diphenyl-1,10-phenanthroline (hole-blocker); and Alq₃ (electron-transporting layer). The layers were sequentially deposited under a high-vacuum environment by thermal evaporation onto ITO glass substrates. Moreover, co-deposition techniques were used to improve charge transport in the devices and to avoid quenching phenomena. The devices show strong electroluminescence at 612, 998, 1064 and 1534 nm, corresponding to ⁵D₀ → ⁷F₂ (Eu), ²F₅/₂ → ²F₇/₂ (Yb), ⁴F₃/₂ → ⁴I₉/₂ (Nd) and ⁴I₁₃/₂ → ⁴I₁₅/₂ (Er). All the devices fabricated show good efficiency as well as stability.
Keywords: electroluminescence, lanthanides, paramagnetic NMR, photoluminescence
Procedia PDF Downloads 125
21251 Improving Fingerprinting-Based Localization (FPL) System Using Generative Artificial Intelligence (GAI)
Authors: Getaneh Berie Tarekegn, Li-Chia Tai
Abstract:
With the rapid advancement of artificial intelligence, low-power built-in sensors on Internet of Things devices, and communication technologies, location-aware services have become increasingly popular and have permeated every aspect of people’s lives. Global navigation satellite systems (GNSSs) are the default method of providing continuous positioning services for ground and aerial vehicles, as well as consumer devices (smartphones, watches, notepads, etc.). However, the environment affects satellite positioning systems, particularly indoors, in dense urban and suburban cities enclosed by skyscrapers, or when deep shadows obscure satellite signals. This is because (1) indoor environments are more complicated due to the presence of many surrounding objects; (2) reflection within a building is highly dependent on the surrounding environment, including the positions of objects and human activity; and (3) satellite signals cannot reach an indoor environment, as GNSS signals do not have enough power to penetrate building walls. GPS is also highly power-hungry, which poses a severe challenge for battery-powered IoT devices. Due to these challenges, IoT applications are limited. Consequently, precise, seamless, and ubiquitous Positioning, Navigation and Timing (PNT) systems are crucial for many artificial intelligence Internet of Things (AI-IoT) applications in the era of smart cities. Their applications include traffic monitoring, emergency alarming, environmental monitoring, location-based advertising, intelligent transportation, and smart health care. This paper proposes a generative AI-based positioning scheme for large-scale wireless settings using fingerprinting techniques. In this article, we present a novel semi-supervised deep convolutional generative adversarial network (S-DCGAN)-based radio map construction method for real-time device localization. We also employ a reliable signal fingerprint feature extraction method with t-distributed stochastic neighbor embedding (t-SNE), which extracts dominant features while eliminating noise from hybrid WLAN and long-term evolution (LTE) fingerprints. The proposed scheme reduced the workload of site surveying required to build the fingerprint database by up to 78.5% and significantly improved positioning accuracy. The results show that the average positioning error of GAILoc is less than 0.39 m, and more than 90% of the errors are less than 0.82 m. According to the numerical results, GAILoc improves positioning performance and reduces radio map construction costs significantly compared to traditional methods.
Keywords: location-aware services, feature extraction technique, generative adversarial network, long short-term memory, support vector machine
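To illustrate the feature-extraction step named above, here is a minimal Python sketch of applying t-SNE to a matrix of received-signal-strength fingerprints; the matrix shape, value ranges, and parameter choices are illustrative assumptions, not the paper's configuration.

import numpy as np
from sklearn.manifold import TSNE

rng = np.random.default_rng(0)
# Hypothetical RSSI fingerprints: one row per surveyed reference point,
# one column per WLAN/LTE signal source, values in dBm.
fingerprints = rng.uniform(-100.0, -30.0, size=(500, 40))

# t-SNE compresses the noisy, high-dimensional fingerprints into a
# low-dimensional embedding that preserves local neighbourhood structure.
tsne = TSNE(n_components=2, perplexity=30.0, random_state=0)
features = tsne.fit_transform(fingerprints)
print(features.shape)  # (500, 2)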
Procedia PDF Downloads 56
21250 Controlled Release of Glucosamine from Pluronic-Based Hydrogels for the Treatment of Osteoarthritis
Authors: Papon Thamvasupong, Kwanchanok Viravaidya-Pasuwat
Abstract:
Osteoarthritis affects many people worldwide. Local injection of glucosamine is one of the alternative treatment methods to replenish the natural lubrication of cartilage. However, repeated injections can potentially lead to bacterial infection. Therefore, a drug delivery system is desired to reduce the frequency of injections. A hydrogel is one of the delivery systems that can control the release of drugs. Thermo-reversible hydrogels can be particularly beneficial for the local injection route because the formulation changes from liquid to gel after entering the human body. Once the gel is in the body, it slowly releases the drug in a controlled manner. In this study, various formulations of Pluronic-based hydrogels were synthesized for the controlled release of glucosamine. One of the challenges of the Pluronic controlled release system is its fast dissolution rate. To overcome this problem, alginate and calcium sulfate (CaSO₄) were added to the polymer solution. The characteristics of the hydrogels were investigated, including the gelation temperature, gelation time, hydrogel dissolution and glucosamine release mechanism. Finally, a mathematical model of glucosamine release from the Pluronic-alginate-hyaluronic acid hydrogel was developed. Our results have shown that crosslinking the Pluronic gel with alginate did not significantly extend the dissolution time of the gel. Moreover, the gel dissolution profiles and the glucosamine release mechanisms were best described by the zeroth-order kinetic model, indicating that the release of glucosamine was primarily governed by the gel dissolution.
Keywords: controlled release, drug delivery system, glucosamine, pluronic, thermoreversible hydrogel
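For reference, the zeroth-order kinetic model invoked here takes the following standard form (a textbook relation, not a formula quoted from the paper):

\frac{dQ_t}{dt} = k_0 \quad\Longrightarrow\quad Q_t = Q_0 + k_0 t

where Q_t is the cumulative amount of glucosamine released at time t, Q_0 is the initial (burst) release, and k_0 is the zeroth-order rate constant. A release profile that is linear in t, independent of the remaining drug concentration, is the signature of dissolution-controlled release.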
Procedia PDF Downloads 277
21249 Characteristic of Taro (Colocasia esculenta), Seaweed (Gracilaria Sp.), and Fishes Bone Collagens Flour Based Analog Rice
Authors: Y. S. Darmanto, P. H. Riyadi, S. Susanti
Abstract:
Recently, approximately 9.1 million of Indonesia's 237.56 million people suffer from diabetes. This condition is driven partly by the high rice consumption of most Indonesian people. It is known that rice is low in amylose, high in calories, and possesses hyperglycemic properties. Through this study, we tried to address that problem by creating a super food that provides an alternative healthy and balanced diet. We formulated taro and seaweed flour based analog rice fortified with various fish bone collagens. Corms of taro contain easily digestible starch, and seaweed is rich in fiber, vitamins, and minerals. That mixture was fortified with collagen containing distinctive amino acids such as glycine, lysine, alanine, arginine, proline, and hydroxyproline. Subsequently, the super analog rice was characterized for its nutritional composition by proximate analyses, including water, dietary fiber and amylose content. Furthermore, its morphological structure was analyzed using scanning electron microscopy, while the level of consumer preference was assessed by hedonic test. Results demonstrated that fortifying analog rice with various fish bone collagens produced significant differences in nutritional composition and morphological structure as well as in preference. Thus, this study is expected to open a new avenue in functional food discovery, especially for the treatment and prevention of diabetic diseases.
Keywords: analogue rice, taro, seaweed, collagen
Procedia PDF Downloads 268
21248 Artificial Intelligence for Generative Modelling
Authors: Shryas Bhurat, Aryan Vashistha, Sampreet Dinakar Nayak, Ayush Gupta
Abstract:
As technology advances toward high computational resources, there is a paradigm shift in the usage of these resources to optimize the design process. This paper discusses the usage of 'Generative Design using Artificial Intelligence' to build better models that apply operations like selection, mutation, and crossover to generate results. The human mind thinks of the simplest approach while designing an object, but the intelligence learns from the past and designs complex, optimized CAD models. Generative design takes the boundary conditions and comes up with multiple solutions through iterations, converging on a sturdy design with the most optimal parameters and saving huge amounts of time and resources. The new production techniques at our disposal allow us to use additive manufacturing, 3D printing, and other innovative manufacturing techniques to save resources and design artistically engineered CAD models. This paper also discusses the genetic algorithm and the non-domination technique for choosing the right results, drawing on biomimicry of processes that have evolved over millions of years. The computer uses parametric models to generate newer models in an iterative approach and uses cloud computing to store these iterative designs. The later part of the paper compares the topology optimization technology previously used to generate CAD models with generative design. Finally, this paper shows the performance of the algorithms and how they help in designing resource-efficient models.
Keywords: genetic algorithm, bio mimicry, generative modeling, non-dominant techniques
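To make the selection/mutation/crossover loop concrete, here is a minimal, self-contained genetic-algorithm sketch in Python; the fitness function, encoding, and parameter values are illustrative stand-ins, not the paper's implementation.

import random

def genetic_algorithm(fitness, dim, pop_size=30, generations=100,
                      mutation_rate=0.1, lower=-1.0, upper=1.0):
    # Random initial population of candidate designs (real-valued vectors).
    pop = [[random.uniform(lower, upper) for _ in range(dim)]
           for _ in range(pop_size)]
    for _ in range(generations):
        # Selection: keep the fitter half (lower fitness = better here).
        pop.sort(key=fitness)
        parents = pop[: pop_size // 2]
        children = []
        while len(children) < pop_size - len(parents):
            a, b = random.sample(parents, 2)
            # Crossover: single cut point between two parents.
            cut = random.randrange(1, dim) if dim > 1 else 1
            child = a[:cut] + b[cut:]
            # Mutation: small random perturbation of some genes.
            child = [g + random.gauss(0, 0.1)
                     if random.random() < mutation_rate else g
                     for g in child]
            children.append(child)
        pop = parents + children
    return min(pop, key=fitness)

# Toy objective: minimize distance to a target parameter vector.
best = genetic_algorithm(lambda x: sum((g - 0.5) ** 2 for g in x), dim=4)
print(best)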
Procedia PDF Downloads 154
21247 Statistical Time-Series and Neural Architecture of Malaria Patients Records in Lagos, Nigeria
Authors: Akinbo Razak Yinka, Adesanya Kehinde Kazeem, Oladokun Oluwagbenga Peter
Abstract:
Time series data are sequences of observations collected over a period of time. Such data can be used to predict health outcomes, such as disease progression, mortality, hospitalization, etc. The statistical approach is based on mathematical models that capture the patterns and trends of the data, such as autocorrelation, seasonality, and noise, while neural methods are based on artificial neural networks, which are computational models that mimic the structure and function of biological neurons. This paper compared parametric and non-parametric time series models of patients treated for malaria in Maternal and Child Health Centres in Lagos State, Nigeria. The forecast methods considered were linear regression, integrated moving average, ARIMA and SARIMA modeling for the parametric approach, while the multilayer perceptron (MLP) and long short-term memory (LSTM) networks were used for the non-parametric models. The performance of each method was evaluated using the mean absolute error (MAE), R-squared (R²) and root mean square error (RMSE) as criteria to determine the accuracy of each model. The study revealed that the best performance in terms of error was found in the MLP, followed by the LSTM and ARIMA models. In addition, the bootstrap aggregating technique was used to make robust forecasts when there are uncertainties in the data.
Keywords: ARIMA, bootstrap aggregation, MLP, LSTM, SARIMA, time-series analysis
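The evaluation criteria named above are straightforward to compute; the short Python sketch below scores two forecasting models on a toy series (the case counts and predictions are invented for illustration, not the study's data):

import numpy as np
from sklearn.metrics import mean_absolute_error, mean_squared_error, r2_score

y_true = np.array([120.0, 135.0, 128.0, 150.0, 142.0])  # observed case counts
forecasts = {
    "ARIMA": np.array([118.0, 130.0, 131.0, 146.0, 150.0]),
    "MLP":   np.array([121.0, 134.0, 127.0, 149.0, 143.0]),
}
for name, y_pred in forecasts.items():
    mae = mean_absolute_error(y_true, y_pred)
    rmse = np.sqrt(mean_squared_error(y_true, y_pred))  # root of the MSE
    r2 = r2_score(y_true, y_pred)
    print(f"{name}: MAE={mae:.2f}  RMSE={rmse:.2f}  R2={r2:.3f}")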
Procedia PDF Downloads 83
21246 Cross Section Measurement for Formation of Metastable State of ¹¹¹ᵐCd through ¹¹¹Cd (γ, γ′) ¹¹¹ᵐCd Reaction Induced by Bremsstrahlung Generated through 6 MeV Electrons
Authors: Vishal D. Bharud, B. J. Patil, S. S. Dahiwale, V. N. Bhoraskar, S. D. Dhole
Abstract:
The photon-induced average reaction cross section of the ¹¹¹Cd (γ, γ′) ¹¹¹ᵐCd reaction was experimentally determined for a bremsstrahlung energy spectrum of 6 MeV by utilizing activation and offline γ-ray spectrometric techniques. The 6 MeV Racetrack Microtron electron accelerator of Savitribai Phule Pune University, Pune, was used for the experimental work. The bremsstrahlung spectrum generated by bombarding 6 MeV electrons on a lead target was theoretically estimated with the FLUKA code. Bremsstrahlung radiation can have energies exceeding the threshold of particle emission, which is normally above 6 MeV. Photons of energies below the particle emission threshold undergo absorption into discrete energy levels, with the possibility of exciting nuclei to excited states, including metastable states. The ¹¹¹Cd (γ, γ′) ¹¹¹ᵐCd reaction cross sections were calculated at different bombarding photon energies using the TALYS 1.8 computer code with default parameters. The focus of the present work was to study the (γ, γ′) reaction exciting ¹¹¹Cd nuclei to metastable states, which have threshold energies below 3 MeV. The flux-weighted average cross section was obtained from the theoretical values of TALYS 1.8 and TENDL 2017 and was found to be in good agreement with the present experimental cross section.
Keywords: bremsstrahlung, cross section, FLUKA, TALYS-1.8
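The flux-weighted average cross section referred to here is conventionally defined as follows (the standard definition, with the FLUKA-estimated photon flux as the weight; not a formula quoted from the paper):

\langle\sigma\rangle = \frac{\int_{E_{th}}^{E_{max}} \sigma(E)\,\phi(E)\,dE}{\int_{E_{th}}^{E_{max}} \phi(E)\,dE}

where σ(E) is the TALYS/TENDL cross section at photon energy E, φ(E) is the bremsstrahlung flux, E_th is the excitation threshold, and E_max = 6 MeV is the endpoint energy.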
Procedia PDF Downloads 175
21245 Test Suite Optimization Using an Effective Meta-Heuristic BAT Algorithm
Authors: Anuradha Chug, Sunali Gandhi
Abstract:
Regression testing is a very expensive and time-consuming process carried out to ensure the validity of modified software. Because there are insufficient resources to re-execute all the test cases in a time-constrained environment, efforts are ongoing to generate test data automatically without human effort. Many search-based techniques have been proposed to generate efficient, effective and optimized test data, so that the overall cost of software testing can be minimized. The generated test data should be able to uncover all potential lapses that exist in the software or product. Inspired by the natural food-searching behavior of bats, the current study employed a meta-heuristic, search-based bat algorithm for optimizing the test data on the basis of certain parameters without compromising their effectiveness. Mathematical functions are also applied that can effectively filter out the redundant test data. As many as 50 Java programs were used to check the effectiveness of the proposed test data generation, and it was found that an 86% saving in testing effort can be achieved using the bat algorithm while covering 100% of the software code for testing. The bat algorithm was found to be more efficient in terms of simplicity and flexibility when the results were compared with other nature-inspired algorithms such as the Firefly Algorithm (FA), Hill Climbing (HC) and Ant Colony Optimization (ACO). The output of this study would be useful to testers, as they can achieve 100% path coverage for testing with a minimum number of test cases.
Keywords: regression testing, test case selection, test case prioritization, genetic algorithm, bat algorithm
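For readers unfamiliar with the bat algorithm, the following compact Python sketch implements Yang's standard formulation (frequency-tuned velocities, decaying loudness, rising pulse rate) on a toy objective; the paper's actual fitness function, which rewards path coverage, is not reproduced here.

import math
import random

def bat_algorithm(objective, dim, n_bats=20, n_iter=100,
                  f_min=0.0, f_max=2.0, alpha=0.9, gamma=0.9,
                  lower=-5.0, upper=5.0):
    clamp = lambda v: min(max(v, lower), upper)
    pos = [[random.uniform(lower, upper) for _ in range(dim)] for _ in range(n_bats)]
    vel = [[0.0] * dim for _ in range(n_bats)]
    loud = [1.0] * n_bats      # loudness A_i
    rate = [0.5] * n_bats      # pulse emission rate r_i
    best = min(pos, key=objective)[:]
    for t in range(1, n_iter + 1):
        for i in range(n_bats):
            # Frequency tuning pulls each bat toward the current best.
            freq = f_min + (f_max - f_min) * random.random()
            for d in range(dim):
                vel[i][d] += (pos[i][d] - best[d]) * freq
                pos[i][d] = clamp(pos[i][d] + vel[i][d])
            cand = pos[i][:]
            if random.random() > rate[i]:
                # Local random walk around the best solution found so far.
                avg_loud = sum(loud) / n_bats
                cand = [clamp(b + 0.01 * random.gauss(0, 1) * avg_loud) for b in best]
            if random.random() < loud[i] and objective(cand) < objective(pos[i]):
                pos[i] = cand
                loud[i] *= alpha                            # loudness decreases
                rate[i] = 0.5 * (1 - math.exp(-gamma * t))  # pulse rate increases
            if objective(pos[i]) < objective(best):
                best = pos[i][:]
    return best

# Toy objective standing in for a test-data fitness measure.
print(bat_algorithm(lambda x: sum(v * v for v in x), dim=3))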
Procedia PDF Downloads 384
21244 Characterization of InGaAsP/InP Quantum Well Lasers
Authors: K. Melouk, M. Dellakrachaï
Abstract:
An analytical formula for the optical gain based on a simple parabolic-band model, introducing theoretical expressions for the quantized energies, is presented. The model used in this treatment takes into account the effects of intraband relaxation. It is shown, as a result, that the gain for the TE mode is larger than that for the TM mode and that the presence of acceptor impurities increases the peak gain.
Keywords: InGaAsP, laser, quantum well, semiconductor
Procedia PDF Downloads 376
21243 The Automatisation of Dictionary-Based Annotation in a Parallel Corpus of Old English
Authors: Ana Elvira Ojanguren Lopez, Javier Martin Arista
Abstract:
The aims of this paper are to present the automatisation procedure adopted in the implementation of a parallel corpus of Old English and to assess the progress of automatisation with respect to tagging, annotation, and lemmatisation. The corpus consists of an aligned parallel text with word-for-word comparison Old English-English that provides the Old English segment with inflectional form tagging (gloss, lemma, category, and inflection) and lemma annotation (spelling, meaning, inflectional class, paradigm, word-formation and secondary sources). This parallel corpus is intended to fill a gap in the field of Old English, in which no parallel and/or lemmatised corpora are available, while the average amount of corpus annotation is low. With this background, this presentation has two main parts. The first part, which focuses on tagging and annotation, selects the layouts and fields of lexical databases that are relevant for these tasks. Most of the information used for the annotation of the corpus can be retrieved from the lexical and morphological database Nerthus and the database of secondary sources Freya. These are the sources of linguistic and metalinguistic information that will be used for the annotation of the lemmas of the corpus, including morphological and semantic aspects as well as the references to the secondary sources that deal with the lemmas in question. Although substantially adapted and re-interpreted, the lemmatised part of these databases draws on the standard dictionaries of Old English, including The Student's Dictionary of Anglo-Saxon, An Anglo-Saxon Dictionary, and A Concise Anglo-Saxon Dictionary. The second part of this paper deals with lemmatisation. It presents the lemmatiser Norna, which has been implemented in FileMaker software. It is based on a concordance and an index to the Dictionary of Old English Corpus, which comprises around three thousand texts and three million words. In its present state, the lemmatiser Norna can assign a lemma to around 80% of textual forms on an automatic basis, by searching the index and the concordance for prefixes, stems and inflectional endings. The conclusions of this presentation insist on the limits of the automatisation of dictionary-based annotation in a parallel corpus. While tagging and annotation are largely automatic even at the present stage, the automatisation of alignment is pending for future research. Lemmatisation and morphological tagging are expected to be fully automatic in the near future, once the database of secondary sources Freya and the lemmatiser Norna have been completed.
Keywords: corpus linguistics, historical linguistics, old English, parallel corpus
Procedia PDF Downloads 214
21242 Applying Artificial Neural Networks to Predict Speed Skater Impact Concussion Risk
Authors: Yilin Liao, Hewen Li, Paula McConvey
Abstract:
Speed skaters often face a risk of concussion when they fall on the ice and impact crash mats during practices and competitive races. Several variables, including those related to the skater, the crash mat, and the impact position (body side/head/feet impact), are believed to influence the severity of the skater's concussion. While computer simulation modeling can be employed to analyze these accidents, the simulation process is time-consuming and does not provide rapid information for coaches and teams to assess the skater's injury risk in competitive events. This research paper explores the feasibility of using AI techniques to evaluate a skater's potential concussion severity and develops a fast concussion prediction tool using artificial neural networks to reduce the risk of treatment delays for injured skaters. The primary data are collected through virtual tests and physical experiments designed to simulate skater-mat impact. The data are then analyzed to identify patterns and correlations; finally, they are used to train and fine-tune the artificial neural networks for accurate prediction. The development of the prediction tool by employing machine learning strategies contributes to the application of AI methods in sports science and has theoretical implications for using AI techniques in predicting and preventing sports-related injuries.
Keywords: artificial neural networks, concussion, machine learning, impact, speed skater
Procedia PDF Downloads 116
21241 Prediction of Marijuana Use among Iranian Early Youth: an Application of Integrative Model of Behavioral Prediction
Authors: Mehdi Mirzaei Alavijeh, Farzad Jalilian
Abstract:
Background: Marijuana is the most widely used illicit drug worldwide, especially among adolescents and young adults, and can cause numerous complications. The aim of this study was to determine the pattern and motivation of use and the factors related to marijuana use among Iranian youths based on the integrative model of behavioral prediction. Methods: A cross-sectional study was conducted among 174 youth marijuana users in Kermanshah County and Isfahan County during summer 2014, selected by convenience sampling for participation in this study. A self-report questionnaire was applied for collecting data. Data were analyzed with SPSS version 21 using bivariate correlations and linear regression statistical tests. Results: The mean marijuana use of respondents was 4.60 times per week [95% CI: 4.06, 5.15]. Linear regression showed that the structures of the integrative model of behavioral prediction accounted for 36% of the variation in the outcome measure of weekly marijuana use (R² = 36%, P < 0.001); among them, attitude, marijuana refusal self-efficacy, and subjective norms were the stronger predictors. Conclusion: Comprehensive health education and prevention programs need to emphasize the cognitive factors that predict youths' health-related behaviors. Based on our findings, it seems that designing educational and behavioral interventions to reduce positive beliefs about marijuana, promote marijuana refusal self-efficacy, and reduce subjective norms encouraging marijuana use has effective potential to protect youths from marijuana use.
Keywords: marijuana, youth, integrative model of behavioral prediction, Iran
Procedia PDF Downloads 557
21240 Structure-Based Virtual Screening and in Silico Toxicity Test of Compounds against Mycobacterium tuberculosis 7,8-Diaminopelargonic Acid Aminotransferase (MtbBioA)
Authors: Junie B. Billones, Maria Constancia O. Carrillo, Voltaire G. Organo, Stephani Joy Y. Macalino, Inno A. Emnacen, Jamie Bernadette A. Sy
Abstract:
One of the major interferences in the Philippines' tuberculosis control program is the widespread prevalence of Mtb strains that are resistant to known drugs, such as MDR-TB (multidrug-resistant tuberculosis) and XDR-TB (extensively drug-resistant tuberculosis). Therefore, there is a pressing need to search for novel Mtb drug targets in order to be able to combat these drug-resistant strains. The enzyme 7,8-diaminopelargonic acid aminotransferase, more commonly known as BioA, is one such ideal target, as it is known that humans do not possess this enzyme. BioA primarily plays a key role in Mtb's lipid biosynthesis pathway, more specifically in the synthesis of the enzyme cofactor biotin. In this study, structure-based pharmacophore screening, docking, and ADMET evaluation of compounds obtained from the DrugBank chemical database were performed against the MtbBioA enzyme. Results of the screening, docking, ADMET, and TOPKAT calculations revealed that out of the 6,516 compounds in the library, only 7 compounds indicated more favorable binding energies than the enzyme's known inhibitor, amiclenomycin (ACM), as well as good solubility and toxicity properties. Moreover, out of these 7 compounds, Molecule 6 exhibited the best solubility and toxicity properties. In the future, these lead compounds may be subjected to bioactivity assays in vitro or in vivo for further evaluation of their therapeutic efficacy.
Keywords: 7,8-diaminopelargonic acid aminotransferase, BioA, pharmacophore, molecular docking, ADMET, TOPKAT
Procedia PDF Downloads 460
21239 Assessing the Walkability and Urban Design Qualities of Campus Streets
Authors: Zhehao Zhang
Abstract:
Walking has become an indispensable and sustainable mode of travel for college students in their daily lives; campus streets are an important carrier for students to walk and take part in a variety of activities. Improving the walkability of campus streets plays an important role in optimizing the quality of the campus spatial environment, promoting the campus walking system, and inducing multiple walking behaviors. The purpose of this paper is to explore the effect of campus layout, facility distribution, and site location on the walkability of campus streets, to assess street design qualities in terms of the elements of imageability, enclosure, complexity, transparency, and human scale, and further to examine the relationship between street-level urban design perceptual qualities and walkability and their effect on walking behavior on campus. Taking Tianjin University as the research object, this paper uses an optimized walk score method based on walking frequency, variety, and distance to evaluate the walkability of streets from a macro perspective, measures the urban design qualities by calculating the streets' physical environment characteristics, and uses behavior annotation and street image data to establish a spatiotemporal behavior database for analyzing walking activity from the microscopic view. In addition, based on the conclusions, improvement and design strategies are presented regarding the built walking environment, street vitality, and walking behavior.
Keywords: walkability, streetscapes, pedestrian activity, walk score
Procedia PDF Downloads 147
21238 Geoplanology Modeling and Applications Engineering of Earth in Spatial Planning Related with Geological Hazard in Cilegon, Banten, Indonesia
Authors: Muhammad L. A. Dwiyoga
Abstract:
The condition of spatial land in an industrial park needs special attention and deeper study. Geoplanology modeling can help arrange an area according to its capability. The research method is to perform remote sensing analysis, Geographic Information System analysis, and a more comprehensive analysis to determine the geological characteristics and land capability of the research area and their relation to geological disasters. Cilegon is part of Banten province, located in western Java and bordered by the Sunda Strait, while the southern part of the province borders the Indian Ocean. The morphology of the study area ranges from highlands to lowlands. In the highlands, potential landslide-prone zones were identified, whereas the low-lying areas have flooding potential. Moreover, the study area is potentially prone to earthquakes; this is due to its proximity to Mount Krakatau and the subduction zone. The results of this study show that the study area is susceptible to landslides around Waringinkurung District, while the potential flood areas lie in Cilegon District and its surroundings. Based on the seismic data, this area includes zones with magnitudes ranging from 1.5 to 5.5 at depths of 1 to 60 km. As for the capability of the territory, the analyses and studies carried out indicate the need to renew the existing Spatial Plan map, considering the fairly rapid development of the Cilegon area.
Keywords: geoplanology, spatial plan, geological hazard, Cilegon, Indonesia
Procedia PDF Downloads 504
21237 Optimal Pricing Based on Real Estate Demand Data
Authors: Vanessa Kummer, Maik Meusel
Abstract:
Real estate demand estimates are typically derived from transaction data. However, in regions with excess demand, transactions are driven by supply and therefore do not indicate what people are actually looking for. To estimate the demand for housing in Switzerland, search subscriptions from all important Swiss real estate platforms are used. These data do, however, suffer from missing information; for example, many users do not specify how many rooms they would like or what price they would be willing to pay. In economic analyses, it is often the case that only complete data are used. Usually, however, the proportion of complete data is rather small, which leads to most information being neglected. Also, the complete cases might be strongly distorted. In addition, the reason that data are missing might itself contain information, which is ignored with that approach. An interesting question is, therefore, whether for economic analyses such as the one at hand there is added value in using the whole data set with imputed missing values compared to using the usually small percentage of complete data (baseline). It is also interesting to see how different algorithms affect that result. The imputation of the missing data is done using unsupervised learning. Out of the numerous unsupervised learning approaches, the most common ones, such as clustering, principal component analysis, and neural network techniques, are applied. By training the model iteratively on the imputed data and thereby including the information of all data in the model, the distortion of the first training set, the complete data, vanishes. In a next step, the performance of the algorithms is measured. This is done by randomly creating missing values in subsets of the data, estimating those values with the relevant algorithms and several parameter combinations, and comparing the estimates to the actual data. After having found the optimal parameter set for each algorithm, the missing values are imputed. Using the resulting data sets, the next step is to estimate the willingness to pay for real estate. This is done by fitting price distributions for real estate properties with certain characteristics, such as the region or the number of rooms. Based on these distributions, survival functions are computed to obtain the functional relationship between characteristics and selling probabilities. Comparing the survival functions shows that estimates based on imputed data sets do not differ significantly from each other; however, the demand estimate derived from the baseline data does. This indicates that the baseline data set does not include all available information and is therefore not representative of the entire sample. Also, demand estimates derived from the whole data set are much more accurate than the baseline estimation. Thus, in order to obtain optimal results, it is important to make use of all available data, even though this involves additional procedures such as data imputation.
Keywords: demand estimate, missing-data imputation, real estate, unsupervised learning
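The hide-and-impute benchmark described above can be sketched in a few lines of Python; the columns, distributions, and choice of a k-nearest-neighbours imputer are illustrative assumptions, not the study's actual algorithms or data.

import numpy as np
from sklearn.impute import KNNImputer

rng = np.random.default_rng(42)
# Hypothetical search subscriptions: columns = rooms, max price, living area.
X_true = rng.normal(loc=[3.5, 900.0, 95.0], scale=[1.0, 250.0, 25.0],
                    size=(1000, 3))

mask = rng.random(X_true.shape) < 0.3   # hide 30% of the entries at random
X_missing = X_true.copy()
X_missing[mask] = np.nan

imputer = KNNImputer(n_neighbors=10)
X_imputed = imputer.fit_transform(X_missing)

# Score the algorithm by comparing imputed values to the hidden truth.
rmse = np.sqrt(np.mean((X_imputed[mask] - X_true[mask]) ** 2))
print(f"imputation RMSE on hidden entries: {rmse:.2f}")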
Procedia PDF Downloads 292
21236 Finite Element Analysis and Design Optimization of Stent and Balloon System
Authors: V. Hashim, P. N. Dileep
Abstract:
Stent implantation is seen as the most successful method to treat coronary artery disease. Different types of stents are available on the market these days, and the success of a stent implantation greatly depends on the proper selection of a suitable stent for a patient. Computer numerical simulation is a cost-effective way to choose a compatible stent. Studies confirm that the design characteristics of a stent have great importance with regard to the pressure it can sustain, the maximum displacement it can produce, the developed stress concentration, and so on. In this paper, different designs of stent were analyzed together with the balloon to optimize the stent and balloon system. The commercially available Palmaz-Schatz stent was selected for analysis. Abaqus software was used to simulate the system. This work is a finite element analysis of the artery stent implant to find the design factors affecting the stress and strain. The work consists of two phases. In the first phase, the stress distributions of three models were compared: stent without balloon, stent with a balloon of equal length, and stent with a balloon longer than the stent. In the second phase, three different design models of the Palmaz-Schatz stent were compared while keeping the balloon length constant. The results obtained from the analysis show that the design of the strut has a strong effect on the stress distribution. A design with chamfered slots gave better results. The length of the balloon also influences the stress concentration in the stent. An increase in the length of the balloon reduces stress but increases the dogboning effect.
Keywords: coronary stent, finite element analysis, restenosis, stress concentration
Procedia PDF Downloads 625
21235 Melanoma and Non-Melanoma, Skin Lesion Classification, Using a Deep Learning Model
Authors: Shaira L. Kee, Michael Aaron G. Sy, Myles Joshua T. Tan, Hezerul Abdul Karim, Nouar AlDahoul
Abstract:
Skin diseases are considered the fourth most common disease, with melanoma and non-melanoma skin cancer being the most common types of cancer in Caucasians. The alarming increase in skin cancer cases shows an urgent need for further research to improve diagnostic methods, as early diagnosis can significantly improve the 5-year survival rate. Machine learning algorithms for image pattern analysis in diagnosing skin lesions can dramatically increase the accuracy of detection and decrease possible human errors. Several studies have shown that the diagnostic performance of computer algorithms outperformed dermatologists. However, existing methods still need improvements to reduce diagnostic errors and generate efficient and accurate results. Our paper proposes an ensemble method to classify dermoscopic images into benign and malignant skin lesions. The experiments were conducted using the International Skin Imaging Collaboration (ISIC) image samples. The dataset contains 3,297 dermoscopic images with benign and malignant categories. The results show improvement in performance, with an accuracy of 88% and an F1 score of 87%, outperforming other existing models such as the support vector machine (SVM), residual network (ResNet50), EfficientNetB0, EfficientNetB4, and VGG16.
Keywords: deep learning, VGG16, EfficientNet, CNN, ensemble, dermoscopic images, melanoma
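The abstract does not spell out its ensembling recipe, so as one plausible construction, here is a minimal soft-voting sketch in Python that averages per-model malignancy probabilities; the model names and probability values are purely illustrative.

import numpy as np

# Hypothetical malignancy probabilities from three trained CNNs for 4 images.
p_vgg16        = np.array([0.92, 0.15, 0.60, 0.35])
p_efficientnet = np.array([0.88, 0.05, 0.72, 0.20])
p_resnet50     = np.array([0.95, 0.22, 0.48, 0.44])

# Soft voting: average the probabilities, then threshold at 0.5.
p_ensemble = np.mean([p_vgg16, p_efficientnet, p_resnet50], axis=0)
labels = (p_ensemble >= 0.5).astype(int)   # 1 = malignant, 0 = benign
print(p_ensemble.round(3), labels)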
Procedia PDF Downloads 86
21234 Hybrid Wind Solar Gas Reliability Optimization Using Harmony Search under Performance and Budget Constraints
Authors: Meziane Rachid, Boufala Seddik, Hamzi Amar, Amara Mohamed
Abstract:
Today’s energy industry seeks maximum benefit with maximum reliability. In order to achieve this goal, design engineers depend on reliability optimization techniques. This work uses a harmony search (HS) meta-heuristic optimization method to solve the problem of wind-solar-gas power system design optimization. We consider the case where redundant electrical components are chosen to achieve a desirable level of reliability. The electrical power components of the system are characterized by their cost, capacity and reliability. Reliability is considered in this work as the ability to satisfy the consumer demand, which is represented as a piecewise cumulative load curve. This definition of the reliability index is widely used for power systems. The proposed meta-heuristic searches for the optimal design of series-parallel power systems in which multiple choices of wind generators, transformers and lines are allowed from a list of products available on the market. Our approach has the advantage of allowing electrical power components with different parameters to be allocated in electrical power systems. To allow fast reliability estimation, a universal moment generating function (UMGF) method is applied. A computer program has been developed to implement the UMGF and the HS algorithm. An illustrative example is presented.
Keywords: reliability optimization, harmony search optimization (HSA), universal generating function (UMGF)
Procedia PDF Downloads 579
21233 Deliberation of Daily Evapotranspiration and Evaporative Fraction Based on Remote Sensing Data
Authors: J. Bahrawi, M. Elhag
Abstract:
Estimation of evapotranspiration is always a major component in water resources management. Traditional techniques of calculating daily evapotranspiration based on field measurements are valid only at local scales. Earth observation satellite sensors are thus used to overcome the difficulty of obtaining daily evapotranspiration measurements at the regional scale. The Surface Energy Balance System (SEBS) model was adopted to estimate daily evapotranspiration and relative evaporation along with other land surface energy fluxes. The model requires agro-climatic data that improve the model outputs. Advanced Along-Track Scanning Radiometer (AATSR) and Medium Resolution Imaging Spectrometer (MERIS) imagery were used to estimate the daily evapotranspiration and relative evaporation over the entire Nile Delta region in Egypt, supported by meteorological data collected from six different weather stations located within the study area. Daily evapotranspiration maps derived from the SEBS model show a strong agreement with actual ground-truth data taken from 92 points uniformly distributed all over the study area. Moreover, daily evapotranspiration and relative evaporation are strongly correlated. The reliable estimation of daily evapotranspiration supports decision makers in reviewing the current land use practices in terms of water management, while enabling them to propose proper land use changes.
Keywords: daily evapotranspiration, relative evaporation, SEBS, AATSR, MERIS, Nile Delta
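For orientation, SEBS-type models upscale instantaneous fluxes to daily evapotranspiration through the evaporative fraction, stated here in its standard form under the usual assumption that the fraction is constant over the day (a textbook relation, not quoted from this paper):

\Lambda = \frac{\lambda E}{R_n - G_0}, \qquad ET_{daily} = \frac{8.64 \times 10^{7}\, \Lambda\, (\overline{R_n} - \overline{G_0})}{\lambda\, \rho_w}

where λE is the instantaneous latent heat flux, R_n the net radiation, G_0 the soil heat flux, λ ≈ 2.45 × 10⁶ J kg⁻¹ the latent heat of vaporization, ρ_w the density of water, and the overbars denote daily averages; the factor 8.64 × 10⁷ converts from m s⁻¹ to mm day⁻¹.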
Procedia PDF Downloads 263
21232 Frontier Dynamic Tracking in the Field of Urban Plant and Habitat Research: Data Visualization and Analysis Based on Journal Literature
Authors: Shao Qi
Abstract:
The article uses the CiteSpace knowledge graph analysis tool to sort and visualize the journal literature on urban plants and habitats in the Web of Science and China National Knowledge Infrastructure databases. Based on a comprehensive interpretation of the visualization results of the various data sources and a description of the intrinsic relationships between high-frequency keywords using knowledge mapping, the research hotspots, processes and evolution trends in this field are analyzed. Relevant case studies are also conducted on the hotspot contents to explore means of landscape intervention and to synthesize the understanding of research theories. The results show that (1) from 1999 to 2022, the research direction of urban plants and habitats gradually changed from focusing on plant and animal extinction and biological invasion to the fields of human urban habitat creation, ecological restoration, and ecosystem services; (2) the results of keyword emergence and keyword growth trend analysis show that habitat creation research has shown a rapid and stable growth trend since 2017, and ecological restoration has received long-term sustained attention since 2004. The hotspots of future research on urban plants and habitats in China may focus on habitat creation and ecological restoration.
Keywords: research trends, visual analysis, habitat creation, ecological restoration
Procedia PDF Downloads 65
21231 Evaluation of Polymerisation Shrinkage of Randomly Oriented Micro-Sized Fibre Reinforced Dental Composites Using Fibre-Bragg Grating Sensors and Their Correlation with Degree of Conversion
Authors: Sonam Behl, Raju, Ginu Rajan, Paul Farrar, B. Gangadhara Prusty
Abstract:
Reinforcing dental composites with micro-sized fibres can significantly improve their physio-mechanical properties. Short fibres can be oriented randomly within dental composites, providing quasi-isotropic reinforcing efficiency, unlike unidirectional/bidirectional fibre reinforced composites, which enhance anisotropic properties. Thus, short fibre reinforced dental composites are becoming popular among practitioners. However, despite their popularity, resin-based dental composites are prone to failure on account of shrinkage during photo polymerisation. The shrinkage in the structure may lead to marginal gap formation, causing secondary caries, thus ultimately inducing failure of the restoration. The traditional methods of evaluating polymerisation shrinkage using strain gauges, density-based measurements, dilatometers, or the bonded-disk technique focus on the average value of volumetric shrinkage. Moreover, the results obtained from traditional methods are sensitive to the specimen geometry. The present research aims to evaluate the real-time shrinkage strain at selected locations in the material with the help of optical fibre Bragg grating (FBG) sensors. Due to their miniature size (diameter 250 µm), FBG sensors can be easily embedded into small samples of dental composites. Furthermore, an FBG array in the system can map the real-time shrinkage strain in different regions of the composite. The evaluation of real-time shrinkage may help to optimise the physio-mechanical properties of composites. Previously, FBG sensors have been used successfully to measure polymerisation strains of anisotropic (unidirectional or bidirectional) reinforced dental composites. However, very limited studies exist to establish the validity of FBG-based sensors for evaluating the volumetric shrinkage of composites reinforced with randomly oriented fibres. The present study aims to fill this research gap and is focussed on establishing the use of FBG-based sensors for evaluating the shrinkage of dental composites reinforced with randomly oriented fibres. Three groups of specimens were prepared by mixing the resin (80% UDMA/20% TEGDMA) with 55% silane-treated BaAlSiO₂ particulate fillers, or by adding 5% micro-sized fibres of diameter 5 µm and length 250/350 µm along with 50% silane-treated BaAlSiO₂ particulate fillers into the resin. For measurement of the polymerisation shrinkage strain, an array of three fibre Bragg grating sensors was embedded at a depth of 1 mm into a circular Teflon mould of diameter 15 mm and depth 2 mm. The results obtained are compared with the traditional density-based method of evaluating volumetric shrinkage. The degree of conversion was measured using FTIR spectroscopy (Spotlight 400 FT-IR from PerkinElmer). It is expected that the average polymerisation shrinkage strain values for dental composites reinforced with micro-sized fibres will correlate directly with the measured degree of conversion, implying that greater conversion of C=C double bonds to C-C single bonds also leads to higher shrinkage strain within the composite. Moreover, it could be established that the photonics approach helps assess the shrinkage at any point of interest in the material, suggesting that fibre Bragg grating sensors are a suitable means of measuring real-time polymerisation shrinkage strain for randomly fibre reinforced dental composites as well.
Keywords: dental composite, glass fibre, polymerisation shrinkage strain, fibre-Bragg grating sensors
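For context, the strain reading of an FBG sensor comes from the shift of its Bragg wavelength; the standard response relation (a textbook formula, not quoted from the paper) is:

\frac{\Delta\lambda_B}{\lambda_B} = (1 - p_e)\,\varepsilon + (\alpha_\Lambda + \alpha_n)\,\Delta T

where λ_B is the Bragg wavelength, p_e ≈ 0.22 is the effective photo-elastic coefficient of silica fibre, ε is the axial strain, and the second term collects the thermal-expansion and thermo-optic contributions. At constant temperature the shrinkage strain follows directly as ε = Δλ_B / [(1 − p_e) λ_B].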
Procedia PDF Downloads 159
21230 A Highly Efficient Broadcast Algorithm for Computer Networks
Authors: Ganesh Nandakumaran, Mehmet Karaata
Abstract:
A wave is a distributed execution, often made up of a broadcast phase followed by a feedback phase, requiring the participation of all the system processes before a particular event called decision is taken. Wave algorithms with one initiator, such as the 1-wave algorithm, have been shown to be very efficient for broadcasting messages in tree networks. Extensions of this algorithm broadcasting a sequence of waves using a single initiator have been implemented in algorithms such as the m-wave algorithm. However, as the network size increases, having a single initiator adversely affects the message delivery times to nodes further away from the initiator. As a remedy, broadcast waves can be initiated by multiple initiator nodes distributed across the network to reduce the completion time of broadcasts. These waves initiated by one or more initiator processes form a collection of waves covering the entire network. Solutions to global snapshots, distributed broadcast and various synchronization problems can be obtained efficiently using waves with multiple concurrent initiators. In this paper, we propose the first stabilizing multi-wave sequence algorithm implementing waves started by multiple initiator processes such that every process in the network receives at least one sequence of broadcasts. Being stabilizing, the proposed algorithm can withstand transient faults and does not require initialization. We view a fault as transient if it perturbs the configuration of the system but not its program.
Keywords: distributed computing, multi-node broadcast, propagation of information with feedback and cleaning (PFC), stabilization, wave algorithms
Procedia PDF Downloads 507
21229 Effective Learning and Testing Methods in School-Aged Children
Authors: Farzaneh Badinlou, Reza Kormi-Nouri, Monika Knopf, Kamal Kharrazi
Abstract:
When we teach, we have two critical elements at our disposal to help students: learning styles and testing styles. There are many different ways in which educators can effectively teach their students, such as verbal learning and experience-based learning. Lecture, as a form of verbal learning, is a traditional arrangement in which teachers are more active and share information verbally with students. In experience-based learning, students learn actively through hands-on learning materials and by observing teachers or others. Meanwhile, standard testing or assessment is the way to determine progress toward proficiency. Teachers and instructors mainly use essays (requiring written responses), multiple-choice questions (including the correct answer and several incorrect answers as distractors), or open-ended questions (which respondents answer in their own words). The current study focused on exploring effective teaching styles and testing methods as a function of age across the school years. In the present study, a total of 410 participants were selected randomly from four grades (2ⁿᵈ, 4ᵗʰ, 6ᵗʰ, and 8ᵗʰ). Each subject was tested individually in one session lasting around 50 minutes. In the learning tasks, the participants were presented with three different instructions for learning materials (learning by doing, learning by observing, and learning by listening). Then, they were tested via different standard assessments: free recall, cued recall, and recognition tasks. The results revealed that, generally, students remember more of what they do and what they observe than what they hear. The age effect was more pronounced in learning by doing than in learning by observing and learning by listening, becoming progressively stronger in the free recall, cued recall, and recognition tasks. The findings of this study indicated that learning by doing and the free recall task are more age-sensitive, suggesting that both are more strategic and more affected by developmental differences. Pedagogically, these results denote that learning by modeling and engagement in program activities have a special role in learning. Moreover, the findings indicated that multiple-choice questions can produce the best performance for school-aged children but are less age-sensitive. By contrast, essays can produce the lowest performance but are more age-sensitive. It will be very helpful for educators to know what types of learning styles and test methods are most effective for students in each school grade.
Keywords: experience-based learning, learning style, school-aged children, testing methods, verbal learning
Procedia PDF Downloads 207
21228 Linear Regression Estimation of Tactile Comfort for Denim Fabrics Based on In-Plane Shear Behavior
Authors: Nazli Uren, Ayse Okur
Abstract:
The tactile comfort of a textile product is an essential property and a major concern when it comes to customer perceptions and preferences. The subjective nature of comfort and the difficulties of simulating human hand sensory feelings make it hard to establish a well-accepted link between tactile comfort and objective evaluations. On the other hand, the shear behavior of a fabric is a mechanical parameter which can be measured by various objective test methods. The principal aim of this study is to determine the tactile comfort of commercially available denim fabrics by subjective measurements, create a tactile score database for denim fabrics, and investigate the relations between tactile comfort and shear behavior. The in-plane shear behaviors of 17 different commercially available denim fabrics with a variety of raw materials and weave structures were measured by a custom-designed shear frame and the conventional bias extension method in two corresponding diagonal directions. The tactile comfort of the denim fabrics was determined via subjective customer evaluations as well. The aforesaid relations were statistically investigated and introduced as regression equations. The analyses of the relations between tactile comfort and shear behavior showed considerably high correlation coefficients. The suggested regression equations were likewise found to be statistically significant. Accordingly, it was concluded that the tactile comfort of denim fabrics can be estimated with high precision based on the results of in-plane shear behavior measurements.
Keywords: denim fabrics, in-plane shear behavior, linear regression estimation, tactile comfort
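The kind of regression estimation described above can be reproduced in a few lines; in the Python sketch below, the shear-rigidity values and comfort scores are invented for illustration, and the single-predictor linear form is an assumption (the abstract does not give the actual equations).

import numpy as np

# Invented data: in-plane shear rigidity vs. mean subjective comfort score.
shear_rigidity = np.array([0.8, 1.1, 1.5, 1.9, 2.4, 3.0])
comfort_score  = np.array([4.6, 4.2, 3.8, 3.1, 2.7, 2.1])

# Ordinary least squares: comfort = b0 + b1 * rigidity.
b1, b0 = np.polyfit(shear_rigidity, comfort_score, deg=1)
pred = b0 + b1 * shear_rigidity
ss_res = np.sum((comfort_score - pred) ** 2)
ss_tot = np.sum((comfort_score - comfort_score.mean()) ** 2)
print(f"comfort = {b0:.2f} + ({b1:.2f}) * rigidity, R^2 = {1 - ss_res/ss_tot:.3f}")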
Procedia PDF Downloads 306
21227 Educators’ Perceived Capacity to Create Inclusive Learning Environments: Exploring Individual Competencies and District Policy
Authors: Thuy Phan, Stephanie Luallin
Abstract:
Inclusive education policies have demonstrated benefits for students with and without disabilities in the US. Several laws relate to inclusive education, such as 'No Child Left Behind' and 'The Individuals with Disabilities Education Act'. However, the application of these inclusive education laws and policies varies by state and school district. Classroom teachers in inclusive classrooms often experience confusion as to how to apply these policies in order to create appropriate inclusive learning environments that meet the abilities and needs of their diverse student populations. This study aims to investigate teachers' perspectives on their capacity to create an appropriate learning environment for their diverse student population, including students with disabilities. A qualitative method is implemented in this study, using open-ended interview questions to investigate teachers' perspectives on their capacity to create an appropriate inclusive learning environment for all students based on current inclusive education laws and district policies in the state of Colorado, USA. The findings may indicate a lack of confidence in teachers' capacity to create appropriate inclusive learning environments based on laws and district policies, including the challenges that classroom teachers may experience in creating inclusive learning environments. The purpose of this study is to examine the adequacy of classroom teachers' preparation for creating inclusive classrooms, with the intent of determining implications for developing policies in inclusive education.
Keywords: educator's capacity, inclusive education, inclusive learning environment, policy
Procedia PDF Downloads 173
21226 Computational Fluid Dynamic Modeling of Mixing Enhancement by Stimulation of Ferrofluid under Magnetic Field
Authors: Neda Azimi, Masoud Rahimi, Faezeh Mohammadi
Abstract:
A computational fluid dynamics (CFD) simulation was performed to investigate the effect of ferrofluid stimulation on the hydrodynamic and mass transfer characteristics of two immiscible liquid phases in a Y-micromixer. The main purpose of this work was to develop a numerical model that is able to simulate the hydrodynamics of the ferrofluid flow under a magnetic field and determine its effect on mass transfer characteristics. A uniform external magnetic field was applied perpendicular to the flow direction. The volume of fluid (VOF) approach was used for simulating the multiphase flow of the ferrofluid and the two immiscible liquids. The geometric reconstruction scheme (Geo-Reconstruct), based on piecewise linear interface calculation (PLIC), was used for reconstruction of the interface in the VOF approach. The mass transfer rate was defined via an equation as a function of the mass concentration gradient of the transported species and added into the phase interaction panel using a user-defined function (UDF). The magnetic field was solved numerically by the Fluent MHD module based on solving the magnetic induction equation. The CFD results were validated by experimental data, and good agreement was achieved; the maximum relative error for the extraction efficiency was about 7.52%. It was shown that ferrofluid actuation by a magnetic field can be considered an efficient mixing agent for liquid-liquid two-phase mass transfer in microdevices.
Keywords: CFD modeling, hydrodynamic, micromixer, ferrofluid, mixing
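For reference, the magnetic induction equation solved by such MHD modules takes the following standard constant-property form (a textbook statement, not a formula quoted from the paper):

\frac{\partial \mathbf{B}}{\partial t} = \nabla \times (\mathbf{u} \times \mathbf{B}) + \frac{1}{\mu_0 \sigma} \nabla^2 \mathbf{B}

where B is the magnetic flux density, u the fluid velocity, σ the electrical conductivity, and μ₀ the magnetic permeability; the first right-hand term couples the field to the flow (advection and stretching), and the second represents resistive diffusion.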
Procedia PDF Downloads 200