Search results for: derivative and convoluted derivative methods
13064 Development of Methods for Plastic Injection Mold Weight Reduction
Authors: Bita Mohajernia, R. J. Urbanic
Abstract:
Mold making techniques have focused on meeting customers’ functional and process requirements; however, molds today are increasing in size and sophistication, and are difficult to manufacture, transport, and set up due to their size and mass. Presently, mold weight-saving techniques focus on pockets to reduce the mass of the mold, but the overall size remains large, which introduces costs related to stock material purchase, processing time for process planning, machining, and validation, and excess waste material. Reducing the overall size of the mold is desirable for many reasons, but the functional requirements, tool life, and durability cannot be compromised in the process. It is proposed to use Finite Element Analysis simulation tools to model the forces and pressures in order to determine where material can be removed. The potential results of this project will reduce manufacturing costs. In this study, a lightweight structure is defined by an optimal distribution of material to carry external loads. The optimization objective of this research is to determine methods that provide the optimum layout for the mold structure. The topology optimization method is utilized to improve structural stiffness while decreasing weight, using the OptiStruct software. The optimized CAD model is compared with the primary geometry of the mold from the NX software. Results of the optimization show an 8% weight reduction, while the actual performance of the optimized structure, validated by physical testing, is similar to that of the original structure.
Keywords: finite element analysis, plastic injection molding, topology optimization, weight reduction
Procedia PDF Downloads 290
13063 Transdisciplinarity Research Approach and Transit-Oriented Development Model for Urban Development Integration in South African Cities
Authors: Thendo Mafame
Abstract:
There is a need for academic research to focus on solving, or contributing to solving, real-world societal problems. Transdisciplinary research (TDR) provides a way to produce functional and applicable research findings, which can be used to advance developmental causes. This TDR study explores ways in which South Africa’s spatial divide, entrenched through decades of discriminatory planning policies, can be restructured to bring about equitable access to places of employment, business, leisure, and service for previously marginalised South Africans. It does so by exploring the potential of the transit-oriented development (TOD) model to restructure and revitalise urban spaces in a collaborative manner. The study focuses, through a case study, on the Du Toit station precinct in the town of Stellenbosch, on the peri-urban edge of the city of Cape Town, South Africa. The TOD model is increasingly viewed as an effective strategy for creating sustainable urban redevelopment initiatives, and it has been deployed successfully in other parts of the world. The model, which emphasises development density, diversity of land use and infrastructure, and transformative design, is customisable to a variety of country contexts. This study made use of a case study approach with mixed methods to collect and analyse data. The research methods used include focus group discussions and interviews, as well as observation and transect walks. This research contributes to the professional development of TDR studies that are focused on urbanisation issues.
Keywords: case study, integrated urban development, land-use, stakeholder collaboration, transit-oriented development, transdisciplinary research
Procedia PDF Downloads 132
13062 Border Control and Human Rights Violations: Lessons Learned from the United States and Potential Solutions for the European Union
Authors: María Elena Menéndez Ibáñez
Abstract:
After the terrorist attacks of 9/11, new measures were adopted by powerful countries and regions like the United States and the European Union in order to safeguard their security. In 2002, the US created the Department of Homeland Security with one sole objective: to protect American soil and people. The US adopted new policies that made every immigrant a potential terrorist and a threat to its national security. Stronger border control became one of the key elements of the fight against organized crime and terrorism. The main objective of this paper is to compare some of the most important and radical measures adopted by the US, even those that resulted in systematic violations of human rights, such as the unlawful detainment of prisoners and other measures against foreigners, with some of the European measures adopted after the 2015 Paris attacks. Through the Schengen Agreement, the European Union has tried to eliminate tariffs and border controls in order to guarantee successful economic growth. Terrorists have taken advantage of this and have made the region vulnerable to attacks. Authorities need to strengthen their surveillance methods in order to safeguard the region and its stability. Through qualitative methods applied to the social sciences, this research will also try to explain why some of the mechanisms proven to be useful in the US would not be so in Europe, especially because they would result in human rights violations. Finally, solutions will be offered that would not put the whole Schengen Agreement at risk. Europe cannot reinstate border control without making individuals vulnerable to human rights violations.
Keywords: border control, immigration, international cooperation, national security
Procedia PDF Downloads 138
13061 Count of Trees in East Africa with Deep Learning
Authors: Nubwimana Rachel, Mugabowindekwe Maurice
Abstract:
Trees play a crucial role in maintaining biodiversity and providing various ecological services. Traditional methods of counting trees are time-consuming, and there is a need for more efficient techniques. Deep learning, however, makes it feasible to identify the multi-scale elements hidden in aerial imagery. This research focuses on the application of deep learning techniques for automated tree detection and counting in both forest and non-forest areas using satellite imagery. The objective is to identify the most effective model for automated tree counting. We used different deep learning models, such as YOLOv7, SSD, and UNET, along with Generative Adversarial Networks to generate synthetic samples for training, and other augmentation techniques, including Random Resized Crop, AutoAugment, and Linear Contrast Enhancement. These models were trained and fine-tuned on satellite imagery to identify and count trees. The performance of the models was assessed through multiple trials; after training and fine-tuning, UNET demonstrated the best performance, with a validation loss of 0.1211, validation accuracy of 0.9509, and validation precision of 0.9799. This research showcases the success of deep learning in accurate tree counting through remote sensing, particularly with the UNET model. It represents a significant contribution to the field by offering an efficient and precise alternative to conventional tree-counting methods.
Keywords: remote sensing, deep learning, tree counting, image segmentation, object detection, visualization
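The abstract does not detail how counts are derived from the UNET segmentation output; a common post-processing step is connected-component labeling of the thresholded mask, with one component per detected crown. The sketch below is a generic pure-Python illustration under that assumption, not the authors' pipeline:

```python
from collections import deque

def count_trees(mask):
    """Count connected components (4-connectivity) in a binary mask.

    mask: list of lists of 0/1, e.g. a thresholded UNET output.
    Returns the number of distinct components (assumed tree crowns).
    """
    rows, cols = len(mask), len(mask[0])
    seen = [[False] * cols for _ in range(rows)]
    count = 0
    for r in range(rows):
        for c in range(cols):
            if mask[r][c] and not seen[r][c]:
                count += 1                    # found a new crown
                q = deque([(r, c)])           # flood-fill its pixels
                seen[r][c] = True
                while q:
                    y, x = q.popleft()
                    for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        ny, nx = y + dy, x + dx
                        if 0 <= ny < rows and 0 <= nx < cols \
                                and mask[ny][nx] and not seen[ny][nx]:
                            seen[ny][nx] = True
                            q.append((ny, nx))
    return count

# Two separate crowns in a toy 4x5 mask
mask = [
    [1, 1, 0, 0, 0],
    [1, 0, 0, 1, 1],
    [0, 0, 0, 1, 0],
    [0, 0, 0, 0, 0],
]
print(count_trees(mask))  # 2
```

In practice this step would run on each tile of the satellite mosaic; overlapping crowns would need instance segmentation or watershed splitting, which this sketch omits.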
Procedia PDF Downloads 71
13060 The Influence of Environmental Factors on Honey Bee Activities: A Quantitative Analysis
Authors: Hung-Jen Lin, Chien-Hao Wang, Chien-Peng Huang, Yu-Sheng Tseng, En-Cheng Yang, Joe-Air Jiang
Abstract:
Bees’ incoming and outgoing behavior is a decisive index that can indicate the health condition of a colony. Traditional methods for monitoring the behavior of honey bees (Apis mellifera) take too much time and are highly labor-intensive, and the lack of automation and synchronization prevents researchers and beekeepers from obtaining real-time information on beehives. To solve these problems, this study proposes an Internet of Things (IoT)-based system for counting honey bees’ incoming and outgoing activities using an infrared interruption technique, while environmental factors are recorded simultaneously. The accuracy of the established system is verified by comparing the counting results with the outcomes of manual counting. This highly accurate device is appropriate for providing quantitative information regarding honey bees’ incoming and outgoing behavior. Different statistical analysis methods, including one-way ANOVA and two-way ANOVA, are used to investigate the influence of environmental factors, such as temperature, humidity, illumination, and ambient pressure, on bees’ incoming and outgoing behavior. With the real-time data, a standard model is established from the analysis of the relationship between environmental factors and bees’ incoming and outgoing behavior. In the future, smart control systems, such as a temperature control system, can also be combined with the proposed system to create an appropriate colony environment. It is expected that the proposed system will make a considerable contribution to apiculture and research.
Keywords: ANOVA, environmental factors, honey bee, incoming and outgoing behavior
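The one-way ANOVA used in the study reduces to an F statistic comparing between-group and within-group variability. The sketch below is a pure-Python illustration on made-up bee-count numbers, not the study's data:

```python
def one_way_anova_f(groups):
    """F statistic for a one-way ANOVA over several groups of observations."""
    k = len(groups)
    n = sum(len(g) for g in groups)
    grand = sum(sum(g) for g in groups) / n
    # Between-group sum of squares (k - 1 degrees of freedom)
    ss_between = sum(len(g) * (sum(g) / len(g) - grand) ** 2 for g in groups)
    # Within-group sum of squares (n - k degrees of freedom)
    ss_within = sum(sum((x - sum(g) / len(g)) ** 2 for x in g) for g in groups)
    return (ss_between / (k - 1)) / (ss_within / (n - k))

# Hypothetical hourly bee departures at three temperature bands
cool, mild, warm = [12, 14, 11], [20, 22, 19], [30, 28, 31]
f = one_way_anova_f([cool, mild, warm])
print(round(f, 1))  # 96.8 — large F: temperature band matters
```

A large F relative to the F distribution's critical value (here with 2 and 6 degrees of freedom) indicates the environmental factor significantly affects the counts; in practice one would use a statistics library to obtain the p-value.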
Procedia PDF Downloads 368
13059 Performance Evaluation and Comparison between the Empirical Mode Decomposition, Wavelet Analysis, and Singular Spectrum Analysis Applied to the Time Series Analysis in Atmospheric Science
Authors: Olivier Delage, Hassan Bencherif, Alain Bourdier
Abstract:
Signal decomposition approaches represent an important step in time series analysis, providing useful knowledge and insight into the data and the characteristics of the underlying dynamics, while also facilitating tasks such as noise removal and feature extraction. As most observational time series are nonlinear and nonstationary, resulting from the interaction of several physical processes at different time scales, experimental time series fluctuate at all time scales and require the development of specific signal decomposition techniques. The most commonly used techniques are data-driven, making it possible to obtain well-behaved signal components without making any prior assumptions about the input data. Among the most popular time series decomposition techniques cited in the literature are the empirical mode decomposition and its variants, the empirical wavelet transform, and singular spectrum analysis. With the increasing popularity and utility of these methods in wide-ranging applications, it is imperative to gain a good understanding of and insight into the operation of these algorithms. In this work, we describe all of the techniques mentioned above, as well as their ability to denoise signals, capture trends, identify components corresponding to the physical processes involved in the evolution of the observed system, and deduce the dimensionality of the underlying dynamics. Results obtained with all of these methods on experimental total ozone column and rainfall time series are discussed and compared.
Keywords: denoising, empirical mode decomposition, singular spectrum analysis, time series, underlying dynamics, wavelet analysis
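Of the techniques compared, the wavelet approach is the simplest to sketch: a single level of the Haar transform already separates a smooth, trend-like approximation from fine-scale detail (the part thresholded during denoising), and the decomposition is exactly invertible. The code below is a generic illustration, not the authors' implementation:

```python
import math

def haar_step(signal):
    """One level of the Haar wavelet transform on an even-length signal.

    Returns (approx, detail): a coarse approximation capturing the trend
    and detail coefficients capturing the finest-scale fluctuations.
    """
    s = 1 / math.sqrt(2)
    approx = [(signal[i] + signal[i + 1]) * s for i in range(0, len(signal), 2)]
    detail = [(signal[i] - signal[i + 1]) * s for i in range(0, len(signal), 2)]
    return approx, detail

def haar_inverse(approx, detail):
    """Exact reconstruction from one Haar level."""
    s = 1 / math.sqrt(2)
    out = []
    for a, d in zip(approx, detail):
        out.extend([(a + d) * s, (a - d) * s])
    return out

x = [4.0, 6.0, 10.0, 12.0, 8.0, 6.0, 5.0, 5.0]
a, d = haar_step(x)
rec = haar_inverse(a, d)
print(all(abs(u - v) < 1e-12 for u, v in zip(x, rec)))  # True
```

Denoising amounts to shrinking or zeroing small detail coefficients before reconstruction; repeating `haar_step` on the approximation gives the multi-scale decomposition the paper evaluates against EMD and SSA.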
Procedia PDF Downloads 117
13058 The Impact of Window Opening Occupant Behavior Models on Building Energy Performance
Authors: Habtamu Tkubet Ebuy
Abstract:
Purpose: Conventional dynamic energy simulation tools go beyond the static dimension of simplified methods by providing better and more accurate predictions of building performance. However, their ability to forecast actual performance is undermined by a poor representation of human interactions. The purpose of this study is to examine the potential benefits of incorporating information on occupant diversity into the occupant behavior models used to simulate building performance. Co-simulation of the stochastic behavior of occupants substantially increases the accuracy of the simulation.
Design/methodology/approach: In this article, probabilistic models of inhabitants' window opening and closing behavior were developed in a separate multi-agent platform, SimOcc, and implemented in the building simulation, TRNSYS, in such a way that the window behavior and its interconnectivity can be reflected in the simulation analysis of the building.
Findings: The results of the study show that the application of complex behavior models is important for predicting actual building performance. The results aid in identifying the gap between reality and existing simulation methods. We hope this study and its results will serve as a guide for researchers interested in investigating occupant behavior in the future.
Research limitations/implications: Further case studies involving multi-user behavior in complex commercial buildings are needed to better understand the impact of occupant behavior on building performance.
Originality/value: This study is a good opportunity to support the national strategy by showing a suitable tool to help stakeholders in the design phase of new or retrofitted buildings to improve the performance of office buildings.
Keywords: occupant behavior, co-simulation, energy consumption, thermal comfort
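A minimal form of the probabilistic window model described above is a two-state Markov chain stepped once per simulation timestep. The sketch below is a generic illustration with made-up transition probabilities, not the SimOcc model itself:

```python
import random

def simulate_window(p_open, p_close, steps, seed=0):
    """Two-state Markov model of a window, stepped once per timestep.

    p_open:  probability a closed window is opened in a timestep
    p_close: probability an open window is closed in a timestep
    Returns the list of states (0 = closed, 1 = open).
    """
    rng = random.Random(seed)  # seeded for reproducible co-simulation runs
    state, states = 0, []
    for _ in range(steps):
        if state == 0 and rng.random() < p_open:
            state = 1
        elif state == 1 and rng.random() < p_close:
            state = 0
        states.append(state)
    return states

# Long-run open fraction approaches p_open / (p_open + p_close) = 0.25
states = simulate_window(p_open=0.1, p_close=0.3, steps=20000)
print(sum(states) / len(states))
```

In a co-simulation, the probabilities would themselves depend on indoor/outdoor conditions fed back from the thermal model at each step; the sampled state is then passed to the energy simulation as the window boundary condition.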
Procedia PDF Downloads 104
13057 A Ground Structure Method to Minimize the Total Installed Cost of Steel Frame Structures
Authors: Filippo Ranalli, Forest Flager, Martin Fischer
Abstract:
This paper presents a ground structure method to optimize the topology and discrete member sizing of steel frame structures in order to minimize the total installed cost, including material, fabrication, and erection components. The proposed method improves upon existing cost-based ground structure methods by incorporating constructability considerations as well as satisfying both strength and serviceability constraints. The architecture of the method is a bi-level Multidisciplinary Feasible (MDF) architecture in which the discrete member sizing optimization is nested within the topology optimization process. For each structural topology generated, the sizing optimization process seeks a set of discrete member sizes that results in the lowest total installed cost while satisfying strength (member utilization) and serviceability (node deflection and story drift) criteria. To accurately assess cost, the connection details for the structure are generated automatically using accurate site-specific cost information obtained directly from fabricators and erectors. Member continuity rules are also applied at each node in the structure to improve constructability. The proposed optimization method is benchmarked against conventional weight-based ground structure optimization methods, yielding an average cost savings of up to 30% with comparable computational efficiency.
Keywords: cost-based structural optimization, cost-based topology and sizing optimization, steel frame ground structure optimization, multidisciplinary optimization of steel structures
Procedia PDF Downloads 341
13056 Azadirachta indica Leaves Extract Assisted Green Synthesis of Ag-TiO₂ for Degradation of Dyes in Aqueous Medium
Authors: Muhammad Saeed, Sheeba Khalid
Abstract:
Aqueous pollution due to the textile industry is an important issue. Photocatalysis using metal oxides as catalysts is one of the methods used for the eradication of dyes from textile industrial effluents. In this study, the synthesis, characterization, and evaluation of the photocatalytic activity of Ag-TiO₂ are reported. TiO₂ catalysts with 2, 4, 6, and 8% loadings of Ag were prepared by green methods using Azadirachta indica leaf extract as the reducing agent and titanium dioxide and silver nitrate as precursor materials. The 4% Ag-TiO₂ exhibited the best catalytic activity for the degradation of dyes. The prepared catalyst was characterized by advanced techniques. Catalytic degradation of methylene blue and rhodamine B was carried out in a Pyrex glass batch reactor. Deposition of Ag greatly enhanced the catalytic efficiency of TiO₂ towards the degradation of dyes. Irradiation of the catalyst excites electrons from the valence band to the conduction band, yielding electron-hole pairs. These photoexcited electrons and positive holes undergo secondary reactions and produce OH radicals. These active radicals take part in the degradation of dyes. More than 90% of the dyes were degraded in 120 minutes. It was found that there was no loss of catalytic efficiency of the prepared Ag-TiO₂ after recycling it twice. Photocatalytic degradation of methylene blue and rhodamine B followed the Eley-Rideal mechanism, which states that the dye reacts in the fluid phase with adsorbed oxygen. The activation energies for the photodegradation of methylene blue and rhodamine B were found to be 27 kJ/mol and 20 kJ/mol, respectively.
Keywords: TiO₂, Ag-TiO₂, methylene blue, rhodamine B, photodegradation
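Activation energies like the 27 kJ/mol reported above are typically obtained from an Arrhenius analysis of rate constants measured at several temperatures. As a hedged illustration with hypothetical rate constants (not the paper's data), Ea can be estimated from k at two temperatures:

```python
import math

R = 8.314  # gas constant, J/(mol*K)

def activation_energy(k1, T1, k2, T2):
    """Arrhenius estimate of Ea (J/mol) from rate constants at two temperatures.

    From ln(k2/k1) = -(Ea/R) * (1/T2 - 1/T1):
    """
    return R * math.log(k2 / k1) / (1.0 / T1 - 1.0 / T2)

# Hypothetical rate constants chosen only to illustrate the calculation;
# they happen to give a value near the paper's 27 kJ/mol figure.
k1, T1 = 0.010, 298.0   # min^-1 at 25 °C (assumed)
k2, T2 = 0.020, 318.0   # min^-1 at 45 °C (assumed)
ea = activation_energy(k1, T1, k2, T2)
print(round(ea / 1000, 1))  # 27.3 kJ/mol
```

With more than two temperatures, Ea is usually taken from the slope of a linear fit of ln(k) against 1/T rather than a single pair of points.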
Procedia PDF Downloads 165
13055 Methodological Proposal, Archival Thesaurus in Colombian Sign Language
Authors: Pedro A. Medina-Rios, Marly Yolie Quintana-Daza
Abstract:
Having the opportunity to communicate in social, academic, and work contexts is very relevant for any individual, and more so for a deaf person, for whom oral language is not their natural language and written language is a second language. Currently, to the best of our knowledge, there is no specialized dictionary in Colombian sign language for archiving. Archival work is one of the areas in which the deaf community has a greater chance of performing well. Adding new signs to dictionaries for deaf people extends the possibility that they have the appropriate signs to communicate and improve their performance. The aim of this work was to illustrate the importance of designing pedagogical and technological strategies for knowledge management, for the academic inclusion of deaf people, through a proposed lexicon in Colombian sign language (LSC) in the area of archiving. As a method, an analytical study was used to identify relevant words in the technical area of archiving and their counterparts in LSC; 30 deaf people, apprentices and students of the Servicio Nacional de Aprendizaje (SENA) in Documentary or Archival Management programs, were evaluated through direct interviews in LSC. For the analysis, tools were used to evaluate correlation patterns, together with visual-gestural and corpus linguistic methods; in addition, linear regression methods were used. Among the results, significant associations were found for the variables socioeconomic stratum, academic level, and labor placement, along with the need to generate new signs on the topic of archiving to improve communication between the deaf person, hearing people, and the sign language interpreter. It is concluded that generating new signs to enrich the LSC dictionary on archival topics is necessary to improve the labor inclusion of deaf people in Colombia.
Keywords: archival, inclusion, deaf, thesaurus
Procedia PDF Downloads 278
13054 Sudden Death and Chronic Disseminated Intravascular Coagulation (DIC): Two Case Reports
Authors: Saker Lilia, Youcef Mellouki, Lakhdar Sellami, Yacine Zerairia, Abdelhaid Zetili, Fatma Guahria, Fateh Kaious, Nesrine Belkhodja, Abdelhamid Mira
Abstract:
Background: Sudden death is regarded as a suspicious demise necessitating autopsy, as stipulated by legal authorities. Chronic disseminated intravascular coagulation (DIC) is an acquired clinical and biological syndrome with a severe and often fatal prognosis, stemming from systemic, uncontrolled, diffuse coagulation activation. Irrespective of its origin, DIC is associated with a diverse spectrum of manifestations, ranging from minor biological coagulation alterations to profoundly severe conditions in which hemorrhagic complications may predominate. Simultaneously, microthrombi contribute to the development of multi-organ failure.
Objective: This study seeks to evaluate the role of autopsy in determining the causes of death.
Materials and Methods: We present two instances of sudden death involving females who underwent autopsy at the Forensic Medicine Department of the University Hospital of Annaba, Algeria. These autopsies were performed at the request of the prosecutor, aiming to determine the causes of death and illuminate the exact circumstances surrounding them. The methods utilized were: analysis of the initial information report; findings from postmortem examinations; and histological assessments and toxicological analyses.
Results: The presence of DIC was noted, affecting nearly all veins, with distinct etiologies.
Conclusion: For the establishment of a meaningful diagnosis, a thorough understanding of the subject matter is imperative, and precise alignment with the medicolegal data is essential.
Keywords: chronic disseminated intravascular coagulation, sudden death, autopsy, causes of death
Procedia PDF Downloads 85
13053 A Robust and Efficient Segmentation Method Applied for Cardiac Left Ventricle with Abnormal Shapes
Authors: Peifei Zhu, Zisheng Li, Yasuki Kakishita, Mayumi Suzuki, Tomoaki Chono
Abstract:
Segmentation of the left ventricle (LV) from cardiac ultrasound images provides a quantitative functional analysis of the heart for diagnosing disease. The Active Shape Model (ASM) is a widely used approach for LV segmentation but suffers from the drawback that the initialization of the shape model is often not sufficiently close to the target, especially when dealing with the abnormal shapes found in disease. In this work, a two-step framework is proposed to improve the accuracy and speed of model-based segmentation. First, a robust and efficient detector based on a Hough forest is proposed to localize cardiac feature points, and these points are used to predict the initial fit of the LV shape model. Second, to achieve more accurate and detailed segmentation, ASM is applied to further fit the LV shape model to the cardiac ultrasound image. The performance of the proposed method is evaluated on a dataset of 800 cardiac ultrasound images, mostly of abnormal shapes. The proposed method is compared to several combinations of ASM and existing initialization methods. The experimental results demonstrate that the accuracy of feature point detection for initialization was improved by 40% compared to the existing methods. Moreover, the proposed method significantly reduces the number of ASM fitting loops required, thus speeding up the whole segmentation process. The proposed method therefore achieves more accurate and efficient segmentation results and is applicable to unusual heart shapes associated with cardiac diseases, such as left atrial enlargement.
Keywords: Hough forest, active shape model, segmentation, cardiac left ventricle
Procedia PDF Downloads 339
13052 Heuristics for Optimizing Power Consumption in the Smart Grid
Authors: Zaid Jamal Saeed Almahmoud
Abstract:
Our increasing reliance on electricity, combined with inefficient consumption trends, has resulted in several economic and environmental threats, including wasting billions of dollars, draining limited resources, and amplifying the impact of climate change. As a solution, the smart grid is emerging as the future power grid, with smart techniques to optimize power consumption and electricity generation. Minimizing peak power consumption under a fixed delay requirement is a significant problem in the smart grid, and matching demand to supply is a key requirement for the success of the future electricity grid. In this work, we consider the problem of minimizing peak demand under appliance constraints by scheduling power jobs with uniform release dates and deadlines. As the problem is known to be NP-hard, we propose two versions of a heuristic algorithm for solving it. Our theoretical analysis and experimental results show that our proposed heuristics outperform existing methods by providing a better approximation to the optimal solution. In addition, we consider dynamic pricing methods to minimize the peak load and match demand to supply in the smart grid. Our contribution is the proposal of generic, as well as customized, pricing heuristics to minimize peak demand and match demand with supply. We also propose optimal pricing algorithms that can be used when the maximum deadline period of the power jobs is relatively small. Finally, we provide theoretical analysis and conduct several experiments to evaluate the performance of the proposed algorithms.
Keywords: heuristics, optimization, smart grid, peak demand, power supply
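The paper's own heuristics are not specified in the abstract; one standard greedy baseline for minimizing peak demand when unit-length jobs share a release date and a common deadline window is LPT-style load balancing across time slots. The sketch below is an assumed illustration, not the authors' algorithm:

```python
def schedule_jobs(powers, slots):
    """Greedy (LPT-style) heuristic: assign each unit-length power job,
    largest first, to the currently least-loaded time slot.

    Returns (assignment, peak) where assignment[s] lists job powers in slot s
    and peak is the maximum total load over any slot.
    """
    assignment = [[] for _ in range(slots)]
    load = [0.0] * slots
    for p in sorted(powers, reverse=True):
        s = min(range(slots), key=load.__getitem__)  # least-loaded slot
        assignment[s].append(p)
        load[s] += p
    return assignment, max(load)

# Eight appliance jobs (kW) to be spread over a 4-slot deadline window
powers = [7, 5, 4, 4, 3, 2, 2, 1]
assignment, peak = schedule_jobs(powers, slots=4)
print(peak)  # 7.0 — here the greedy happens to hit the lower bound 28/4
```

Since peak minimization with deadlines generalizes makespan scheduling (NP-hard), such greedy rules give approximations; real appliance constraints (non-unit durations, per-slot limits) would need a richer model.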
Procedia PDF Downloads 88
13051 The Reliability of Wireless Sensor Network
Authors: Bohuslava Juhasova, Igor Halenar, Martin Juhas
Abstract:
Wireless communication is one of the most widely used methods of data transfer today. The benefits of this communication method are partial independence from fixed infrastructure and the possibility of mobility; in some special applications, it is the only way to connect. This paper presents some problems encountered in implementing a sensor network connection for measuring environmental parameters in the area of manufacturing plants.
Keywords: network, communication, reliability, sensors
Procedia PDF Downloads 652
13050 Climate Change and Sustainable Development among Agricultural Communities in Tanzania; An Analysis of Southern Highland Rural Communities
Authors: Paschal Arsein Mugabe
Abstract:
This paper examines sustainable development planning in the context of environmental concerns in rural areas of Tanzania. It challenges mainstream approaches to development, focusing instead upon transformative action for environmental justice. The goal is to help shape future sustainable development agendas in local government, international agencies, and civil society organisations.
Research methods: The approach of the study is geographical but also involves various transdisciplinary elements, particularly from development studies, sociology and anthropology, management, geography, agriculture, and environmental science. The research methods included thematic and questionnaire interviews and participatory tools such as focus group discussions, participatory rural appraisal, and expert interviews for primary data. Secondary data were gathered through the analysis of land use/cover data and official documents on climate, agriculture, marketing, and health. Several earlier studies conducted in the area also provided an important reference base.
Findings: The findings show that agricultural sustainability in Tanzania appears likely to deteriorate as a consequence of climate change. Noteworthy differences in impacts across households are also present, both by district and by income category. Food security cannot be explained by climate as the only influencing factor; a combination of the economic, political, and socio-cultural contexts of the community is crucial. In conclusion, it is worth noting that people understand the relationship between climate change and their livelihoods.
Keywords: agriculture, climate change, environment, sustainable development
Procedia PDF Downloads 325
13049 Building Information Modelling: A Solution to the Limitations of Prefabricated Construction
Authors: Lucas Peries, Rolla Monib
Abstract:
The construction industry plays a vital role in the global economy, contributing billions of dollars annually. However, the industry has struggled with persistently low productivity levels for years, unlike other sectors that have shown significant improvements. Modular and prefabricated construction methods have been identified as potential solutions to boost productivity in the construction industry, offering time advantages over traditional construction methods. Despite their potential benefits, modular and prefabricated construction face hindrances and limitations that are not present in traditional building systems. Building information modelling (BIM) has the potential to address some of these hindrances, but barriers are preventing its widespread adoption in the construction industry. This research aims to enhance understanding of the shortcomings of modular and prefabricated building systems and to develop BIM-based solutions to alleviate or eliminate these hindrances. The research objectives include identifying and analysing the key issues hindering the use of modular and prefabricated building systems, investigating the current state of BIM adoption in the construction industry and the factors affecting its successful implementation, proposing BIM-based solutions to address the issues associated with modular and prefabricated building systems, and assessing the effectiveness of the developed solutions in removing barriers to their use. The research methodology involves a critical literature review to identify the key issues and challenges in modular and prefabricated construction and BIM adoption. In addition, an online questionnaire was used to collect primary data from construction industry professionals, allowing for feedback on and evaluation of the proposed BIM-based solutions.
The data collected were analysed to evaluate the effectiveness of the solutions and their potential impact on the adoption of modular and prefabricated building systems. The main findings of the research indicate that the issues identified in the literature review align with the opinions of industry professionals, and the proposed BIM-based solutions are considered effective in addressing the challenges associated with modular and prefabricated construction. However, the research has limitations, such as a small sample size and the need to assess the feasibility of implementing the proposed solutions. In conclusion, this research contributes to enhancing the understanding of the limitations of modular and prefabricated building systems and proposes BIM-based solutions to overcome them. The findings are valuable to construction industry professionals and BIM software developers, providing insights into the challenges and potential solutions for implementing modular and prefabricated construction systems in future projects. Further research should focus on addressing the limitations and assessing the feasibility of implementing the proposed solutions from technical and legal perspectives.
Keywords: building information modelling, modularisation, prefabrication, technology
Procedia PDF Downloads 98
13048 Management of ASD with Co-Morbid OCD: A Literature Review to Compare the Pharmacological and Psychological Treatment Options in Individuals Under the Age of 18
Authors: Melissa Nelson, Simran Jandu, Hana Jalal, Mia Ingram, Chrysi Stefanidou
Abstract:
There is significant overlap between autism spectrum disorder (ASD) and obsessive compulsive disorder (OCD), with up to 90% of young people diagnosed with ASD having this co-morbidity. Distinguishing between the symptoms of the two leads to difficulties with accurate treatment, yet this is paramount to benefitting the young person. There are two distinct methods of treatment, psychological and pharmacological, with clinicians tending to choose one or the other, potentially due to the lack of research available. This report reviews the efficacy of psychological and pharmacological treatments for young people diagnosed with ASD and co-morbid OCD. A literature review was performed on papers from the last fifteen years covering ASD and OCD in individuals under the age of 18. Eleven papers were selected as relevant. The report compares more traditional methods, such as selective serotonin reuptake inhibitors (SSRIs) and cognitive behavior therapy (CBT), with newer therapies, such as modified or intensive ASD-focused psychotherapies and the use of other medication classes. On reviewing the data, a distinct lack of information on this important topic was identified. The most widely used treatment was medication such as fluoxetine, an SSRI, which rarely showed an improvement in symptoms or outcomes. This is in contrast to modified forms of CBT, which often reduce symptoms or even result in OCD remission. With increased research into the non-traditional management of these co-morbid conditions, it is clear there is scope for modified CBT to become the future treatment of choice for OCD in young people with ASD.
Keywords: autism spectrum disorder, intensive or adapted cognitive behavioral therapy, obsessive compulsive disorder, pharmacological management
Procedia PDF Downloads 913047 Characterization of 2,4,6-Trinitrotoluene (Tnt)-Metabolizing Bacillus Cereus Sp TUHP2 Isolated from TNT-Polluted Soils in the Vellore District, Tamilnadu, India
Authors: S. Hannah Elizabeth, A. Panneerselvam
Abstract:
Objective: The main objective was to evaluate the degradative properties of Bacillus cereus sp TUHP2 isolated from TNT-polluted soils in the Vellore District, Tamil Nadu, India. Methods: Among the 3 bacterial genera isolated from different soil samples, one potent TNT-degrading strain, Bacillus cereus sp TUHP2, was identified. The morphological, physiological and biochemical properties of the strain were confirmed by conventional methods, and genotypic characterization was carried out using 16S rDNA partial gene amplification and sequencing. The breakdown products of TNT in the extract were determined by Gas Chromatography-Mass Spectrometry (GC-MS). Supernatant samples from the broth, collected at 24 h intervals, were analyzed by HPLC, and the effects of various nutritional and environmental factors were analysed and optimized for the isolate. Results: Out of three isolates, one strain, TUHP2, was found to have potent efficiency to degrade TNT and was assigned to the genus Bacillus. 16S rDNA gene sequence analysis showed the highest homology (98%) with Bacillus cereus, and the strain was designated Bacillus cereus sp TUHP2. Based on the energy of the predicted models, the secondary structure predicted by MFE was the most stable, with minimum energy. Products of TNT transformation produced a colour change in the medium during cultivation. TNT derivatives such as 2HADNT and 4HADNT were detected by HPLC, and 2ADNT and 4ADNT by GC-MS analysis. Conclusion: Hence, this study presents clear evidence for the biodegradation of TNT by the strain Bacillus cereus sp TUHP2. Keywords: bioremediation, biodegradation, biotransformation, sequencing
Procedia PDF Downloads 46213046 Real Estate Trend Prediction with Artificial Intelligence Techniques
Authors: Sophia Liang Zhou
Abstract:
For investors, businesses, consumers, and governments, an accurate assessment of future housing prices is crucial to critical decisions in resource allocation, policy formation, and investment strategies. Previous studies are contradictory about macroeconomic determinants of housing price and largely focused on one or two areas using point prediction. This study aims to develop data-driven models to accurately predict future housing market trends in different markets. This work studied five different metropolitan areas representing different market trends and compared three time-lag situations: no lag, a 6-month lag, and a 12-month lag. Linear regression (LR), random forest (RF), and artificial neural network (ANN) models were employed to model real estate prices using datasets with the S&P/Case-Shiller home price index and 12 demographic and macroeconomic features, such as gross domestic product (GDP), resident population, and personal income, in five metropolitan areas: Boston, Dallas, New York, Chicago, and San Francisco. The data from March 2005 to December 2018 were collected from the Federal Reserve Bank, the FBI, and Freddie Mac. In the original data, some factors are monthly, some quarterly, and some yearly. Thus, two methods of imputing missing values, backfill and interpolation, were compared. The models were evaluated by accuracy, mean absolute error, and root mean square error. The LR and ANN models outperformed the RF model due to RF's inherent limitations. Both the ANN and LR methods generated predictive models with high accuracy (> 95%). It was found that personal income, GDP, population, and measures of debt consistently appeared as the most important factors. It was also found that the technique used to impute missing values in the dataset and the implementation of time lag can have a significant influence on model performance and require further investigation. 
The best performing models varied for each area, but the backfilled 12-month lag LR models and the interpolated no-lag ANN models showed the most stable performance overall, with accuracies > 95% for each city. This study reveals the influence of input variables in different markets. It also provides evidence to support future studies to identify the optimal time lag and data imputation methods for establishing accurate predictive models. Keywords: linear regression, random forest, artificial neural network, real estate price prediction
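The imputation comparison described in the abstract (backfill versus interpolation for mixed-frequency data) can be sketched with pandas; the series below is illustrative, not the study's data:

```python
import numpy as np
import pandas as pd

# A quarterly indicator placed in a monthly series: months without a
# release are missing and must be imputed before modeling.
monthly = pd.Series([100.0, np.nan, np.nan, 103.0, np.nan, np.nan, 109.0])

backfilled = monthly.bfill()          # copy the next known value backwards
interpolated = monthly.interpolate()  # linear interpolation between releases

print(backfilled.tolist())    # [100.0, 103.0, 103.0, 103.0, 109.0, 109.0, 109.0]
print(interpolated.tolist())  # [100.0, 101.0, 102.0, 103.0, 105.0, 107.0, 109.0]
```

On this toy series, backfill makes each missing month jump to the upcoming release, while interpolation spreads the change evenly across the gap; as the abstract notes, the choice can materially affect a downstream model.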
Procedia PDF Downloads 10313045 Frequent Pattern Mining for Digenic Human Traits
Authors: Atsuko Okazaki, Jurg Ott
Abstract:
Some genetic diseases ('digenic traits') are due to the interaction between two DNA variants. For example, certain forms of Retinitis Pigmentosa (a genetic form of blindness) occur in the presence of two mutant variants, one in the ROM1 gene and one in the RDS gene, while the occurrence of only one of these mutant variants leads to a completely normal phenotype. Detecting such digenic traits by genetic methods is difficult. A common approach to finding disease-causing variants is to compare 100,000s of variants between individuals with a trait (cases) and those without the trait (controls). Such genome-wide association studies (GWASs) have been very successful but hinge on genetic effects of single variants, that is, there should be a difference in allele or genotype frequencies between cases and controls at a disease-causing variant. Frequent pattern mining (FPM) methods offer an avenue for detecting digenic traits even in the absence of single-variant effects. The idea is to enumerate pairs of genotypes (genotype patterns), with the two genotypes in a pair originating from different variants that may be located at very different genomic positions. What is needed is for genotype patterns to be significantly more common in cases than in controls. Let Y = 2 refer to cases and Y = 1 to controls, with X denoting a specific genotype pattern. We are seeking association rules, 'X → Y', with high confidence, P(Y = 2|X), significantly higher than the proportion of cases, P(Y = 2), in the study. Clearly, generally available FPM methods are very suitable for detecting disease-associated genotype patterns. We use fpgrowth as the basic FPM algorithm and build a framework around it to enumerate high-frequency digenic genotype patterns and to evaluate their statistical significance by permutation analysis. Application to a published dataset on opioid dependence furnished results that could not be found with classical GWAS methodology. 
There were 143 cases and 153 healthy controls, each genotyped for 82 variants in eight genes of the opioid system. The aim was to find out whether any of these variants were disease-associated. The single-variant analysis did not lead to significant results. Application of our FPM implementation resulted in one significant (p < 0.01) genotype pattern with both genotypes in the pattern being heterozygous and originating from two variants on different chromosomes. This pattern occurred in 14 cases and none of the controls. Thus, the pattern seems quite specific to this form of substance abuse and is also rather predictive of disease. An algorithm called Multifactor Dimension Reduction (MDR) was developed some 20 years ago and has been in use in human genetics ever since. This and our algorithms share some similar properties, but they are also very different in other respects. The main difference seems to be that our algorithm focuses on patterns of genotypes while the main object of inference in MDR is the 3 × 3 table of genotypes at two variants.Keywords: digenic traits, DNA variants, epistasis, statistical genetics
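The counting step behind such association rules can be illustrated with a minimal pure-Python sketch; this is a toy stand-in for the fpgrowth-based framework (the variant names and genotypes are invented), and it omits the permutation analysis used to assess significance:

```python
from collections import defaultdict
from itertools import combinations

def mine_digenic(samples, min_support=2):
    """Count each genotype pair across samples and return the confidence
    P(case | pattern) for patterns meeting the support threshold."""
    counts = defaultdict(lambda: [0, 0])  # pattern -> [n_controls, n_cases]
    for genotypes, label in samples:      # label: 1 = control, 2 = case
        for pair in combinations(sorted(genotypes.items()), 2):
            counts[pair][label - 1] += 1
    return {pat: case / (ctrl + case)
            for pat, (ctrl, case) in counts.items()
            if ctrl + case >= min_support}

# Toy data: a doubly heterozygous pattern seen only in cases.
samples = [
    ({"ROM1": "Aa", "RDS": "Bb"}, 2),
    ({"ROM1": "Aa", "RDS": "Bb"}, 2),
    ({"ROM1": "Aa", "RDS": "bb"}, 1),
    ({"ROM1": "aa", "RDS": "Bb"}, 1),
]
rules = mine_digenic(samples)
print(rules[(("RDS", "Bb"), ("ROM1", "Aa"))])  # 1.0 -- pattern specific to cases
```

A confidence of 1.0 here mirrors the paper's finding of a pattern present in 14 cases and no controls; in practice, fpgrowth replaces the brute-force pair enumeration, and a permutation test guards against chance findings.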
Procedia PDF Downloads 12213044 Study on Adding Story and Seismic Strengthening of Old Masonry Buildings
Authors: Youlu Huang, Huanjun Jiang
Abstract:
A large number of old masonry buildings built in the last century still remain in the city. They suffer from problems of poor safety, obsolescence, and non-habitability. In recent years, many old buildings have been reconstructed through renovating façades, strengthening, and adding floors. However, most projects only provide a solution for a single problem. It is difficult to comprehensively solve the problems of poor safety and lack of building functions. Therefore, a comprehensive functional renovation program of adding a reinforced concrete frame story at the bottom by integrally lifting the building and then strengthening the building was put forward. Based on field measurement and the YJK calculation software, the seismic performance of an actual three-story masonry structure in Shanghai was identified. The results show that the material strength of the masonry is low, and the bearing capacity of some masonry walls could not meet the code requirements. The elastoplastic time history analysis of the structure was carried out using SAP2000 software. The results show that under the 7-degree rare earthquake, the structure reaches the 'serious damage' performance level. Based on the code requirements for the stiffness ratio of the bottom frame (the ratio of the lateral stiffness of the transition masonry story to that of the frame story), the bottom frame story was designed. The integral lifting process of the masonry building was introduced based on many engineering examples. Reinforcement methods for the bottom frame structure, strengthened by a steel-reinforced mesh mortar surface layer (SRMM) and by base isolators, respectively, were proposed. The time history analysis of the two kinds of structures, under the frequent earthquake, the fortification earthquake, and the rare earthquake, was conducted with SAP2000 software. 
For the bottom frame structure, the results show that the seismic response of the masonry floors is significantly reduced after strengthening by the two methods compared to the original masonry structure. Previous earthquake disasters indicated that the bottom frame is vulnerable to serious damage under a strong earthquake. The analysis results showed that under the rare earthquake, the inter-story displacement angle of the bottom frame floor meets the 1/100 limit value of the seismic code. The inter-story drift of the masonry floors for the base-isolated structure under different levels of earthquakes is similar to that of the structure with SRMM, while the base-isolated scheme better protects the bottom frame. Both strengthening methods could significantly improve the seismic performance of the bottom frame structure. Keywords: old buildings, adding story, seismic strengthening, seismic performance
Procedia PDF Downloads 12113043 Design and Implementation of Low-code Model-building Methods
Authors: Zhilin Wang, Zhihao Zheng, Linxin Liu
Abstract:
This study proposes a low-code model-building approach that aims to simplify the development and deployment of artificial intelligence (AI) models. With an intuitive drag-and-drop interface for connecting components, users can easily build complex models and integrate multiple algorithms for training. After training is completed, the system automatically generates a callable model service API. This method not only lowers the technical threshold of AI development and improves development efficiency but also enhances the flexibility of algorithm integration and simplifies the deployment process of models. The core strength of this method lies in its ease of use and efficiency. Users do not need a deep programming background and can complete the design and implementation of complex models with simple drag-and-drop operations. This feature greatly expands the reach of AI technology, allowing more non-technical people to participate in the development of AI models. At the same time, the method performs well in algorithm integration, supporting many different types of algorithms working together, which further improves the performance and applicability of the model. In the experimental part, we performed several performance tests on the method. The results show that, compared with traditional model construction methods, this method makes more efficient use of computing resources and greatly shortens model training time. In addition, the system-generated model service interface has been optimized for high availability and scalability and can adapt to the needs of different application scenarios. Keywords: low-code, model building, artificial intelligence, algorithm integration, model deployment
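The component-assembly idea can be sketched with scikit-learn as a stand-in backend: a declarative spec (the analogue of drag-and-drop wiring) maps to a pipeline, and the fitted pipeline's predict method plays the role of the generated model service. The registry names and spec format here are invented for illustration; the abstract does not describe the system's actual components:

```python
from sklearn.datasets import load_iris
from sklearn.ensemble import RandomForestClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import StandardScaler

# Registry of reusable components a visual editor could expose.
REGISTRY = {
    "scaler": StandardScaler,
    "logreg": LogisticRegression,
    "forest": RandomForestClassifier,
}

def build_pipeline(spec):
    """Assemble a model from an ordered list of component names."""
    return Pipeline([(name, REGISTRY[name]()) for name in spec])

# "Wire up" a two-component model and train it on a sample dataset.
X, y = load_iris(return_X_y=True)
model = build_pipeline(["scaler", "logreg"]).fit(X, y)
print(round(model.score(X, y), 2))  # training accuracy of the assembled model
```

Exposing `model.predict` behind an HTTP endpoint would then correspond to the automatically generated model service API the abstract describes.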
Procedia PDF Downloads 3013042 Prevalence of Knee Pain and Risk Factors and Its Impact on Functional Impairment among Saudi Adolescents
Authors: Ali H.Alyami, Hussam Darraj, Faisal Hakami, Mohammed Awaf, Sulaiman Hamdi, Nawaf Bakri, Abdulaziz Saber, Khalid Hakami, Almuhanad Alyami, Mohammed khashab
Abstract:
Introduction: Adolescents frequently self-report pain, according to epidemiological research. The knee is one of the sites where pain is most common. Musculoskeletal disorders are among the main factors contributing to the number of years people spend disabled and impose substantial personal, societal, and economic burdens globally. Adolescents may have knee pain due to an abrupt, traumatic injury or an insidious, slowly building onset that neither the adolescent nor the parent is aware of. Objectives: The present study's authors aimed to estimate the prevalence of knee pain in Saudi adolescents. Methods: This cross-sectional survey, carried out from June to November 2022, included 676 adolescents aged 10 to 18. Data are presented as frequencies and percentages for categorical variables. Analysis of variance (ANOVA) was used to compare means between groups, while the chi-square test was used for the comparison of categorical variables. Statistical significance was set at P < 0.05. Results: Of the 676 adolescents invited to take part in the study, 57.5% were girls and 42.5% were boys, and 68.8% were aged between 15 and 18. The prevalence of knee pain was considerably higher among females (26%) than among males (19.2%). Moreover, age was a significant predictor of knee pain, as was BMI. Conclusion: Our study noted a high rate of knee pain among adolescents, so awareness of risk factors needs to be raised. Adolescent knee pain can be prevented with conservative methods and some minor lifestyle/activity modifications. Keywords: knee pain, prevalence of knee pain, exercise training, physical activity
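The chi-square comparison described in the methods can be reproduced in outline with SciPy; the cell counts below are back-calculated approximations from the reported percentages (not the study's raw table), so the resulting statistic need not match the study's:

```python
from scipy.stats import chi2_contingency

#                knee pain  no knee pain
table = [[101, 288],   # girls (~26% of ~389)
         [ 55, 232]]   # boys  (~19.2% of ~287)

chi2, p, dof, expected = chi2_contingency(table)
print(f"chi2={chi2:.2f}, p={p:.3f}, dof={dof}")
```

The same call generalizes to any of the categorical comparisons in the study; for the group-mean comparisons, `scipy.stats.f_oneway` provides the ANOVA counterpart.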
Procedia PDF Downloads 11113041 Improvement of Visual Acuity in Patient Undergoing Occlusion Therapy
Authors: Rajib Husain, Mezbah Uddin, Mohammad Shamsal Islam, Rabeya Siddiquee
Abstract:
Purpose: To determine the improvement of visual acuity in patients undergoing occlusion therapy. Methods: This was a prospective hospital-based study of newly diagnosed amblyopia seen at the pediatric clinic of Chittagong Eye Infirmary & Training Complex. Thirty-two refractive amblyopia subjects were examined, and a questionnaire was piloted. Included were all patients diagnosed with refractive amblyopia between 5 and 8 years of age, without previous amblyopia treatment, and whose parents were willing to participate in the study. Patients diagnosed with strabismic amblyopia were excluded. Patients were first given the best refractive correction for a month. When the VA in the amblyopic eye did not improve over that month, occlusion treatment was started. Occlusion was done daily for 6-8 h together with vision therapy and was carried out for three months. Results: Of the 32 children studied, 31 had good compliance with amblyopia treatment, whereas one child had poor compliance. About 6% of the children had amblyopia from myopia, 7% from hyperopia, 32% from myopic astigmatism, 42% from hyperopic astigmatism, and 13% from mixed astigmatism. The mean and standard deviation of the baseline average VA was 0.452 ± 0.275 logMAR, and after the intervention of amblyopia therapy with vision therapy the mean and standard deviation of VA was 0.155 ± 0.157 logMAR. Of the total respondents, 21.85% had BCVA in the range 0-0.2 logMAR, 37.5% in the range 0.22-0.5 logMAR, 35.95% in the range 0.52-0.8 logMAR, and 4.7% in the range 0.82-1 logMAR; after the intervention of occlusion therapy with vision therapy, 76.6% had VA in the range 0-0.2 logMAR, 21.85% in the range 0.22-0.5 logMAR, and 1.5% in the range 0.52-0.8 logMAR. Conclusion: Amblyopia is an important concern in the pediatric age group because it can lead to visual impairment. 
Thus, this study concludes that occlusion therapy with vision therapy is probably one of the best treatment methods for amblyopic patients (age 5-8 years), and that compliance and age were the most critical factors predicting a successful outcome. Keywords: amblyopia, occlusion therapy, vision therapy, eccentric fixation, visuoscopy
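The before/after logMAR summary statistics reported above can be computed with a short script; the paired acuities below are synthetic, chosen only for illustration (not the study's 32 subjects):

```python
import statistics

# Synthetic paired logMAR acuities (baseline, post-therapy); lower logMAR
# means better vision, so improvement = baseline - post.
baseline = [0.8, 0.5, 0.3, 0.6, 0.2, 0.7, 0.4]
post     = [0.3, 0.2, 0.1, 0.2, 0.0, 0.3, 0.1]

improvement = [b - p for b, p in zip(baseline, post)]
print(round(statistics.mean(baseline), 3), round(statistics.stdev(baseline), 3))
print(round(statistics.mean(post), 3), round(statistics.stdev(post), 3))
print(round(statistics.mean(improvement), 3))  # mean logMAR gain
```

Applied to the study's paired data, the same calculation would yield the reported 0.452 ± 0.275 to 0.155 ± 0.157 logMAR change; a paired t-test (e.g., `scipy.stats.ttest_rel`) would quantify its significance.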
Procedia PDF Downloads 50313040 Code Embedding for Software Vulnerability Discovery Based on Semantic Information
Authors: Joseph Gear, Yue Xu, Ernest Foo, Praveen Gauravaran, Zahra Jadidi, Leonie Simpson
Abstract:
Deep learning methods have been seeing increasing application to the long-standing security research goal of automatic vulnerability detection for source code. Attention, however, must still be paid to the task of producing vector representations for source code (code embeddings) as input for these deep learning models. Graphical representations of code, most predominantly Abstract Syntax Trees and Code Property Graphs, have received some use in this task of late; however, for very large graphs representing very large code snippets, learning becomes prohibitively computationally expensive. This expense may be reduced by intelligently pruning this input to only vulnerability-relevant information; however, little research in this area has been performed. Additionally, most existing work comprehends code based solely on the structure of the graph, at the expense of the information contained in the graph's nodes. This paper proposes Semantic-enhanced Code Embedding for Vulnerability Discovery (SCEVD), a deep learning model which uses semantic-based feature selection for its vulnerability classification model. It uses information from the nodes as well as the structure of the code graph in order to select features which are most indicative of the presence or absence of vulnerabilities. This model is implemented and experimentally tested using the SARD Juliet vulnerability test suite to determine its efficacy. It is able to improve on existing code graph feature selection methods, as demonstrated by its improved ability to discover vulnerabilities. Keywords: code representation, deep learning, source code semantics, vulnerability discovery
Procedia PDF Downloads 15813039 Concept of a Pseudo-Lower Bound Solution for Reinforced Concrete Slabs
Authors: M. De Filippo, J. S. Kuang
Abstract:
In the construction industry, reinforced concrete (RC) slabs are fundamental elements of buildings and bridges. Different methods are available for analysing the structural behaviour of slabs. In the early years of the last century, the yield-line method was proposed in an attempt to solve this problem. Problems with simple geometry could easily be solved using traditional hand analyses that incorporate plasticity theory. Nowadays, advanced finite element (FE) analyses have found their way into many engineering fields due to the wide range of geometries to which they can be applied. In such cases, the choice of an elastic or a plastic constitutive model completely changes the approach of the analysis itself. Elastic methods are popular due to their easy applicability to automated computations. However, elastic analyses are limited since they do not consider any aspect of material behaviour beyond the yield limit, which turns out to be an essential aspect of RC structural performance. Non-linear plastic analyses, by contrast, give very reliable results. This type of analysis, however, is computationally quite expensive, i.e., not well suited for solving everyday engineering problems. In recent years, many researchers have worked on filling this gap between easy-to-implement elastic methods and computationally complex plastic analyses. This paper proposes a numerical procedure through which a pseudo-lower-bound solution, not violating the yield criterion, is achieved. The advantages of moment redistribution are taken into account, hence the increase in strength provided by plastic behaviour is considered. The lower-bound solution is improved by detecting over-yielded moments, which are used to artificially rule the redistribution of moments among the rest of the non-yielded elements. The proposed technique obeys Nielsen's yield criterion. 
The outcome of this analysis provides a simple, accurate, and non-time-consuming tool for predicting the lower-bound collapse load of RC slabs. By using this method, structural engineers can find the fracture patterns and ultimate load-bearing capacity. The collapse-triggering mechanism is found by detecting yield-lines. An application to the simple case of a square clamped slab is shown, and a good match was found with the exact values of the collapse load. Keywords: computational mechanics, lower bound method, reinforced concrete slabs, yield-line
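The redistribution rule described above (cap over-yielded moments, spread the excess among the non-yielded elements) can be sketched in a one-dimensional analogue; this toy illustrates the idea only and is not the paper's actual slab formulation or yield criterion:

```python
def redistribute(moments, m_yield):
    """Cap any moment above the yield value and share the excess equally
    among elements still below yield, iterating until none exceeds it."""
    moments = list(moments)
    for _ in range(100):  # safety bound on iterations
        excess = sum(m - m_yield for m in moments if m > m_yield)
        if excess <= 1e-9:
            break
        under = [i for i, m in enumerate(moments) if m < m_yield]
        if not under:  # fully plastic: no capacity left to absorb excess
            break
        moments = [min(m, m_yield) for m in moments]
        for i in under:
            moments[i] += excess / len(under)
    return moments

# One element over-yields by 2; the excess is shared by the other two,
# leaving the total moment (equilibrium) unchanged.
print(redistribute([12.0, 8.0, 6.0], m_yield=10.0))  # [10.0, 9.0, 7.0]
```

Because the total is preserved while no element exceeds yield, the redistributed field stays statically admissible in the spirit of a lower-bound solution.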
Procedia PDF Downloads 17813038 Evaluation of Virtual Reality for the Rehabilitation of Athlete Lower Limb Musculoskeletal Injury: A Method for Obtaining Practitioner’s Viewpoints through Observation and Interview
Authors: Hannah K. M. Tang, Muhammad Ateeq, Mark J. Lake, Badr Abdullah, Frederic A. Bezombes
Abstract:
Based on a theoretical assessment of current literature, virtual reality (VR) could help to treat sporting injuries in a number of ways. However, it is important to obtain rehabilitation specialists’ perspectives in order to design, develop and validate suitable content for a VR application focused on treatment. Subsequently, a one-day observation and interview study focused on the use of VR for the treatment of lower limb musculoskeletal conditions in athletes was conducted at St George’s Park England National Football Centre with rehabilitation specialists. The current paper established the methods suitable for obtaining practitioner’s viewpoints through observation and interview in this context. Particular detail was provided regarding the method of qualitatively processing interview results using the qualitative data analysis software tool NVivo, in order to produce a narrative of overarching themes. The observations and overarching themes identified could be used as a framework and success criteria of a VR application developed in future research. In conclusion, this work explained the methods deemed suitable for obtaining practitioner’s viewpoints through observation and interview. This was required in order to highlight characteristics and features of a VR application designed to treat lower limb musculoskeletal injury of athletes and could be built upon to direct future work.Keywords: athletes, lower-limb musculoskeletal injury, rehabilitation, return-to-sport, virtual reality
Procedia PDF Downloads 25713037 Mayan Culture and Attitudes towards Sustainability
Authors: Sarah Ryu
Abstract:
Agricultural methods and ecological approaches employed by the pre-colonial Mayans may provide valuable insights into forest management and viable alternatives for resource sustainability in the face of major deforestation across Central and South America. Using a combination of observation data collected from the modern indigenous inhabitants near Mixco in Guatemala and historical data, this study was able to create a holistic picture of how the Maya maintained their ecosystems. Surveys and observations were conducted in the field over a period of twelve weeks across two years. Geographic and archaeological data for this area were provided by Guatemalan organizations such as the Universidad de San Carlos de Guatemala. Observations of current indigenous populations around Mixco showed that they adhered to traditional Mayan methods of agriculture, such as terrace construction and arboriculture. Rather than planting one cash crop as was done by the Spanish, indigenous peoples practice agroforestry, cultivating forests that would provide trees for construction material, wild plant foods, habitat for game, and medicinal herbs. The emphasis on biodiversity prevented deforestation and created a sustainable balance between human consumption and forest regrowth. Historical data provided by MayaSim showed that the Mayans successfully maintained their ecosystems from about 800 BCE to 700 CE. When the Mayans practiced natural resource conservation and cultivated a harmonious relationship with the forest around them, they were able to thrive and prosper alongside nature. Having lasted over a thousand years, the Mayan empire provides a valuable lesson in sustainability and human attitudes towards the environment. Keywords: biodiversity, forestry, mayan, sustainability
Procedia PDF Downloads 17713036 CD133 and CD44 - Stem Cell Markers for Prediction of Clinically Aggressive Form of Colorectal Cancer
Authors: Ognen Kostovski, Svetozar Antovic, Rubens Jovanovic, Irena Kostovska, Nikola Jankulovski
Abstract:
Introduction: Colorectal carcinoma (CRC) is one of the most common malignancies in the world. Cancer stem cell (CSC) markers are associated with aggressive cancer types and poor prognosis. The aim of the study was to determine whether the expression of the colorectal cancer stem cell markers CD133 and CD44 could be significant in predicting a clinically aggressive form of CRC. Materials and methods: Our study included ninety patients (n=90) with CRC. Patients were divided into two subgroups: those with metastatic CRC and those with non-metastatic CRC. Tumor samples were analyzed with standard histopathological methods, and then immunohistochemical analysis was performed with monoclonal antibodies against the CD133 and CD44 stem cell markers. Results: High coexpression of CD133 and CD44 was observed in 71.4% of patients with metastatic disease, compared to 37.9% in patients without metastases. Discordant expression of both markers was found in 8% of the subgroup with metastatic CRC and in 13.4% of the subgroup without metastatic CRC. Statistical analyses showed a significant association of increased expression of CD133 and CD44 with disease stage, T category, and N (nodal) status. With multiple regression analysis, the stage of disease was designated as the factor with the greatest statistically significant influence on the expression of CD133 (p < 0.0001) and CD44 (p < 0.0001). Conclusion: Our results suggest that the coexpression of CD133 and CD44 has an important role in predicting a clinically aggressive form of CRC. Both stem cell markers can be routinely implemented in standard histopathological diagnostics and can be useful markers for pre-therapeutic oncology screening. Keywords: colorectal carcinoma, stem cells, CD133+, CD44+
Procedia PDF Downloads 15013035 Drive Sharing with Multimodal Interaction: Enhancing Safety and Efficiency
Authors: Sagar Jitendra Mahendrakar
Abstract:
Exploratory testing is a dynamic and adaptable method of software quality assurance that is frequently praised for its ability to find hidden flaws and improve the overall quality of the product. Instead of using preset test cases, exploratory testing allows testers to explore the software application dynamically. This is in contrast to scripted testing methodologies; exploratory testing relies primarily on tester intuition, creativity, and adaptability. There are several tools and techniques that can aid testers in the exploratory testing process, which we will be discussing in this talk. Tests of this kind can find bugs that are harder to find during structured testing or that other testing methods may have overlooked. The purpose of this abstract is to examine the nature and importance of exploratory testing in modern software development methods. It explores the fundamental ideas of exploratory testing, highlighting the value of domain knowledge and tester experience in spotting possible problems that may escape the notice of traditional testing methodologies. Throughout the software development lifecycle, exploratory testing promotes quick feedback loops and continuous improvement by giving testers the ability to make decisions in real time based on their observations. This abstract also clarifies the unique features of exploratory testing, such as its non-linearity and its capacity to replicate user behavior in real-world settings. Testers can find intricate bugs, usability problems, and edge cases in software through impromptu exploration that might otherwise go undetected. Exploratory testing's flexible and iterative structure fits in well with agile and DevOps processes, allowing for a quicker time to market without sacrificing the quality of the final product. Keywords: exploratory, testing, automation, quality
Procedia PDF Downloads 51