Search results for: mitigation techniques
7043 Enhancing Skills of Mothers of Asthmatic Children in Techniques of Drug Administration
Authors: Erna Judith Roach, Nalini Bhaskaranand
Abstract:
Background & Significance: Asthma is the most common chronic disease among children, and education is the cornerstone of its management. In India there are about 1.5-3.0 million asthmatic children in the age group of 5-11 years. Many parents face dilemmas in administering medications to their children, and mothers, as primary caregivers, are often responsible for drug administration. The purpose of the study was to develop an educational package on techniques of drug administration for mothers of asthmatic children and to determine its effectiveness in terms of improvement in drug administration skills. Methodology: A quasi-experimental time series pre-test post-test control group design was used. Mothers of asthmatic children aged 5 to 12 years attending the paediatric outpatient departments of selected hospitals were included. The sample consisted of 40 mothers in the experimental group and 40 in the control group, assigned by block randomization. The data collection instruments were a Baseline Proforma, a Clinical Proforma, a daily asthma drug intake and symptoms diary, and observation rating scales on the techniques of using a metered dose inhaler with spacer, a metered dose inhaler with facemask, a metered dose inhaler alone, and a dry powder inhaler. The educational package consisted of a video and a booklet on techniques of drug administration. Data were collected at baseline and at 1, 3 and 6 months. Findings: The mean post-test scores in techniques of drug administration were higher than the mean pre-test scores in the experimental group for all techniques. The Friedman test (p < 0.01), Wilcoxon signed rank test (p < 0.008) and Mann-Whitney U test (p < 0.01) showed statistically significant differences between the experimental and control groups. There was a significant decrease in the average number of symptom days (11 vs. 4 days/month) and hospital visits (5 vs. 1 per month) in the experimental group compared to the control group. Conclusion: The educational package was effective in improving mothers' skills in all drug administration techniques, especially the use of a metered dose inhaler with spacer.
Keywords: childhood asthma, drug administration, mothers of children, inhaler
Procedia PDF Downloads 426
7042 A Neural Network Based Clustering Approach for Imputing Multivariate Values in Big Data
Authors: S. Nickolas, Shobha K.
Abstract:
The treatment of incomplete data is an important step in data pre-processing. Missing values create noise in all applications and are an unavoidable problem in big data management and analysis. Numerous techniques, such as discarding rows with missing values, mean imputation, expectation maximization, neural networks with evolutionary or optimization algorithms, and hot deck imputation, have been introduced by researchers for handling missing data. Among these, imputation techniques play a positive role in filling in missing values when it is necessary to use all records in the data rather than discard records with missing values. In this paper we propose a novel artificial neural network based clustering algorithm, Adaptive Resonance Theory-2 (ART2), for the imputation of missing values in mixed-attribute data sets. ART2 can recognize learned models quickly and adapt to new objects rapidly. It carries out model-based clustering using competitive learning and a self-stabilizing mechanism in dynamic environments without supervision. The proposed approach not only imputes the missing values but also provides information about handling outliers.
Keywords: ART2, data imputation, clustering, missing data, neural network, pre-processing
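The cluster-then-fill idea behind approaches like the one above can be sketched in a few lines. This is a hedged illustration only: it uses a simple k-means-style competitive clustering over the observed attributes, not the paper's ART2 network, and all data, seeding, and parameters are made up.

```python
# Illustrative sketch of cluster-based imputation (NOT the paper's ART2
# implementation): records are clustered on their observed attributes,
# then each missing value is filled with its cluster's mean.

def dist(a, b):
    # Mean squared distance over attributes observed in both records.
    pairs = [(x, y) for x, y in zip(a, b) if x is not None and y is not None]
    if not pairs:
        return float("inf")
    return sum((x - y) ** 2 for x, y in pairs) / len(pairs)

def impute_by_cluster(records, k=2, iters=10):
    centers = [list(r) for r in records[:k]]  # naive seeding for the sketch
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for r in records:
            j = min(range(k), key=lambda c: dist(r, centers[c]))
            clusters[j].append(r)
        for j, members in enumerate(clusters):
            for d in range(len(centers[j])):
                vals = [r[d] for r in members if r[d] is not None]
                if vals:
                    centers[j][d] = sum(vals) / len(vals)
    # Fill each missing attribute from the nearest cluster's center.
    filled = []
    for r in records:
        j = min(range(k), key=lambda c: dist(r, centers[c]))
        filled.append([centers[j][d] if v is None else v for d, v in enumerate(r)])
    return filled
```

With two well-separated groups, each missing value is replaced by the mean of the group the record falls into.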
Procedia PDF Downloads 276
7041 Mitigation of Cascading Power Outage Caused by Power Swing Disturbance Using Real-Time DLR Applications
Authors: Dejenie Birile Gemeda, Wilhelm Stork
Abstract:
The power system is one of the most important systems in modern society, and in the view of many power system operators, existing systems are approaching their critical operating limits. With increasing load demand, high-capacity, long transmission networks are widely used to meet requirements. With the integration of renewable energies such as wind and solar, uncertainty and intermittency bring greater challenges to power system operation. These dynamic uncertainties lead to power disturbances, and disturbances in a heavily stressed power system can cause distance relays to mal-operate or raise false alarms during post-fault power oscillations. Such unintended relay operation may propagate and trigger cascaded tripping leading to a total power system blackout, because the relays cannot take an appropriate tripping decision based on the ensuing power swing. According to the N-1 criterion, electric power systems are generally designed to withstand a single failure without violating any operating limit. As a result, some overloaded components, such as overhead transmission lines, can still work for several hours under overload conditions. However, when a large power swing occurs, zone 3 distance relay settings may trip the transmission line with a short time delay, acting so quickly that the system operator has no time to respond and stop the cascading. Misfiring of relays in the absence of a fault due to power swing can cause significant economic losses for power companies. This paper proposes a method to distinguish stable from unstable power swings using dynamic line rating (DLR). As opposed to static line rating (SLR), dynamic line rating supports effective mitigation actions against propagating cascading outages in a power grid. Effective utilization of existing transmission line capacity using machine learning DLR predictions will improve the operating point of distance relay protection, thus reducing unintended power outages due to power swings.
Keywords: blackout, cascading outages, dynamic line rating, power swing, overhead transmission lines
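The gap between static and dynamic line rating that the abstract leans on can be illustrated with a toy steady-state heat balance (loosely in the spirit of IEEE 738, but heavily simplified). Every coefficient below is an illustrative placeholder, not from the paper or any conductor datasheet.

```python
# Hedged sketch of a dynamic line rating (DLR) calculation: the current
# limit is the current at which Joule heating balances convective plus
# radiative cooling minus solar gain. Coefficients are illustrative.
import math

def ampacity(t_cond, t_amb, wind, solar_gain, r_ac):
    # Convective cooling, toy linear model in wind speed (W/m).
    qc = (8.0 + 4.0 * wind) * (t_cond - t_amb)
    # Radiative cooling via Stefan-Boltzmann (emissivity 0.8, D ~ 0.03 m).
    sigma, emiss, diam = 5.67e-8, 0.8, 0.03
    qr = sigma * emiss * math.pi * diam * (
        (t_cond + 273.15) ** 4 - (t_amb + 273.15) ** 4)
    net_cooling = qc + qr - solar_gain           # W per metre of conductor
    return math.sqrt(max(net_cooling, 0.0) / r_ac)

# A static rating assumes worst-case weather (hot, still air); DLR uses
# measured weather, so a cool, breezy day yields a higher usable limit.
static = ampacity(t_cond=75, t_amb=40, wind=0.6, solar_gain=15, r_ac=8e-5)
dynamic = ampacity(t_cond=75, t_amb=25, wind=5.0, solar_gain=15, r_ac=8e-5)
```

The point of the sketch is only the ordering: under benign measured weather the dynamic limit exceeds the conservative static one, which is the headroom DLR exposes to relay coordination.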
Procedia PDF Downloads 146
7040 A Review of Quantitative Psychology in Our Life
Authors: Shubham Tandon, Rajni Goel
Abstract:
The prime objective of our review paper is to study the impact of quantitative psychology on our daily life. Quantitative techniques have been studied with the aim of discovering solutions in an advanced way. To obtain unbiased and correct results, statistics and other useful mathematical methods have been reviewed. Many psychologists use quantitative techniques when working in psychology because these techniques apply precise criteria to understanding a person's mind and condition, which ensures accurate outcomes. Proper experimental and observational tools are also employed to avoid invalid data.
Keywords: quantitative psychology, psychologists, statistics, person, results, minds
Procedia PDF Downloads 108
7039 Anticorrosive Properties of Poly(O-Phenylendiamine)/ZnO Nanocomposites Coated Stainless Steel
Authors: Aisha Ganash
Abstract:
Poly(o-phenylendiamine) and poly(o-phenylendiamine)/ZnO (PoPd/ZnO) nanocomposite coatings were prepared on type-304 austenitic stainless steel (SS) using H2SO4 acid as the electrolyte by potentiostatic methods. Fourier transform infrared spectroscopy and scanning electron microscopy were used to characterize the composition and structure of the PoPd/ZnO nanocomposites. The corrosion protection ability of the polymer coatings was studied by Eocp-time measurement, anodic and cathodic potentiodynamic polarization, and impedance techniques in 3.5% NaCl as a corrosive solution. It was found that ZnO nanoparticles improve the barrier and electrochemical anticorrosive properties of poly(o-phenylendiamine).
Keywords: anticorrosion, conducting polymers, electrochemistry, nanocomposites
Procedia PDF Downloads 296
7038 Mining Big Data in Telecommunications Industry: Challenges, Techniques, and Revenue Opportunity
Authors: Hoda A. Abdel Hafez
Abstract:
Mining big data represents a big challenge nowadays. Much research is concerned with mining massive amounts of data and big data streams. Mining big data faces many challenges, including scalability, speed, heterogeneity, accuracy, provenance and privacy. In the telecommunication industry, mining big data is like mining for gold: it represents a big opportunity for maximizing revenue streams. This paper discusses the characteristics of big data (volume, variety, velocity and veracity), data mining techniques and tools for handling very large data sets, mining big data in telecommunication, and the benefits and opportunities gained from them.
Keywords: mining big data, big data, machine learning, telecommunication
Procedia PDF Downloads 413
7037 Fapitow: An Advanced AI Agent for Travel Agent Competition
Authors: Faiz Ul Haque Zeya
Abstract:
In this paper, Fapitow's bidding strategy and approach to participating in the Travel Agent Competition (TAC) are described. Fapitow was initially built on the agents provided by the TAC team, and our strategy was mainly developed by modifying them. Later, by observing the agent's behavior, we devised strategies that became the main source of the agent's improved utilities; theoretical examination indicated that these strategies would yield a significant performance improvement, which was later confirmed by the agent's performance in the games. Techniques and strategies for further possible improvement are also described. TAC provides a real-time, uncertain environment for learning, experimenting with, and implementing various AI techniques. Some lessons learned about handling uncertain environments are also presented.
Keywords: agent, travel agent competition, bidding, TAC
Procedia PDF Downloads 117
7036 A Comprehensive Study of Spread Models of Wildland Fires
Authors: Manavjit Singh Dhindsa, Ursula Das, Kshirasagar Naik, Marzia Zaman, Richard Purcell, Srinivas Sampalli, Abdul Mutakabbir, Chung-Horng Lung, Thambirajah Ravichandran
Abstract:
These days, wildland fires, also known as forest fires, are more prevalent than ever. Wildfires have major repercussions that affect ecosystems, communities, and the environment in several ways. Wildfires lead to habitat destruction and biodiversity loss, affecting ecosystems and causing soil erosion. They also contribute to poor air quality by releasing smoke and pollutants that pose health risks, especially for individuals with respiratory conditions. Wildfires can damage infrastructure, disrupt communities, and cause economic losses. The economic impact of firefighting efforts, combined with their direct effects on forestry and agriculture, causes significant financial difficulties for the areas impacted. This research explores different forest fire spread models and presents a comprehensive review of various techniques and methodologies used in the field. A forest fire spread model is a computational or mathematical representation that is used to simulate and predict the behavior of a forest fire. By applying scientific concepts and data from empirical studies, these models attempt to capture the intricate dynamics of how a fire spreads, taking into consideration a variety of factors like weather patterns, topography, fuel types, and environmental conditions. These models assist authorities in understanding and forecasting the potential trajectory and intensity of a wildfire. Emphasizing the need for a comprehensive understanding of wildfire dynamics, this research explores the approaches, assumptions, and findings derived from various models. By using a comparison approach, a critical analysis is provided by identifying patterns, strengths, and weaknesses among these models. The purpose of the survey is to further wildfire research and management techniques. Decision-makers, researchers, and practitioners can benefit from the useful insights that are provided by synthesizing established information. 
Fire spread models provide insights into potential fire behavior, enabling authorities to make informed decisions about evacuation activities, allocating resources for fire-fighting efforts, and planning preventive actions. Wildfire spread models are also useful in post-wildfire mitigation strategies, as they help in assessing the fire's severity, determining high-risk regions for post-fire dangers, and forecasting soil erosion trends. The analysis highlights the importance of customized modeling approaches for various circumstances and advances our understanding of the way forest fires spread. Some of the known models in this field are Rothermel's wildland fuel model, FARSITE, WRF-SFIRE, FIRETEC, FlamMap, FSPro, and cellular automata models, among others. The key characteristics these models consider include weather (factors such as wind speed and direction), topography (factors like landscape elevation), and fuel availability (factors like vegetation type), among others. The models discussed are physics-based, data-driven, or hybrid; some also utilize ML techniques, such as attention-based neural networks, to enhance model performance. In order to lessen the destructive effects of forest fires, this initiative aims to promote the development of more precise prediction tools and effective management techniques. The survey expands its scope to address the practical needs of numerous stakeholders: access to enhanced early warning systems enables decision-makers to take prompt action, and emergency responders benefit from improved resource allocation strategies, strengthening the efficacy of firefighting efforts.
Keywords: artificial intelligence, deep learning, forest fire management, fire risk assessment, fire simulation, machine learning, remote sensing, wildfire modeling
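Of the model families listed above, the cellular automata approach is the easiest to sketch. The probabilities and wind bias below are illustrative only, not calibrated to any fuel model surveyed in the abstract.

```python
# Minimal cellular-automata fire spread sketch: grid cells are fuel,
# burning, or burnt; each step, a burning cell becomes burnt and may
# ignite fuel neighbours with a probability biased toward an assumed
# wind direction.
import random

FUEL, BURNING, BURNT = 0, 1, 2

def step(grid, p_base=0.45, wind=(0, 1), wind_bonus=0.3, rng=random.random):
    rows, cols = len(grid), len(grid[0])
    nxt = [row[:] for row in grid]
    for r in range(rows):
        for c in range(cols):
            if grid[r][c] != BURNING:
                continue
            nxt[r][c] = BURNT  # a burning cell burns out after one step
            for dr in (-1, 0, 1):
                for dc in (-1, 0, 1):
                    rr, cc = r + dr, c + dc
                    if (dr, dc) == (0, 0) or not (0 <= rr < rows and 0 <= cc < cols):
                        continue
                    if grid[rr][cc] == FUEL:
                        p = p_base + (wind_bonus if (dr, dc) == wind else 0.0)
                        if rng() < p:
                            nxt[rr][cc] = BURNING
    return nxt

def simulate(grid, steps):
    for _ in range(steps):
        grid = step(grid)
    return grid
```

Starting a single ignition in the middle of a fuel grid and stepping the automaton forward yields an irregular, wind-skewed burn scar, which is the qualitative behavior these models trade on.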
Procedia PDF Downloads 85
7035 Rapid Identification and Diagnosis of the Pathogenic Leptospiras through Comparison among Culture, PCR and Real Time PCR Techniques from Samples of Human and Mouse Feces
Authors: S. Rostampour Yasouri, M. Ghane, M. Doudi
Abstract:
Leptospirosis is one of the most significant infectious and zoonotic diseases, with global spread. The disease causes economic losses and human fatalities in various countries, including the northern provinces of Iran. The aim of this research is to identify and compare rapid diagnostic techniques for pathogenic leptospiras, considering the multifaceted clinical manifestations of the disease and the premature death of patients. In the spring and summer of 2020-2022, 25 fecal samples were collected from suspected leptospirosis patients and 25 fecal samples from mice residing in the rice fields and factories of Tonekabon city. Samples were prepared by centrifugation and passage through membrane filters. The culture technique used liquid and solid EMJH media with one month of incubation at 30°C, after which the media were examined microscopically. DNA extraction was conducted with an extraction kit. Leptospiras were identified by PCR and real time PCR (SYBR Green) techniques using a lipL32-specific primer. Among the patients, 11 samples (44%) and 8 samples (32%) were determined to be pathogenic Leptospira by real time PCR and PCR, respectively. Among the mice, 9 samples (36%) and 3 samples (12%) were determined to be pathogenic Leptospira by the same techniques, respectively. Although culture is considered the gold standard, it is not a fast technique, owing to the slow growth of pathogenic Leptospira and the lack of colony formation by some species. Real time PCR allowed rapid diagnosis with much higher accuracy than PCR, because PCR could not reliably identify samples with a lower microbial load.
Keywords: culture, pathogenic leptospiras, PCR, real time PCR
Procedia PDF Downloads 89
7034 Strategic Cyber Sentinel: A Paradigm Shift in Enhancing Cybersecurity Resilience
Authors: Ayomide Oyedele
Abstract:
In the dynamic landscape of cybersecurity, "Strategic Cyber Sentinel" emerges as a revolutionary framework, transcending traditional approaches. This paper pioneers a holistic strategy, weaving together threat intelligence, machine learning, and adaptive defenses. Through meticulous real-world simulations, we demonstrate the unprecedented resilience of our framework against evolving cyber threats. "Strategic Cyber Sentinel" redefines proactive threat mitigation, offering a robust defense architecture poised for the challenges of tomorrow.
Keywords: cybersecurity, resilience, threat intelligence, machine learning, adaptive defenses
Procedia PDF Downloads 87
7033 Optimized Techniques for Reducing the Reactive Power Generation in Offshore Wind Farms in India
Authors: Pardhasaradhi Gudla, Imanual A.
Abstract:
Electrical power generated offshore must be transmitted to the onshore grid using subsea cables. Long subsea cables produce reactive power, which should be compensated in order to limit transmission losses, optimize the transmission capacity, and keep the grid voltage within safe operational limits. The installation cost of a wind farm includes the structural design cost and the electrical system cost. India has targeted 175 GW of renewable energy capacity by 2022, including offshore wind power generation. Because sea depths around India are greater, installation costs will be higher than in European countries, where offshore wind energy is already being generated successfully, so innovations are required to reduce offshore wind project costs. This paper presents optimized techniques to reduce the installation cost of an offshore wind farm with respect to the electrical transmission system. It describes techniques for increasing the current carrying capacity of a subsea cable by decreasing its reactive power generation (capacitance effect). Many methods for reactive power compensation in wind power plants are already in use, and the main reason compensation is needed is the capacitance effect of the subsea cable. If the cable capacitance is diminished, the required reactive power compensation can be reduced or optimized by avoiding an intermediate substation at the midpoint of the transmission network.
Keywords: offshore wind power, optimized techniques, power system, subsea cable
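The capacitance effect the abstract describes can be made concrete with a back-of-the-envelope calculation: the cable's shunt capacitance draws a charging current that grows with route length and consumes part of the cable's ampacity. The figures below are illustrative round numbers, not from any cable datasheet or from the paper.

```python
# Hedged sketch: charging current and reactive power of a 3-phase AC
# subsea export cable, and the current left over for active power.
import math

def charging_metrics(v_kv, f_hz, cap_uf_per_km, length_km, ampacity_a):
    c_total = cap_uf_per_km * 1e-6 * length_km             # total shunt C, F
    v_phase = v_kv * 1e3 / math.sqrt(3)                    # phase voltage, V
    i_charge = 2 * math.pi * f_hz * c_total * v_phase      # A per phase
    q_mvar = 3 * v_phase * i_charge / 1e6                  # 3-phase MVAr
    i_left = max(ampacity_a ** 2 - i_charge ** 2, 0.0) ** 0.5
    return i_charge, q_mvar, i_left

# Hypothetical 220 kV, 50 Hz cable at ~0.18 uF/km, two route lengths.
short = charging_metrics(220, 50, 0.18, 30, ampacity_a=800)
long_ = charging_metrics(220, 50, 0.18, 120, ampacity_a=800)
```

At four times the length, the charging current roughly quadruples; in this toy example it exceeds the assumed ampacity, leaving no capacity for active power, which is exactly why lower-capacitance cables or mid-route compensation matter.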
Procedia PDF Downloads 198
7032 Electrocardiogram Signal Denoising Using a Hybrid Technique
Authors: R. Latif, W. Jenkal, A. Toumanari, A. Hatim
Abstract:
This paper presents an efficient method of electrocardiogram signal denoising based on a hybrid approach. Two techniques are brought together to create an efficient denoising process: an Adaptive Dual Threshold Filter (ADTF) and the Discrete Wavelet Transform (DWT). The presented approach comprises three denoising steps: DWT decomposition, the ADTF step, and a highest-peaks correction step. The paper presents applications of the approach to electrocardiogram signals from the MIT-BIH database. The results of these applications are promising compared to other recently published techniques.
Keywords: hybrid technique, ADTF, DWT, thresholding, ECG signal
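The wavelet half of such a hybrid scheme can be sketched as follows. This is a generic one-level Haar DWT with a universal soft threshold, not the paper's ADTF stage or peak correction; it only illustrates the DWT-plus-thresholding step.

```python
# Sketch of wavelet-shrinkage denoising: one-level Haar DWT,
# soft-threshold the detail coefficients, inverse transform.
import math

def haar_dwt(x):
    approx = [(x[i] + x[i + 1]) / math.sqrt(2) for i in range(0, len(x) - 1, 2)]
    detail = [(x[i] - x[i + 1]) / math.sqrt(2) for i in range(0, len(x) - 1, 2)]
    return approx, detail

def haar_idwt(approx, detail):
    out = []
    for a, d in zip(approx, detail):
        out.append((a + d) / math.sqrt(2))
        out.append((a - d) / math.sqrt(2))
    return out

def soft(v, t):
    # Soft thresholding: shrink toward zero, kill small coefficients.
    return math.copysign(max(abs(v) - t, 0.0), v)

def denoise(x):
    approx, detail = haar_dwt(x)
    # Universal threshold from the median absolute detail coefficient.
    mad = sorted(abs(d) for d in detail)[len(detail) // 2]
    sigma = mad / 0.6745
    t = sigma * math.sqrt(2 * math.log(len(x)))
    detail = [soft(d, t) for d in detail]
    return haar_idwt(approx, detail)
```

On a smooth test signal with high-frequency noise, the thresholded reconstruction lands closer to the clean signal than the noisy input does; a real ECG pipeline would use more decomposition levels and, as in the paper, correct the QRS peaks afterwards.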
Procedia PDF Downloads 326
7031 An Experimental Study of Iron Smelting Techniques Used in the South East Rajasthan, with Special Reference to Nathara-Ki-Pal, Udaipur
Authors: Udaya Kumar
Abstract:
The aim of this paper is to discuss recent research conducted in experimental studies related to the process of iron smelting. The paper discusses issues related to the selection of iron ore, the structure of the furnace, the making of tuyeres, the fashioning of blowers, and firing temperatures, through experiments conducted recently and scientific analyses of the experimental work. Experiments were conducted in order to investigate iron smelting techniques used at the Early Historic site of Nathara-Ki-Pal (73°47′E; 24°16′N), located about 70 km south-east of Udaipur city in the foothills of the Aravallis. Iron ore and iron slag can be seen on the surface of the site. The remains of four broken furnaces were recovered during excavations in 2007 and 2008, conducted by Prof. Pandey from the Department of Archaeology of the Institute of Rajasthan Studies, Rajasthan Vidyapeeth University. This shows that the site of Nathara-Ki-Pal was a center of iron smelting. Results of experiments performed both in the field (a reconstruction of a bloomery furnace) and in the laboratory are discussed.
Keywords: experimental studies, furnace, smelting techniques, making of tuyeres
Procedia PDF Downloads 192
7030 Near-Infrared Hyperspectral Imaging Spectroscopy to Detect Microplastics and Pieces of Plastic in Almond Flour
Authors: H. Apaza, L. Chévez, H. Loro
Abstract:
Plastic and microplastic pollution in the human food chain is a serious problem for human health that requires more elaborate techniques capable of identifying their presence in different kinds of food. Hyperspectral imaging is an optical technique that can detect the presence of different elements in an image and can be used to detect plastics and microplastics in a scene. Doing so requires statistical techniques that need to be evaluated and compared in order to find the most efficient ones. In this work, two problems related to the presence of plastics are addressed: the first is to detect and identify pieces of plastic immersed in almond seeds, and the second is to detect and quantify microplastic in almond flour. To do this, we analyze hyperspectral images taken in the range of 900 to 1700 nm using four unmixing techniques: least squares unmixing (LSU), non-negatively constrained least squares unmixing (NCLSU), fully constrained least squares unmixing (FCLSU), and scaled constrained least squares unmixing (SCLSU). The NCLSU, FCLSU, and SCLSU techniques manage to find the region where the plastic is found and also to quantify the amount of microplastic contained in the almond flour. The SCLSU technique estimated an abundance of 13.03% microplastics and 86.97% almond flour, compared to the 16.66% microplastics and 83.33% almond flour prepared for the experiment. The results show the feasibility of applying near-infrared hyperspectral image analysis to the detection of plastic contaminants in food.
Keywords: food, plastic, microplastic, NIR hyperspectral imaging, unmixing
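The linear unmixing family used here can be illustrated for two endmembers, where the least-squares abundances follow from 2x2 normal equations. The clip-and-renormalise step below is only a crude stand-in for the non-negativity and sum-to-one constraints of NCLSU/FCLSU, and the reflectance spectra are made-up illustrations, not measured data.

```python
# Sketch of linear spectral unmixing with two endmembers: each pixel
# spectrum is modelled as abundance1*e1 + abundance2*e2, solved by
# least squares (LSU), then crudely constrained.

def unmix_two(pixel, e1, e2):
    # Normal equations (E^T E) f = E^T p for a 2-column endmember matrix.
    a11 = sum(x * x for x in e1)
    a12 = sum(x * y for x, y in zip(e1, e2))
    a22 = sum(y * y for y in e2)
    b1 = sum(x * p for x, p in zip(e1, pixel))
    b2 = sum(y * p for y, p in zip(e2, pixel))
    det = a11 * a22 - a12 * a12
    f1 = (b1 * a22 - b2 * a12) / det
    f2 = (b2 * a11 - b1 * a12) / det
    # Crude stand-in for non-negativity and sum-to-one constraints.
    f1, f2 = max(f1, 0.0), max(f2, 0.0)
    total = (f1 + f2) or 1.0
    return f1 / total, f2 / total

flour = [0.80, 0.70, 0.60, 0.55]    # hypothetical reflectance spectrum
plastic = [0.20, 0.45, 0.75, 0.30]  # hypothetical reflectance spectrum
mixed = [0.8 * a + 0.2 * b for a, b in zip(flour, plastic)]
abundances = unmix_two(mixed, flour, plastic)
```

Because the synthetic pixel is an exact 80/20 mixture of two linearly independent spectra, the solver recovers those abundances; real NIR pixels add noise, so the constrained variants in the abstract matter in practice.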
Procedia PDF Downloads 134
7029 Comparative Study of Various Treatment Positioning Techniques: A Site-Specific Study - Ca. Breast
Authors: Kamal Kaushik, Dandpani Epili, Ajay G. V., Ashutosh, S. Pradhaan
Abstract:
Introduction: Radiation therapy has come a long way over the decades, from 2-dimensional radiotherapy to intensity-modulated radiation therapy (IMRT) and VMAT. Advanced radiation therapy requires better patient position reproducibility to deliver precise, high-quality treatment, which raises the need for better image guidance technologies for precise patient positioning. This study presents a two-tattoo simulation with roll correction, a technique comparable to other advanced patient positioning techniques. Objective: This site-specific study aims to compare various treatment positioning techniques used for patients with Ca-breast undergoing radiotherapy. We compare five positioning methods: i) Vacloc with 3 tattoos, ii) breast board with 3 tattoos, iii) thermoplastic cast with 3 fiducials, iv) breast board with thermoplastic mask and 3 tattoos, and v) breast board with 2 tattoos (the roll correction method). Methods and material: All-in-one (AIO) solution immobilization was used in all positioning techniques. The two-tattoo simulation involves positioning the patient with the help of a thoracic-abdomen wedge, armrest and knee rest, after which two tattoos are marked on the treatment side of the patient. Fiducials are then placed as per the clinical border markers: (1) sternal notch (lower border of the clavicle head), (2) 2 cm below the contralateral breast, (3) midline between markers 1 and 2, and (4) mid-axillary, on the same axis as marker 3. During plan implementation, a roll depth correction is applied as per the anterior and lateral positioning tattoos, followed by the shifts required for the isocentre position. The shifts are then verified by SSD on the patient surface, followed by radiographic verification using cone beam computed tomography (CBCT). Results: When all five positioning techniques were compared, the observations clearly suggest that the average longitudinal shifts in the two-tattoo roll correction technique are smaller than in every other positioning technique, while the vertical and lateral shifts are comparable to other modern positioning techniques. Conclusion: The two-tattoo simulation with roll correction provides a better patient setup with a technique that can be implemented easily in most radiotherapy centers across developing nations where 3D verification techniques are not available with the delivery units, as the observed shifts are minimal and comparable to those achieved with Vacloc and modern amenities.
Keywords: Ca. breast, breast board, roll correction technique, CBCT
Procedia PDF Downloads 140
7028 Multi-Cluster Overlapping K-Means Extension Algorithm (MCOKE)
Authors: Said Baadel, Fadi Thabtah, Joan Lu
Abstract:
Clustering involves the partitioning of n objects into k clusters. Many clustering algorithms use hard-partitioning techniques, where each object is assigned to one cluster. In this paper, we propose an overlapping algorithm, MCOKE, which allows objects to belong to one or more clusters. The algorithm differs from fuzzy clustering techniques because objects that overlap are assigned a membership value of 1 (one) rather than a fuzzy membership degree. It also differs from other overlapping algorithms that require a similarity threshold to be defined a priori, which can be difficult for novice users to determine.
Keywords: data mining, k-means, MCOKE, overlapping
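One reading of the idea in this abstract can be sketched as plain k-means followed by an overlap pass that grants full membership (1, not a fuzzy degree) in every cluster whose centroid lies within a global distance bound derived from the k-means result itself, so no threshold has to be supplied a priori. Details beyond the abstract, including the exact bound, are guesses.

```python
# Sketch of an overlapping k-means extension in the spirit of MCOKE:
# hard k-means first, then objects may also join any cluster whose
# centroid is no farther than the largest assigned-centroid distance.

def dist2(p, q):
    return sum((a - b) ** 2 for a, b in zip(p, q))

def kmeans(points, centers, iters=20):
    k = len(centers)
    for _ in range(iters):
        groups = [[] for _ in range(k)]
        for p in points:
            j = min(range(k), key=lambda c: dist2(p, centers[c]))
            groups[j].append(p)
        centers = [
            [sum(col) / len(g) for col in zip(*g)] if g else centers[j]
            for j, g in enumerate(groups)
        ]
    return centers

def mcoke_sketch(points, centers):
    centers = kmeans(points, centers)
    # Global bound: the worst object-to-own-centroid distance (a guess
    # at the paper's automatic threshold).
    maxdist = max(min(dist2(p, c) for c in centers) for p in points)
    membership = [
        [1 if dist2(p, c) <= maxdist else 0 for c in centers] for p in points
    ]
    return centers, membership
```

A point sitting midway between two clusters ends up with membership 1 in both, while interior points stay hard-assigned.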
Procedia PDF Downloads 578
7027 Generating Individualized Wildfire Risk Assessments Utilizing Multispectral Imagery and Geospatial Artificial Intelligence
Authors: Gus Calderon, Richard McCreight, Tammy Schwartz
Abstract:
Forensic analysis of community wildfire destruction in California has shown that reducing or removing flammable vegetation in proximity to buildings and structures is one of the most important wildfire defenses available to homeowners. State laws specify the requirements for homeowners to create and maintain defensible space around all structures. Unfortunately, this decades-long effort had limited success due to noncompliance and minimal enforcement. As a result, vulnerable communities continue to experience escalating human and economic costs along the wildland-urban interface (WUI). Quantifying vegetative fuels at both the community and parcel scale requires detailed imaging from an aircraft with remote sensing technology to reduce uncertainty. FireWatch has been delivering high spatial resolution (5” ground sample distance) wildfire hazard maps annually to the community of Rancho Santa Fe, CA, since 2019. FireWatch uses a multispectral imaging system mounted onboard an aircraft to create georeferenced orthomosaics and spectral vegetation index maps. Using proprietary algorithms, the vegetation type, condition, and proximity to structures are determined for 1,851 properties in the community. Secondary data processing combines object-based classification of vegetative fuels, assisted by machine learning, to prioritize mitigation strategies within the community. The remote sensing data for the 10 sq. mi. community is divided into parcels and sent to all homeowners in the form of defensible space maps and reports. Follow-up aerial surveys are performed annually using repeat station imaging of fixed GPS locations to address changes in defensible space, vegetation fuel cover, and condition over time. These maps and reports have increased wildfire awareness and mitigation efforts from 40% to over 85% among homeowners in Rancho Santa Fe. 
To assist homeowners fighting increasing insurance premiums and non-renewals, FireWatch has partnered with Black Swan Analytics, LLC, to leverage the multispectral imagery and increase homeowners' understanding of wildfire risk drivers. For this study, a subsample of 100 parcels was selected to gain a comprehensive understanding of wildfire risk and the elements that can be mitigated. Geospatial data from FireWatch's defensible space maps was combined with Black Swan's patented approach using 39 other risk characteristics into a 4score Report. The 4score Report helps property owners understand risk sources and potential mitigation opportunities by assessing four categories of risk: fuel sources, ignition sources, susceptibility to loss, and hazards to fire protection efforts (FISH). This study has shown that susceptibility to loss is the category on which residents and property owners must focus their efforts. The 4score Report also provides a tool to measure the impact of homeowner actions on risk levels over time. Resiliency is the only solution to breaking the cycle of community wildfire destruction, and it starts with high-quality data and education.
Keywords: defensible space, geospatial data, multispectral imaging, Rancho Santa Fe, susceptibility to loss, wildfire risk
Procedia PDF Downloads 111
7026 Performance Analysis of Vision-Based Transparent Obstacle Avoidance for Construction Robots
Authors: Siwei Chang, Heng Li, Haitao Wu, Xin Fang
Abstract:
Construction robots are receiving more and more attention as a promising solution to the manpower shortage in the construction industry. The development of intelligent control techniques that help robots avoid transparent and reflective building obstacles is crucial for guaranteeing the adaptability and flexibility of mobile construction robots in complex construction environments. With the boom in computer vision, a number of studies have proposed vision-based methods for transparent obstacle avoidance to improve operation accuracy. However, vision-based methods are also associated with disadvantages such as high computational cost. To provide better perception and value evaluation, this study analyzes the performance of vision-based techniques for avoiding transparent building obstacles. To achieve this, commonly used sensors, including a lidar, an ultrasonic sensor, and a USB camera, are mounted on the robotic platform to detect obstacles. A Raspberry Pi 3 computer board runs the data collection and control algorithms, and a TurtleBot3 Burger is used to test the programs. On-site experiments are carried out to observe performance in terms of success rate and detection distance, with obstacle shape and environmental conditions as control variables. The findings demonstrate how effective vision-based strategies are at avoiding transparent building obstacles and provide insights and informed knowledge for introducing computer vision techniques in this domain.
Keywords: construction robot, obstacle avoidance, computer vision, transparent obstacle
Procedia PDF Downloads 83
7025 Intrusion Detection and Prevention System (IDPS) in Cloud Computing Using Anomaly-Based and Signature-Based Detection Techniques
Authors: John Onyima, Ikechukwu Ezepue
Abstract:
Virtualization and cloud computing are among the fastest-growing computing innovations in recent times. Organisations all over the world are moving their computing services towards the cloud because of its rapid transformation of the organisation’s infrastructure, improvement of efficient resource utilization, and cost reduction. However, this technology brings new security threats and challenges concerning safety, reliability, and data confidentiality. Evidently, no single security technique can guarantee protection against malicious attacks on a cloud computing network; hence, an integrated model of an intrusion detection and prevention system has been proposed. Anomaly-based and signature-based detection techniques are integrated to enable the network and its hosts to defend themselves with some level of intelligence. The anomaly-based detection was implemented using the local deviation factor graph-based (LDFGB) algorithm, while the signature-based detection was implemented using Snort. Results from these collaborative intrusion detection and prevention techniques show a robust and efficient security architecture for cloud computing networks.
Keywords: anomaly-based detection, cloud computing, intrusion detection, intrusion prevention, signature-based detection
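The abstract names the LDFGB algorithm for the anomaly-based layer but contains no code. The sketch below illustrates the general idea behind local-deviation anomaly scoring with a simplified k-nearest-neighbour deviation score; it is not the authors' LDFGB implementation, and the traffic feature values are hypothetical.

```python
import math

def knn_distances(points, idx, k):
    """Distances from points[idx] to its k nearest neighbours."""
    dists = sorted(
        math.dist(points[idx], p)
        for i, p in enumerate(points) if i != idx
    )
    return dists[:k]

def local_deviation_score(points, idx, k=3):
    """Ratio of a point's mean k-NN distance to the average mean
    k-NN distance over the whole set; values well above 1 flag anomalies."""
    own = sum(knn_distances(points, idx, k)) / k
    overall = sum(
        sum(knn_distances(points, i, k)) / k for i in range(len(points))
    ) / len(points)
    return own / overall

# Normal traffic feature vectors cluster together; one point is far away.
traffic = [(1, 1), (1.2, 0.9), (0.8, 1.1), (1.1, 1.0), (9, 9)]
scores = [local_deviation_score(traffic, i) for i in range(len(traffic))]
# The outlier (9, 9) receives by far the highest score.
```

In a deployed IDPS, scores above a tuned threshold would trigger an alert and be cross-checked against the signature layer.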
Procedia PDF Downloads 313
7024 Diplomatic Public Relations Techniques for Official Recognition of Palestine State in Europe
Authors: Bilgehan Gultekin, Tuba Gultekin
Abstract:
Diplomatic public relations offers an ideal framework for the recognition of the Palestinian state across Europe. The first step towards official recognition is approval of the Palestinian state in international political organisations such as the United Nations and NATO, so diplomatic public relations provides a recognition process at the communication scale. One of the aims of the study titled 'Diplomatic Public Relations Techniques for Recognition of Palestine State in Europe' is to present communication projects conducted through diplomatic channels. The study also aims at showing the communication process at the diplomatic level, the most important form of which is society-based diplomacy. Moreover, the study provides a wider perspective that offers creative diplomatic communication strategies for attracting society. Persuading the public to support official recognition is also a key element of this process. The study also identifies new communication routes, including persuasion techniques aimed at society. All creative projects are supporting parts of the overall persuasive process for official recognition of Palestine.
Keywords: diplomatic public relations, diplomatic communication strategies, diplomatic communication, public relations
Procedia PDF Downloads 457
7023 Extraction of Squalene from Lebanese Olive Oil
Authors: Henri El Zakhem, Christina Romanos, Charlie Bakhos, Hassan Chahal, Jessica Koura
Abstract:
Squalene is a valuable component of olive oil, composed of 30 carbon atoms, and is mainly used in cosmetic materials. The main concern of this article is to study the squalene content of Lebanese olive oil and to compare it with results for foreign oils. To our knowledge, extraction of squalene from Lebanese olive oil has not been conducted before. Three different techniques were studied, and experiments were performed on three brands of olive oil: Al Wadi Al Akhdar, Virgo Bio, and Boulos. The techniques performed were fractional crystallization, Soxhlet extraction, and esterification. Comparing the results shows that Lebanese oil contains squalene and that the Soxhlet method is the most effective of the three, extracting about 6.5E-04 grams of squalene per gram of olive oil.
Keywords: squalene, extraction, crystallization, Soxhlet
Procedia PDF Downloads 523
7022 Urban Noise and Air Quality: Correlation between Air and Noise Pollution; Sensors, Data Collection, Analysis and Mapping in Urban Planning
Authors: Massimiliano Condotta, Paolo Ruggeri, Chiara Scanagatta, Giovanni Borga
Abstract:
Architects and urban planners, when designing and renewing cities, have to face a complex set of problems, including the issues of noise and air pollution, which are considered hot topics (i.e., the Clean Air Act of London and the Soundscape definition). It is usually taken for granted that these problems go together, because the noise pollution present in cities is often linked to traffic and industries, and these produce air pollutants as well. Traffic congestion can create both noise pollution and air pollution, because NO₂ is mostly created from the oxidation of NO, and these two are notoriously produced by processes of combustion at high temperatures (i.e., car engines or thermal power stations); the same holds for industrial plants. What has to be investigated, and is the topic of this paper, is whether or not there really is a correlation between noise pollution and air pollution (taking NO₂ into account) in urban areas. To evaluate whether there is a correlation, some low-cost methodologies will be used. For noise measurements, the OpeNoise app will be installed on an Android phone. The smartphone will be positioned inside a waterproof box, to stay outdoors, with an external battery to allow it to collect data continuously. The box will have a small hole for an external microphone, connected to the smartphone and calibrated to collect the most accurate data. For air pollution measurements, the AirMonitor device will be used: an Arduino board to which the sensors and all the other components are plugged. After assembling the sensors, they will be coupled (one noise and one air sensor) and placed in different critical locations in the area of Mestre (Venice) to map the existing situation. The sensors will collect data for a fixed period of time so as to have input for both weekdays and weekend days; in this way it will be possible to see how the situation changes during the week.
The novelty is that the data will be compared to check whether there is a correlation between the two pollutants, using graphs that show the percentage of pollution instead of the raw sensor values. To do so, the data will be converted to fit a scale that goes up to 100% and will be shown through a mapping of the measurements using GIS methods. Another relevant aspect is that this comparison can help in choosing the right mitigation solutions for the area under analysis, because it makes it possible to address both the noise and the air pollution problem with a single intervention. The mitigation solutions must consider not only the health aspect but also how to create a more livable space for citizens. The paper will describe in detail the methodology and the technical solutions adopted for the realization of the sensors, the data collection, and the noise and pollution mapping and analysis.
Keywords: air quality, data analysis, data collection, NO₂, noise mapping, noise pollution, particulate matter
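The comparison step described above, rescaling both pollutant series to a common 0-100% scale and then checking for correlation, can be sketched in a few lines. The readings below are hypothetical values for one paired sensor location, not data from the paper.

```python
def to_percent(values):
    """Rescale a series to 0-100 % of its own min-max range."""
    lo, hi = min(values), max(values)
    return [100.0 * (v - lo) / (hi - lo) for v in values]

def pearson(x, y):
    """Pearson correlation coefficient between two equal-length series."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sum((a - mx) ** 2 for a in x) ** 0.5
    sy = sum((b - my) ** 2 for b in y) ** 0.5
    return cov / (sx * sy)

# Hypothetical hourly readings from one coupled noise/NO2 station:
noise_db = [55, 62, 70, 68, 58, 52]   # dB(A)
no2_ugm3 = [24, 35, 48, 44, 30, 21]   # ug/m3

noise_pct = to_percent(noise_db)
no2_pct = to_percent(no2_ugm3)
r = pearson(noise_pct, no2_pct)  # close to 1 -> strong correlation
```

Note that Pearson's r is invariant under the linear rescaling, so the percentage conversion serves the comparability of the maps and graphs rather than the correlation test itself.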
Procedia PDF Downloads 214
7021 Constructability Driven Engineering in Oil and Gas Projects
Authors: Srikanth Nagarajan, P. Parthasarathy, Frits Lagers
Abstract:
Lower crude oil prices have increased the pressure on oil and gas projects, and being competitive has become critical for success in the industry. The larger the project, the greater the magnitude of the issue. Timely completion of projects within budget and schedule is essential for any project to succeed, and a simple idea can make a large impact on the total cost of the plant. The phases of engineering, right from licensing technology, FEED (front-end engineering design), and the different phases of detail engineering, through procurement and construction, have been compressed so much that they overlap with each other; hence, constructability techniques have become very important. This paper focuses on how these techniques can be implemented to reduce cost, with the help of a case study. Constructability is a process driven by the need to influence a project’s construction phase, resulting in improved project delivery, cost, and schedule. In the construction phase of one of our fast-track mega projects, it was noticed that there was an opportunity to reduce a significant amount of cost and schedule by implementing constructability study processes. In this case study, the actual methodology adopted during engineering and construction, and how to do it better by implementing constructability techniques with collaborative engineering efforts, will be explained.
Keywords: being competitive, collaborative engineering, constructability, cost reduction
Procedia PDF Downloads 427
7020 A Risk Management Approach for Nigeria Manufacturing Industries
Authors: Olaniyi O. Omoyajowo
Abstract:
To be successful in today’s competitive global environment, the manufacturing industry must be able to respond quickly to changes in technology, and these changes introduce new risks and hazards. Managing risks and hazards in a manufacturing process provides a method through which the success rate of an organization can be increased. Thus, there is a continual need for manufacturing industries to invest a significant amount of resources in risk management, which in turn optimizes the production output and profitability of any manufacturing industry (if implemented properly). To help improve the existing risk prevention and mitigation practices of Small and Medium Enterprises (SMEs) in Nigeria Manufacturing Industries (NMI), this research develops a systematic risk management process.
Keywords: manufacturing management, risk, risk management, SMEs
Procedia PDF Downloads 407
7019 Fraud in the Higher Educational Institutions in Assam, India: Issues and Challenges
Authors: Kalidas Sarma
Abstract:
Fraud is a social problem that changes with social change, and it has a regional and global impact. The introduction of the private domain in higher education alongside public institutions has led to the commercialization of higher education, which encourages an unprecedented mushrooming of private institutions and results in fraudulent activities in higher educational institutions in Assam, India. Presently, fraud has been noticed in in-service promotion and in fake entry qualifications, with teachers at different levels of the workplace using fake master's, Master of Philosophy, and Doctor of Philosophy degree certificates. The aim and objective of the study are to identify grey areas in the maintenance of quality in higher educational institutions in Assam and to draw the contour for planning and implementation. This study is based on both primary and secondary data, collected through questionnaires and by seeking information under the Right to Information Act 2005. In Assam, there are 301 undergraduate and graduate colleges distributed across 27 administrative districts, with about 11,000 college teachers. A total of 421 college teachers from the 14 respondent colleges have been taken for analysis. The collected data have been analyzed using PHP (Hypertext Preprocessor) with a MySQL database and the Google Maps Application Programming Interfaces (APIs). Graphs have been generated using the open-source tool Chart.js, and spatial distribution maps have been generated with the help of the geo-references of the colleges. The results show: (i) violation of the University Grants Commission's (UGC's) regulations for the award of M.Phil./Ph.D. degrees is clearly exhibited; (ii) there is a gap between the apex regulatory bodies of higher education at the national as well as the state level in checking fraud;
(iii) mala fide 'No Objection Certificates' (NOCs) issued by the Government of Assam have played a pivotal role in the occurrence of fraudulent practices in the higher educational institutions of Assam; and (iv) violation of the verdict of the Hon'ble Supreme Court of India regarding the territorial jurisdiction of universities for the award of Ph.D. and M.Phil. degrees in distance mode/study centres is also a factor responsible for the spread of these academic frauds in Assam and other states. The challenges and the mitigation of these issues are discussed.
Keywords: Assam, fraud, higher education, mitigation
Procedia PDF Downloads 173
7018 The Interdisciplinary Synergy Between Computer Engineering and Mathematics
Authors: Mitat Uysal, Aynur Uysal
Abstract:
Computer engineering and mathematics share a deep and symbiotic relationship, with mathematics providing the foundational theories and models for computer engineering advancements. From algorithm development to optimization techniques, mathematics plays a pivotal role in solving complex computational problems. This paper explores key mathematical principles that underpin computer engineering, illustrating their significance through a case study that demonstrates the application of optimization techniques using Python code. The case study addresses the well-known vehicle routing problem (VRP), an extension of the traveling salesman problem (TSP), and solves it using a genetic algorithm.
Keywords: VRP, TSP, genetic algorithm, computer engineering, optimization
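As a companion to the case study described above, here is a minimal genetic algorithm for the TSP core of the problem. It is an illustrative sketch, not a reproduction of the paper's code; the city coordinates and GA parameters are invented.

```python
import math
import random

# Invented city coordinates for a 6-city instance.
CITIES = [(0, 0), (1, 5), (5, 2), (6, 6), (8, 3), (2, 1)]

def tour_length(tour):
    """Total length of a closed tour visiting every city once."""
    return sum(math.dist(CITIES[tour[i]], CITIES[tour[(i + 1) % len(tour)]])
               for i in range(len(tour)))

def ordered_crossover(p1, p2):
    """Copy a random slice from p1, fill the rest in p2's order."""
    a, b = sorted(random.sample(range(len(p1)), 2))
    child = [None] * len(p1)
    child[a:b] = p1[a:b]
    fill = [c for c in p2 if c not in child]
    for i in range(len(child)):
        if child[i] is None:
            child[i] = fill.pop(0)
    return child

def mutate(tour, rate=0.2):
    """Occasionally swap two cities to keep diversity."""
    if random.random() < rate:
        i, j = random.sample(range(len(tour)), 2)
        tour[i], tour[j] = tour[j], tour[i]
    return tour

def genetic_tsp(pop_size=40, generations=150):
    random.seed(42)  # deterministic run for reproducibility
    pop = [random.sample(range(len(CITIES)), len(CITIES))
           for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=tour_length)
        elite = pop[: pop_size // 2]          # truncation selection
        children = [mutate(ordered_crossover(*random.sample(elite, 2)))
                    for _ in range(pop_size - len(elite))]
        pop = elite + children
    return min(pop, key=tour_length)

best = genetic_tsp()
```

A full VRP extension would additionally encode vehicle assignments and capacity constraints in the chromosome, but the selection/crossover/mutation loop stays the same.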
Procedia PDF Downloads 20
7017 Synthesis of Magnetic Plastic Waste-Reduced Graphene Oxide Composite and Its Application in Dye Adsorption from Aqueous Solution
Authors: Pamphile Ndagijimana, Xuejiao Liu, Zhiwei Li, Yin Wang
Abstract:
The valorization of plastic wastes as a mitigation strategy is attracting researchers’ attention, since these wastes have raised serious environmental concerns. Plastic wastes have been reported to adsorb organic pollutants in the water environment and to be the main vector of those pollutants in the aquatic environment, especially dyes, a serious water pollution concern. Management technologies for plastic wastes, such as landfilling, incineration, and energy recovery, have been adopted to handle these wastes before they reach the environment. However, they are far from being widely accepted due to their associated environmental pollution, lack of landfill space, and high cost. Therefore, modification is necessary to obtain green plastic-based adsorbents for water applications. Current routes for modifying plastics into adsorbents are based on combustion, but they suffer from air pollution as well as high cost; thus, a green strategy for modifying plastics into adsorbents is highly desirable. Furthermore, recent research has suggested that combining plastic wastes with other solid carbon materials could promote their application in water treatment. Herein, we present new insight into using plastic waste-based materials as future green adsorbents. A magnetic plastic-reduced graphene oxide (MPrGO) composite was synthesized by a cross-linking method and applied to remove methylene blue (MB) from an aqueous solution. The following advantages were achieved: (i) the density of the plastic and the reduced graphene oxide was enhanced; (ii) there was no secondary pollution of black color in solution; (iii) a small amount of graphene oxide (1%) was linked to 10 g of plastic waste, and the composite presented high removal efficiency; (iv) the adsorbent was easy to recover from water. MB at low concentrations (10-30 mg/L) was completely removed by 0.3 g of MPrGO.
Different characterization techniques, including XRD, SEM, FTIR, BET, XPS, and Raman spectroscopy, were performed, and the results confirmed the conjugation of plastic waste and graphene oxide. The MPrGO composite presents a good prospect for the valorization of plastic waste and is a promising composite material for water treatment.
Keywords: plastic waste, graphene oxide, dye, adsorption
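The removal figures reported above follow from standard batch-adsorption bookkeeping. The short sketch below shows the usual formulas; the concentrations match those quoted in the abstract, but the 0.1 L batch volume is an assumption for illustration, not a figure from the paper.

```python
def removal_efficiency(c0, ce):
    """Percentage of adsorbate removed: 100 * (C0 - Ce) / C0."""
    return 100.0 * (c0 - ce) / c0

def adsorption_capacity(c0, ce, volume_l, mass_g):
    """Equilibrium uptake q_e = (C0 - Ce) * V / m, in mg per g."""
    return (c0 - ce) * volume_l / mass_g

# 30 mg/L MB with residual concentration ~0 mg/L, as reported;
# the 0.1 L solution volume is hypothetical.
eff = removal_efficiency(30.0, 0.0)            # 100.0 % removal
q = adsorption_capacity(30.0, 0.0, 0.1, 0.3)   # about 10 mg of MB per g
```

These two quantities are what isotherm and kinetics studies are typically built on once the batch tests are repeated across concentrations.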
Procedia PDF Downloads 94
7016 Translation and Legal Terminology: Techniques for Coping with the Untranslatability of Legal Terms between Arabic and English
Authors: Rafat Alwazna
Abstract:
Technical lexicon is witnessing a large upsurge in the use of new terminologies whose emergence is an inevitable result of the spread of high-quality technology, the existence of scientific paradigms and the fast growth of research in different disciplines. One important subfield of terminology is legal terminology, which forms a crucial part of legal studies, and whose translation from one legal system into another is deemed a formidable and arduous task that needs to be properly performed by legal translators. Indeed, the issue of untranslatability of legal terms, particularly between originally unrelated languages, like legal Arabic and legal English, has long been a real challenge in legal translation. It stems from the conceptual incongruency between legal terms of different legal languages, which are derived from different legal cultures and legal systems. Such conceptual asymmetry is owing to the fact that law has no universal reference and that legal language is what determines the degree of difference in conceptual correspondence. The present paper argues that although conceptual asymmetry, which is the main reason for the issue of untranslatability of legal terms, cannot be denied in legal translation, there exist certain translation techniques which, if properly adopted, would resolve the issue of untranslatability of legal terms and therefore achieve acceptable legal translation. Hence, the question of untranslatability of legal terms should no longer exist within the context of legal translation.
Keywords: conceptual incongruency, legal terms, translation techniques, untranslatability
Procedia PDF Downloads 205
7015 Impact of a Solar System Designed to Improve the Microclimate of an Agricultural Greenhouse
Authors: Nora Arbaoui, Rachid Tadili, Ilham Ihoume
Abstract:
Improving agricultural production and food preservation processes requires the introduction of heating and cooling techniques in greenhouses. To develop these techniques, our work proposes the design of an integrated, autonomous solar system for heating, cooling, and conservation of production in greenhouses. The hot air produced by the greenhouse effect during the day is evacuated to compartments annexed to the greenhouse to dry the surplus agricultural production that is not sold on the market. In this paper, we describe this solar system and the calculation of the volume of fluid used for heat storage, whose stored heat will be released during the night.
Keywords: solar system, agricultural greenhouse, heating, cooling, storage, drying
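The storage-volume calculation mentioned above typically follows from the sensible-heat relation Q = rho * V * c_p * dT. The sketch below uses illustrative numbers (50 MJ of daytime heat stored in water over a 20 K temperature swing), not values from the paper.

```python
def storage_volume_m3(q_joules, rho_kg_m3, cp_j_kgk, delta_t_k):
    """Fluid volume needed to store q_joules of sensible heat:
       Q = rho * V * cp * dT  ->  V = Q / (rho * cp * dT)."""
    return q_joules / (rho_kg_m3 * cp_j_kgk * delta_t_k)

# Illustrative assumptions: water as storage fluid, 50 MJ stored,
# 20 K allowed temperature rise.
RHO_WATER = 1000.0   # kg/m3
CP_WATER = 4186.0    # J/(kg K)
volume = storage_volume_m3(50e6, RHO_WATER, CP_WATER, 20.0)
# volume is roughly 0.6 m3 of water
```

The same relation, run in reverse at night, gives the heat the tank can return to the greenhouse as the fluid cools back down.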
Procedia PDF Downloads 110
7014 Unlocking the Potential of Short Texts with Semantic Enrichment, Disambiguation Techniques, and Context Fusion
Authors: Mouheb Mehdoui, Amel Fraisse, Mounir Zrigui
Abstract:
This paper explores the potential of short texts through semantic enrichment and disambiguation techniques. By employing context fusion, we aim to enhance the comprehension and utility of concise textual information. The methodologies utilized are grounded in recent advancements in natural language processing, which allow for a deeper understanding of semantics within limited text formats. Specifically, topic classification is employed to understand the context of the sentence and assess the relevance of added expressions. Additionally, word sense disambiguation is used to clarify unclear words, replacing them with more precise terms. The implications of this research extend to various applications, including information retrieval and knowledge representation. Ultimately, this work highlights the importance of refining short text processing techniques to unlock their full potential in real-world applications.
Keywords: information traffic, text summarization, word-sense disambiguation, semantic enrichment, ambiguity resolution, short text enhancement, information retrieval, contextual understanding, natural language processing, ambiguity
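A classic instance of the word sense disambiguation step mentioned above is the Lesk algorithm, which picks the sense whose dictionary gloss shares the most words with the ambiguous term's context. The sketch below uses a tiny hand-made sense inventory for illustration; it is not the system described in the abstract.

```python
def lesk(word, context, glosses):
    """Pick the sense whose gloss overlaps most with the context words."""
    ctx = set(context.lower().split())
    best_sense, best_overlap = None, -1
    for sense, gloss in glosses[word].items():
        overlap = len(ctx & set(gloss.lower().split()))
        if overlap > best_overlap:
            best_sense, best_overlap = sense, overlap
    return best_sense

# Tiny hand-made sense inventory for the ambiguous word "bank".
GLOSSES = {
    "bank": {
        "finance": "an institution that accepts deposits and lends money",
        "river": "the sloping land alongside a body of water such as a river",
    }
}

sense = lesk("bank", "he sat on the bank of the river watching the water", GLOSSES)
# -> "river"
```

Real short-text systems replace the flat word-overlap count with embeddings or sense-tagged corpora, but the select-the-best-matching-sense structure is the same.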
Procedia PDF Downloads 20