Search results for: image processing
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 5836

4066 A Supervised Approach for Word Sense Disambiguation Based on Arabic Diacritics

Authors: Alaa Alrakaf, Sk. Md. Mizanur Rahman

Abstract:

Over the last two decades, Arabic natural language processing (ANLP) has become increasingly important. One of the key issues in ANLP is ambiguity: in Arabic, different pronunciations of the same written word can carry different meanings. Ambiguity also affects the effectiveness and efficiency of machine translation (MT), and has limited the usefulness and accuracy of translation from Arabic to English. The scarcity of Arabic language resources makes the ambiguity problem more complicated, and the orthographic level of representation alone cannot specify the exact meaning of a word. This paper exploits the diacritics of the Arabic language to disambiguate words. The proposed word sense disambiguation approach uses a diacritizer application to diacritize Arabic text and then selects the most accurate sense of an ambiguous word using a Naïve Bayes classifier. Our experimental study shows that combining Arabic diacritics with a Naïve Bayes classifier improves the accuracy of choosing the appropriate sense by 23% and also decreases ambiguity in machine translation.
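
For illustration, a minimal sketch of the classification step in Python, assuming the text has already been diacritized; the transliterated toy contexts, sense labels, and bag-of-words features are hypothetical stand-ins for the paper's corpus and feature set.

```python
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.naive_bayes import MultinomialNB

# Hypothetical training data: diacritized (here transliterated) context
# windows around an ambiguous word, labelled with the intended sense.
contexts = [
    "qaraa al-kitaba fi al-maktabati",       # sense: library
    "dhahaba ila al-maktabati li-yadrusa",   # sense: library
    "ishtara qalaman min al-maktabati",      # sense: stationery shop
    "baa'at al-maktabatu adawatin",          # sense: stationery shop
]
senses = ["library", "library", "stationery_shop", "stationery_shop"]

vectorizer = CountVectorizer()     # bag-of-words over context tokens
X = vectorizer.fit_transform(contexts)

clf = MultinomialNB()              # Naive Bayes sense classifier
clf.fit(X, senses)

# Disambiguate a new diacritized context.
test = vectorizer.transform(["dhahaba li-yaqraa fi al-maktabati"])
print(clf.predict(test))           # -> predicted sense label
```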

Keywords: Arabic natural language processing, machine learning, machine translation, Naïve Bayes classifier, word sense disambiguation

Procedia PDF Downloads 358
4065 “Presently”: A Personal Trainer App to Self-Train and Improve Presentation Skills

Authors: Shyam Mehraaj, Samanthi E. R. Siriwardana, Shehara A. K. G. H., Wanigasinghe N. T., Wandana R. A. K., Wedage C. V.

Abstract:

A presentation is a critical tool for conveying not just spoken information but also a wide spectrum of human emotions. The single most effective way to make a presentation successful is to practice it beforehand. Preparing for a presentation has been shown to be essential for improving emotional control, intonation and prosody, pronunciation, and vocabulary, as well as the quality of the presentation slides. As a result, practicing has become one of the most critical parts of giving a good presentation. In this research, the main focus is to analyze the audio, video, and slides of a presentation uploaded by the presenter. The proposed solution is based on natural language processing and computer vision techniques and lets the presenter rehearse beforehand using a mobile-responsive web application. The system assists this rehearsal by identifying the presenter's emotions, body language, tonality, prosody, pronunciation and vocabulary, and presentation slide quality. Overall, the system gives the presenter a rating and feedback on the performance so that presenters can improve their presentation skills.

Keywords: presentation, self-evaluation, natural language processing, computer vision

Procedia PDF Downloads 118
4064 Integrated Geophysical Approach for Subsurface Delineation in Srinagar, Uttarakhand, India

Authors: Pradeep Kumar Singh Chauhan, Gayatri Devi, Zamir Ahmad, Komal Chauhan, Abha Mittal

Abstract:

The application of geophysical methods to study the subsurface profile for site investigation is becoming popular globally. These methods are non-destructive and provide an image of the subsurface at shallow depths. The seismic refraction method is one of the most common and efficient methods used for civil engineering site investigations, particularly for determining the seismic velocity of the subsurface layers. Resistivity imaging is a geo-electrical technique used to image the subsurface, water-bearing zones, bedrock, and layer thickness. An integrated approach combining seismic refraction and 2D resistivity imaging provides a better and more reliable picture of the subsurface. These are economical and less time-consuming field surveys that provide high-resolution images of the subsurface. The geophysical surveys carried out in this study include the seismic refraction and 2D resistivity imaging methods for delineation of subsurface strata in different parts of Srinagar, Garhwal Himalaya, India. The aim of the survey was to map the shallow subsurface in terms of geological and geophysical properties, mainly P-wave velocity, resistivity, layer thickness, and lithology of the area. Both sides of the Alaknanda River, which flows through the centre of the city, were covered by taking two profiles on each side with both methods. The seismic and electrical surveys were carried out at the same locations so that the results complement each other. The seismic refraction survey was carried out using an ABEM TeraLoc 24-channel seismograph, and the 2D resistivity imaging was performed using ABEM Terrameter LS equipment. The results show three distinct layers on both sides of the river up to a depth of 20 m: alluvium extending to 3 m depth, a conglomerate zone lying between 3 m and 15 m, and compacted pebbles and cobbles beyond 15 m. The P-wave velocity is in the range of 400–600 m/s in the top layer, 700–1100 m/s in the second layer, and 1500–3300 m/s in the third layer. The resistivity results show a similar pattern and are in good agreement with the seismic refraction results. The results obtained in this study were validated against an exposed river scar available at one site. The study established the efficacy of geophysical methods for subsurface investigations.

Keywords: 2D resistivity imaging, P-wave velocity, seismic refraction survey, subsurface

Procedia PDF Downloads 258
4063 Multi-Temporal Cloud Detection and Removal in Satellite Imagery for Land Resources Investigation

Authors: Feng Yin

Abstract:

Clouds are inevitable contaminants in optical satellite imagery and prevent satellite imaging systems from acquiring a clear view of the Earth's surface. The presence of clouds in satellite imagery hampers remote sensing investigation of land resources. Consequently, detecting the locations of clouds in satellite imagery is an essential preprocessing step, and removing them is crucial for subsequent applications of the imagery. In this paper, a multi-temporal cloud detection and removal method for satellite imagery is proposed for use in large-scale land resource investigation. The proposed method is composed of four steps. First, cloud masks are generated for cloud-contaminated images by single-temporal cloud detection based on multiple spectral features. Second, a cloud-free reference image of the target area is synthesized by weighted averaging of the time-series images, ignoring cloud pixels. Third, refined cloud detection results are obtained by multi-temporal analysis against the reference image. Finally, the detected clouds are removed via multi-temporal linear regression. The results of a case application in Hubei province indicate that the proposed multi-temporal cloud detection and removal method is effective and promising for large-scale land resource investigation.
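
A minimal single-band sketch of the four-step pipeline, assuming a co-registered image time series; the brightness threshold stands in for the paper's multi-spectral cloud test, and the constants are illustrative only.

```python
import numpy as np

def cloud_pipeline(stack, bright_thresh=0.6):
    """stack: reflectance time series of shape (T, H, W)."""
    # Step 1: per-date cloud masks (clouds are bright in optical bands).
    masks = stack > bright_thresh                 # True where cloudy

    # Step 2: cloud-free reference = temporal average ignoring cloud pixels.
    clear = np.where(masks, np.nan, stack)
    reference = np.nanmean(clear, axis=0)

    # Step 3: refine masks against the reference; large positive
    # deviations are treated as residual cloud.
    refined = masks | (stack - reference > 0.2)

    # Step 4: remove clouds via linear regression image ~ a*reference + b,
    # fitted on clear pixels, then predict values for cloudy pixels.
    restored = stack.copy()
    for t in range(stack.shape[0]):
        ok = ~refined[t] & ~np.isnan(reference)
        a, b = np.polyfit(reference[ok], stack[t][ok], 1)
        fill = a * reference + b
        restored[t][refined[t]] = fill[refined[t]]
    return refined, restored
```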

Keywords: cloud detection, cloud removal, multi-temporal imagery, land resources investigation

Procedia PDF Downloads 278
4062 Drying of Agro-Industrial Wastes Using an Indirect Solar Dryer

Authors: N. Metidji, N. Kasbadji Merzouk, O. Badaoui, R. Sellami, A. Djebli

Abstract:

Agro-industry is one of the most waste-producing industrial fields as a result of food processing. Upgrading and reusing these wastes as animal or poultry feed is a promising alternative, and combined with the use of clean energy resources, the recovery process would contribute further to environmental protection. It is in this framework that a new solar dryer has been designed in the Unit of Solar Equipments Development. Indirect solar drying also has many advantages over natural sun drying: the drying chamber protects the product from direct sun, insects, and the exterior environment, so the product does not degrade. The aim of this work is to study the drying kinetics of the waste generated during the processing of oranges into fruit juice, using an indirect forced-convection solar dryer at 50 °C and 60 °C. The rate of moisture removal from the product was found to be directly related to temperature, humidity, and flow rate. Characterizing these parameters allowed the determination of the appropriate drying time for this product, namely orange waste.

Keywords: solar energy, solar dryer, energy conversion, orange drying, forced convection solar dryer

Procedia PDF Downloads 354
4061 Thermo-Mechanical Processing of Armor Steel Plates

Authors: Taher El-Bitar, Maha El-Meligy, Eman El-Shenawy, Almosilhy Almosilhy, Nader Dawood

Abstract:

The steel contains 0.3% C and 0.004% B, besides Mn, Cr, Mo, and Ni. The alloy was processed in a 20-ton-capacity electric arc furnace (EAF) and then refined in a ladle furnace (LF). The liquid steel was cast as rectangular ingots. A dilatation test gave the critical transformation temperatures Ac1, Ac3, Ms, and Mf as 716, 835, 356, and 218 °C, respectively. The ingots were austenitized, soaked, and then rough rolled to thin slabs of 80 mm thickness. The thin slabs were reheated and soaked for finish rolling to 6.0 mm thick plates. During rough rolling, the roll force increases as a result of rolling at temperatures below the recrystallization temperature. During finish rolling, however, the steel initially shows continuous static recrystallization, after which it strain hardens as the temperature falls. It was concluded that the steel plates were successfully heat treated by quenching and tempering at 250 °C for 20 min.

Keywords: armor steel, austenitizing, critical transformation temperatures (CTTs), dilatation curve, martensite, quenching, rough and finish rolling processes, soaking, tempering, thermo-mechanical processing

Procedia PDF Downloads 347
4060 A Practical Survey on Zero-Shot Prompt Design for In-Context Learning

Authors: Yinheng Li

Abstract:

The remarkable advancements in large language models (LLMs) have brought about significant improvements in natural language processing tasks. This paper presents a comprehensive review of in-context learning techniques, focusing on different types of prompts, including discrete, continuous, few-shot, and zero-shot, and their impact on LLM performance. We explore various approaches to prompt design, such as manual design, optimization algorithms, and evaluation methods, to optimize LLM performance across diverse tasks. Our review covers key research studies in prompt engineering, discussing their methodologies and contributions to the field. We also delve into the challenges faced in evaluating prompt performance, given the absence of a single “best” prompt and the importance of considering multiple metrics. In conclusion, the paper highlights the critical role of prompt design in harnessing the full potential of LLMs and provides insights into the combination of manual design, optimization techniques, and rigorous evaluation for more effective and efficient use of LLMs in various Natural Language Processing (NLP) tasks.

Keywords: in-context learning, prompt engineering, zero-shot learning, large language models

Procedia PDF Downloads 83
4059 Estimating the Ladder Angle and the Camera Position From a 2D Photograph Based on Applications of Projective Geometry and Matrix Analysis

Authors: Inigo Beckett

Abstract:

In forensic investigations, it is often the case that the most potentially useful recorded evidence derives from coincidental imagery, recorded immediately before or during an incident, and that during the incident (e.g. a ‘failure’ or fire event) the evidence is changed or destroyed. To an image analysis expert involved in photogrammetric analysis for civil or criminal proceedings, traditional computer vision methods involving calibrated cameras are often not appropriate because image metadata cannot be relied upon. This paper presents an approach for resolving this problem, considering by way of a case study the angle of a simple ladder shown in a photograph. The UK Health and Safety Executive (HSE) guidance document published in 2014 (INDG455) advises that a leaning ladder should be erected at 75 degrees to the horizontal axis. Personal injury cases can arise in the construction industry because a ladder is too steep or too shallow, and ad-hoc photographs of such ladders in their incident position provide a basis for analysis of their angle. This paper presents a direct approach for ascertaining the position of the camera and the angle of the ladder simultaneously from the photograph(s), by way of a workflow that encompasses a novel application of projective geometry and matrix analysis. Mathematical analysis shows that for a given pixel ratio of directly measured collinear points (i.e. features that lie on the same line segment) in the 2D digital photograph, we can constrain the 3D camera position to the surface of a sphere in the scene. Depending on what we know about the ladder, we can enforce another independent constraint on the possible camera positions, which allows us to narrow them down even further. Experiments were conducted using synthetic and real-world data. The synthetic data modeled a ladder on a horizontally flat plane resting against a vertical wall. The real-world data were captured using an Apple iPhone 13 Pro and 3D laser scan survey data, whereby a ladder was placed at a known location and angle to the vertical axis. For each case, we calculated the camera positions and ladder angles using this method and cross-compared them against their respective ‘true’ values.
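
To illustrate the forward relationship the method inverts, the sketch below projects three collinear ladder points through an assumed pinhole camera, computes the collinear pixel ratio used to constrain the camera position, and recovers the ladder angle from the 3D endpoints by simple trigonometry. The camera intrinsics and scene layout are hypothetical, and the paper's actual inversion is not reproduced.

```python
import numpy as np

def project(K, R, t, X):
    """Pinhole projection of 3D points X (N, 3) to pixel coordinates (N, 2)."""
    x = (K @ (R @ X.T + t)).T
    return x[:, :2] / x[:, 2:3]

# Hypothetical scene: a 4 m ladder leaning against a vertical wall at the
# HSE-recommended 75 degrees, with its foot at the world origin (z = up).
L, angle = 4.0, np.deg2rad(75)
foot = np.array([0.0, 0.0, 0.0])
top = np.array([L * np.cos(angle), 0.0, L * np.sin(angle)])
mid = 0.5 * (foot + top)                        # a third collinear point

# Hypothetical camera: intrinsics K, identity rotation, offset translation.
K = np.array([[1500, 0, 960], [0, 1500, 540], [0, 0, 1]], float)
R = np.eye(3)
t = np.array([[1.0], [-0.5], [6.0]])

pix = project(K, R, t, np.vstack([foot, mid, top]))

# The collinear pixel ratio the paper uses to constrain the camera position:
r = np.linalg.norm(pix[0] - pix[1]) / np.linalg.norm(pix[1] - pix[2])
print(f"pixel ratio |foot-mid| / |mid-top| = {r:.3f}")

# Once foot and top are recovered in 3D, the ladder angle is simple trig:
d = top - foot
print(np.degrees(np.arctan2(d[2], np.hypot(d[0], d[1]))))   # -> 75.0
```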

Keywords: image analysis, projective geometry, homography, photogrammetry, ladders, forensics, mathematical modeling, planar geometry, matrix analysis, collinear points, cameras, photographs

Procedia PDF Downloads 52
4058 Accurate Mass Segmentation Using U-Net Deep Learning Architecture for Improved Cancer Detection

Authors: Ali Hamza

Abstract:

Accurate segmentation of breast ultrasound images is of paramount importance in enhancing the diagnostic capabilities of breast cancer detection. This study presents an approach utilizing the U-Net architecture for segmenting breast ultrasound images, aimed at improving the accuracy and reliability of mass identification within the breast tissue. The proposed method encompasses a multi-stage process. Initially, preprocessing techniques are employed to refine image quality and diminish noise interference. Subsequently, the U-Net architecture, a deep learning convolutional neural network (CNN), is employed for pixel-wise segmentation of regions of interest corresponding to potential breast masses. The U-Net's distinctive architecture, characterized by a contracting and an expansive pathway, enables accurate boundary delineation and detailed feature extraction. To evaluate the effectiveness of the proposed approach, an extensive dataset of breast ultrasound images encompassing diverse cases is employed. Quantitative performance metrics such as the Dice coefficient, Jaccard index, sensitivity, specificity, and Hausdorff distance are used to comprehensively assess segmentation accuracy. Comparative analyses against traditional segmentation methods showcase the superiority of the U-Net architecture in capturing intricate details and accurately segmenting breast masses. The outcomes of this study emphasize the potential of the U-Net-based segmentation approach in bolstering breast ultrasound image analysis. The method's ability to reliably pinpoint mass boundaries holds promise for aiding radiologists in precise diagnosis and treatment planning. However, further validation and integration within clinical workflows are necessary to ascertain its practical clinical utility and facilitate seamless adoption by healthcare professionals. In conclusion, leveraging the U-Net architecture for breast ultrasound image segmentation provides a robust framework that can significantly enhance diagnostic accuracy and advance the field of breast cancer detection. This approach represents a pivotal step towards empowering medical professionals with a more potent tool for early and accurate breast cancer diagnosis.
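
A small sketch of the evaluation metrics named above, computed on hypothetical binary masks; the paper's dataset and U-Net predictions are not reproduced.

```python
import numpy as np
from scipy.spatial.distance import directed_hausdorff

# pred and truth are binary masks: True = mass, False = background.
def dice(pred, truth):
    inter = np.logical_and(pred, truth).sum()
    return 2.0 * inter / (pred.sum() + truth.sum())

def jaccard(pred, truth):
    inter = np.logical_and(pred, truth).sum()
    return inter / np.logical_or(pred, truth).sum()

def sensitivity(pred, truth):            # true-positive rate
    return np.logical_and(pred, truth).sum() / truth.sum()

def specificity(pred, truth):            # true-negative rate
    return np.logical_and(~pred, ~truth).sum() / (~truth).sum()

def hausdorff(pred, truth):              # symmetric Hausdorff distance
    p, t = np.argwhere(pred), np.argwhere(truth)
    return max(directed_hausdorff(p, t)[0], directed_hausdorff(t, p)[0])

pred = np.zeros((64, 64), bool);  pred[20:40, 20:40] = True
truth = np.zeros((64, 64), bool); truth[22:42, 18:38] = True
print(dice(pred, truth), jaccard(pred, truth), hausdorff(pred, truth))
```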

Keywords: image segmentation, U-Net, deep learning, breast cancer detection, diagnostic accuracy, mass identification, convolutional neural network

Procedia PDF Downloads 84
4057 Hope in the Ruins of 'Ozymandias': Reimagining Temporal Horizons in Felicia Hemans' 'The Image in Lava'

Authors: Lauren Schuldt Wilson

Abstract:

Felicia Hemans’ memorializing of the unwritten lives of women and the consequent allowance for marginalized voices to remember and be remembered has been considered by many critics in terms of ekphrasis and elegy, terms which privilege the question of whether Hemans’ poeticizing can represent lost voices of history or only her poetic expression. Amy Gates, Brian Elliott, and others point out Hemans’ acknowledgement of the self-projection necessary for imaginatively filling the absences of unrecorded histories. Yet, few have examined the complex temporal positioning Hemans inscribes in these moments of self-projection and imaginative historicizing. In poems like ‘The Image in Lava,’ Hemans maps not only a lost past, but also a lost potential future onto the image of a dead infant in its mother’s arms, the discovery and consideration of which moves the imagined viewer to recover and incorporate the ‘hope’ encapsulated in the figure of the infant into a reevaluation of national time embodied by the ‘relics / Left by the pomps of old.’ By examining Hemans’ acknowledgement and response to Percy Bysshe Shelley’s ‘Ozymandias,’ this essay explores how Hemans’ depictions of imaginative historicizing open new horizons of possibility and reevaluate temporal value structures by imagining previously undiscovered or unexplored potentialities of the past. Where Shelley’s poem mocks the futility of national power and time, this essay outlines Hemans’ suggestion of alternative threads of identity and temporal meaning-making which, regardless of historical veracity, exist outside of and against the structures Shelley challenges. Counter to previous readings of Hemans’ poem as celebration of either recovered or poetically constructed maternal love, this essay argues that Hemans offers a meditation on sites of reproduction—both of personal reproductive futurity and of national reproduction of power. This meditation culminates in Hemans’ gesturing towards a method of historicism by which the imagined viewer reinvigorates the sterile, ‘shattered visage’ of national time by forming temporal identity through the imagining of trans-historical hope inscribed on the infant body of the universal, individual subject rather than the broken monument of the king.

Keywords: futurity, national temporalities, reproduction, revisionary histories

Procedia PDF Downloads 166
4056 Monitoring Deforestation Using Remote Sensing and GIS

Authors: Tejaswi Agarwal, Amritansh Agarwal

Abstract:

The forest ecosystem plays a very important role in the global carbon cycle: it stores about 80% of all above-ground and 40% of all below-ground terrestrial organic carbon. There is much interest in the extent of tropical forests and their rates of deforestation for two reasons: greenhouse gas contributions and the profoundly negative impact on biodiversity. Deforestation has many ecological, social, and economic consequences, one of which is the loss of biological diversity. The rapid deployment of remote sensing (RS) satellites and the development of RS analysis techniques over the past three decades have provided a reliable, effective, and practical way to characterize terrestrial ecosystem properties. Global estimates of tropical deforestation vary widely, ranging from 50,000 to 170,000 km²/yr; recent FAO tropical deforestation estimates for 1990–1995 cite 116,756 km²/yr globally. Remote sensing can prove to be a very useful tool in monitoring forests and associated deforestation to a sufficient level of accuracy without the need to physically survey the forest areas, many of which are physically inaccessible. The methodology for the assessment of forest cover using digital image processing (ERDAS) has been followed. The satellite data for the study were procured from the Indian Institute of Remote Sensing (IIRS), Dehradun, in digital format. While procuring the satellite data, care was taken to ensure that the data were cloud free and did not belong to the dry and leafless season. The Normalized Difference Vegetation Index (NDVI) has been used as a numerical indicator of the reduction in ground biomass: NDVI = (near IR − Red)/(near IR + Red). After calculating the NDVI variations and the associated mean, we analysed the change in ground biomass. Through this paper, we have tried to indicate the rate of deforestation over a given period of time by comparing the forest cover at different time intervals. With the help of remote sensing and GIS techniques, it is clearly shown that the total forest cover is continuously degrading and transforming into various land use/land cover categories.
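
A minimal sketch of the NDVI change computation; the random arrays stand in for co-registered Red and near-infrared bands at two acquisition dates.

```python
import numpy as np

def ndvi(nir, red):
    """NDVI = (near IR - Red) / (near IR + Red)."""
    with np.errstate(divide="ignore", invalid="ignore"):
        return (nir - red) / (nir + red)

red_t1, nir_t1 = np.random.rand(2, 100, 100)   # placeholder rasters, date 1
red_t2, nir_t2 = np.random.rand(2, 100, 100)   # placeholder rasters, date 2

ndvi_1, ndvi_2 = ndvi(nir_t1, red_t1), ndvi(nir_t2, red_t2)
change = ndvi_2 - ndvi_1                        # negative = biomass loss

print("mean NDVI date 1:", np.nanmean(ndvi_1))
print("mean NDVI date 2:", np.nanmean(ndvi_2))
print("fraction of pixels losing biomass:", np.mean(change < 0))
```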

Keywords: remote sensing, deforestation, supervised classification, NDVI, change detection

Procedia PDF Downloads 1204
4055 Effects of Extrusion Conditions on the Cooking Properties of Extruded Rice Vermicelli Using Twin-Screw Extrusion

Authors: Hasika Mith, Hassany Ly, Hengsim Phoung, Rathana Sovann, Pichmony Ek, Sokuntheary Theng

Abstract:

Rice is one of the most important crops used in the production of ready-to-cook (RTC) products such as rice vermicelli, noodles, rice paper, Banh Kanh, wine, snacks, and desserts. Extrusion, meanwhile, is one of the most versatile food processing methods for developing products with improved nutritional, functional, and sensory properties; it allows control of processes such as mixing, cooking, and product shaping. The objectives of this study were therefore to produce rice vermicelli using a twin-screw extruder and to investigate the cooking properties of the extruded rice vermicelli. Response Surface Methodology (RSM) with a Box-Behnken design was applied to optimize the extrusion conditions in order to achieve the most desirable product characteristics. The feed moisture rate (30–35%), barrel temperature (90–110 °C), and screw speed (200–400 rpm) all have a significant impact on the water absorption index (WAI), cooking yield (CY), and cooking loss (CL) of the extruded rice vermicelli. Results showed that the WAI of the final extruded rice vermicelli ranged between 216.97% and 571.90%, the CY ranged from 147.94% to 203.19%, and the CL ranged from 8.55% to 25.54%. The findings indicated that at a low screw speed or low temperature there are likely to be more unbroken polymer chains and more hydrophilic groups, which can bind more water and raise WAI values. The cooking yield changed considerably across processing conditions, although screw speed itself had little effect on CY. Increasing the barrel temperature tended to increase cooking yield and reduce cooking loss. In conclusion, extrusion processing with a twin-screw extruder had a significant effect on the cooking quality of the rice vermicelli extrudate.
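
A sketch of the response-surface step: fitting a full quadratic model of one response (WAI) to the three coded factors by least squares. The design points and response values are invented for illustration, not the paper's measurements.

```python
import numpy as np

# Coded levels: moisture 30-35% -> [-1, 1]; temp 90-110 C; speed 200-400 rpm.
X = np.array([   # Box-Behnken style runs: (moisture, temperature, speed)
    [-1, -1, 0], [1, -1, 0], [-1, 1, 0], [1, 1, 0],
    [-1, 0, -1], [1, 0, -1], [-1, 0, 1], [1, 0, 1],
    [0, -1, -1], [0, 1, -1], [0, -1, 1], [0, 1, 1],
    [0, 0, 0], [0, 0, 0], [0, 0, 0],
], float)
y = np.array([310, 290, 420, 480, 260, 240, 520, 500,
              300, 450, 470, 560, 400, 405, 395], float)  # WAI %, invented

m, t, s = X.T
# Full quadratic model: intercept, linear, interaction, and squared terms.
A = np.column_stack([np.ones_like(m), m, t, s,
                     m * t, m * s, t * s, m**2, t**2, s**2])
coef, *_ = np.linalg.lstsq(A, y, rcond=None)
print(dict(zip(["b0", "m", "t", "s", "mt", "ms", "ts", "m2", "t2", "s2"],
               coef.round(1))))
```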

Keywords: cooking loss, cooking quality, cooking yield, extruded rice vermicelli, twin-screw extruder, water absorption index

Procedia PDF Downloads 83
4054 NDVI as a Measure of Change in Forest Biomass

Authors: Amritansh Agarwal, Tejaswi Agarwal

Abstract:

The forest ecosystem plays a very important role in the global carbon cycle: it stores about 80% of all above-ground and 40% of all below-ground terrestrial organic carbon. There is much interest in the extent of tropical forests and their rates of deforestation for two reasons: greenhouse gas contributions and the profoundly negative impact on biodiversity. Deforestation has many ecological, social, and economic consequences, one of which is the loss of biological diversity. The rapid deployment of remote sensing (RS) satellites and the development of RS analysis techniques over the past three decades have provided a reliable, effective, and practical way to characterize terrestrial ecosystem properties. Global estimates of tropical deforestation vary widely, ranging from 50,000 to 170,000 km²/yr; recent FAO tropical deforestation estimates for 1990–1995 cite 116,756 km²/yr globally. Remote sensing can prove to be a very useful tool in monitoring forests and associated deforestation to a sufficient level of accuracy without the need to physically survey the forest areas, many of which are physically inaccessible. The methodology for the assessment of forest cover using digital image processing (ERDAS) has been followed. The satellite data for the study were procured from the USGS website in digital format. While procuring the satellite data, care was taken to ensure that the data were cloud and aerosol free by making use of the FLAASH atmospheric correction technique. The Normalized Difference Vegetation Index (NDVI) has been used as a numerical indicator of the reduction in ground biomass: NDVI = (near IR − Red)/(near IR + Red). After calculating the NDVI variations and the associated mean, we analysed the change in ground biomass. Through this paper, we have tried to indicate the rate of deforestation over a given period of time by comparing the forest cover at different time intervals. With the help of remote sensing and GIS techniques, it is clearly shown that the total forest cover is continuously degrading and transforming into various land use/land cover categories.

Keywords: remote sensing, deforestation, supervised classification, NDVI, change detection

Procedia PDF Downloads 402
4053 Arc Plasma Application for Solid Waste Processing

Authors: Vladimir Messerle, Alfred Mosse, Alexandr Ustimenko, Oleg Lavrichshev

Abstract:

Hygienic and sanitary studies of typical medical-biological waste carried out in Kazakhstan, Russia, Belarus, and other countries show that its risk to the environment is much higher than that of most chemical wastes. For example, the toxicity of solid waste (SW) containing cytotoxic drugs and antibiotics is comparable to that of radioactive waste of high and medium activity. This report presents the results of a thermodynamic analysis of thermal SW processing and of experiments at the plasma unit developed for SW processing. Thermodynamic calculations showed that the maximum yield of synthesis gas in plasma gasification of SW in air and steam media is achieved at a temperature of 1600 K. Air plasma gasification of SW can produce high-calorific synthesis gas with a concentration of 82.4% (CO – 31.7%, H2 – 50.7%), and steam plasma gasification a concentration of 94.5% (CO – 33.6%, H2 – 60.9%). The specific heat of combustion of the synthesis gas produced by air gasification amounts to 14,267 kJ/kg, while that by steam gasification is 19,414 kJ/kg. At the optimal temperature (1600 K), the specific power consumption for air gasification of SW is 1.92 kWh/kg, and for steam gasification 2.44 kWh/kg. The experimental study was carried out in a plasma reactor, a batch-operation device. An arc plasma torch of 70 kW electric power is used for SW processing. The SW feed rate was 30 kg/h, and the flow of plasma-forming air was 12 kg/h. Under the influence of the air plasma flame, the mass-averaged temperature in the chamber reaches 1800 K. Gaseous products are taken out of the reactor into the flue gas cooling unit, while the condensed products accumulate in the slag formation zone. The cooled gaseous products enter the gas purification unit, after which they are supplied via a gas sampling system to the analyzer. A ventilation system maintains a negative pressure in the reactor of up to 10 mm of water column. Condensed products of SW processing are removed from the reactor after it is stopped. From the experiments on SW plasma gasification, the reactor operating conditions were determined, the exhaust gas was analysed, and the residual carbon content in the slag was determined. Gas analysis showed the following composition at the exit of the gas purification unit (vol.%): CO – 26.5, H2 – 44.6, N2 – 28.9. The total syngas concentration was 71.1%, which agreed well with the thermodynamic calculations: the discrepancy between experiment and calculation in the yield of the target syngas did not exceed 16%. The specific power consumption for SW gasification in the plasma reactor, according to the experiments, amounted to 2.25 kWh/kg of working substance. No harmful impurities were found in either the gaseous or the condensed products of SW plasma gasification. Comparison of the experimental results and calculations showed good agreement. Acknowledgement: This work was supported by the Ministry of Education and Science of the Republic of Kazakhstan and the Ministry of Education and Science of the Russian Federation (Agreement on grant No. 14.607.21.0118, project RFMEF160715X0118).
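
A quick arithmetic check of the agreement quoted above, using only the figures reported in the abstract:

```python
# Calculated vs. measured syngas (CO + H2) concentration, air gasification.
calc_syngas = 82.4          # vol.%, thermodynamic calculation
exp_syngas = 26.5 + 44.6    # vol.%, measured CO + H2 -> 71.1

discrepancy = (calc_syngas - exp_syngas) / exp_syngas * 100
print(f"syngas yield discrepancy: {discrepancy:.1f}%")   # ~15.9%, i.e. < 16%
```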

Keywords: coal, efficiency, ignition, numerical modeling, plasma-fuel system, plasma generator

Procedia PDF Downloads 250
4052 Optimizing Telehealth Internet of Things Integration: A Sustainable Approach through Fog and Cloud Computing Platforms for Energy Efficiency

Authors: Yunyong Guo, Sudhakar Ganti, Bryan Guo

Abstract:

The swift proliferation of telehealth Internet of Things (IoT) devices has sparked concerns regarding energy consumption and the need for streamlined data processing. This paper presents an energy-efficient model that integrates telehealth IoT devices into a platform based on fog and cloud computing. This integrated system provides a sustainable and robust solution to address the challenges. Our model strategically utilizes fog computing as a localized data processing layer and leverages cloud computing for resource-intensive tasks, resulting in a significant reduction in overall energy consumption. The incorporation of adaptive energy-saving strategies further enhances the efficiency of our approach. Simulation analysis validates the effectiveness of our model in improving energy efficiency for telehealth IoT systems, particularly when integrated with localized fog nodes and both private and public cloud infrastructures. Subsequent research endeavors will concentrate on refining the energy-saving model, exploring additional functional enhancements, and assessing its broader applicability across various healthcare and industry sectors.

Keywords: energy-efficient, fog computing, IoT, telehealth

Procedia PDF Downloads 76
4051 Application Methodology for the Generation of 3D Thermal Models Using UAV Photogrammetry and Dual Sensors for Mining/Industrial Facilities Inspection

Authors: Javier Sedano-Cibrián, Julio Manuel de Luis-Ruiz, Rubén Pérez-Álvarez, Raúl Pereda-García, Beatriz Malagón-Picón

Abstract:

Structural inspection activities are necessary to ensure the correct functioning of infrastructures. Unmanned Aerial Vehicle (UAV) techniques have become more popular than traditional techniques; in particular, UAV photogrammetry allows time and cost savings. The development of this technology has permitted the use of low-cost thermal sensors in UAVs, and the generation of 3D thermal models with this type of equipment is in continuous evolution. Direct processing of thermal images usually leads to errors and inaccurate results. A methodology is proposed for the generation of 3D thermal models using dual sensors, which involves the application of visible Red-Green-Blue (RGB) and thermal images in parallel. The RGB images are used as the basis for the generation of the model geometry, and the thermal images are the source of the surface temperature information that is projected onto the model. The resulting representations of mining/industrial facilities can be used for inspection activities.

Keywords: aerial thermography, data processing, drone, low-cost, point cloud

Procedia PDF Downloads 143
4050 Rapid Flood Damage Assessment of Population and Crops Using Remotely Sensed Data

Authors: Urooj Saeed, Sajid Rashid Ahmad, Iqra Khalid, Sahar Mirza, Imtiaz Younas

Abstract:

Pakistan, a flood-prone country, has experienced severe floods in the recent past, which have caused extensive damage to urban and rural areas through loss of life and damage to infrastructure and agricultural fields. The country's poor flood management system magnifies the risk of damage as floods increase in frequency and magnitude under climate change, affecting the national economy directly or indirectly. To meet the needs of flood emergencies, this paper focuses on a remotely sensed data based approach for rapid mapping and monitoring of flood extent and its damages, so that information can be disseminated quickly from local to national level. In this research study, the spatial extent of the flooding caused by the heavy rains of 2014 was mapped using spaceborne data to assess crop damage and affected population in sixteen districts of Punjab. For this purpose, Moderate Resolution Imaging Spectroradiometer (MODIS) data were used to mark the daily flood extent by means of the Normalised Difference Water Index (NDWI). The maximum flood extent was integrated with LandScan 2014 1 km × 1 km gridded population data to calculate the affected population in the flood hazard zone. It was estimated that the floods covered an area of 16,870 square kilometres, with 3.0 million people affected. Moreover, to assess crop damage, Object Based Image Analysis (OBIA) aided by spectral signatures was applied to a Landsat image to obtain thematic layers of healthy (0.54 million acres) and damaged (0.43 million acres) crops. The study found that the population of Jhang district was affected the most (28% of its 2.5 million inhabitants), while in terms of crops, Jhang and Muzaffargarh ranked as the most damaged districts of the 2014 floods in Punjab. This study was completed within 24 hours of the peak flood time and proves to be an effective methodology for rapid assessment of damage due to flood hazard.
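
A minimal sketch of the flood-mapping logic: per-day NDWI water masks, the maximum flood extent over the event, and an overlay on a gridded population raster. The arrays are placeholders for the MODIS composites and the LandScan grid.

```python
import numpy as np

def ndwi(green, nir):
    """NDWI = (Green - near IR) / (Green + near IR)."""
    with np.errstate(divide="ignore", invalid="ignore"):
        return (green - nir) / (green + nir)

T, H, W = 10, 200, 200
green = np.random.rand(T, H, W)                    # placeholder daily bands
nir = np.random.rand(T, H, W)
population = np.random.randint(0, 5000, (H, W))    # persons per grid cell

water_daily = ndwi(green, nir) > 0.0      # simple NDWI water threshold
flood_extent = water_daily.any(axis=0)    # maximum extent over the event

print("affected population:", population[flood_extent].sum())
```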

Keywords: flood hazard, space borne data, object based image analysis, rapid damage assessment

Procedia PDF Downloads 328
4049 The Role of Executive Attention and Literacy on Consumer Memory

Authors: Fereshteh Nazeri Bahadori

Abstract:

In today's competitive environment, any company that aims to operate in a market, whether industrial or consumer, must recognize that it cannot address all the tastes and demands of customers at once and serve them all. The study of consumer memory is considered an important subject in marketing research, and many companies have conducted studies on this subject and the factors affecting it because of its importance. The current study therefore investigates the relationship between consumers' attention, literacy, and memory. Memory has a very close relationship with learning: memory is the collection of all the information that we have understood and stored. One important subject in consumer behavior is information processing by the consumer, and one important factor in information processing is the mental involvement of the consumer, which has attracted much attention over the past two decades. Since consumers are the focal point of all marketing activities, successful marketing begins with understanding why and how consumers behave. The current study therefore examines the role of executive attention and literacy in consumers' memory. The results showed that executive attention and literacy play a significant role in the long-term and short-term memory of consumers.

Keywords: literacy, consumer memory, executive attention, psychology of consumer behavior

Procedia PDF Downloads 96
4048 Performance Evaluation of Refinement Method for Wideband Two-Beams Formation

Authors: C. Bunsanit

Abstract:

This paper presents a refinement method for two-beam formation with a wideband smart antenna. The refinement method for the weighting coefficients is based on fully spatial signal processing using the Inverse Discrete Fourier Transform (IDFT), and its simulation results are presented using MATLAB. The radiation pattern is created by multiplying the incoming signal by real weights and then summing the products. These real weighting coefficients are computed by the IDFT method; however, the range of the weight values is relatively wide, so the refinement method is used to reduce this range. The radiation pattern is controlled by five input parameters: the maximum weighting coefficient, the wideband signal, the direction of the main beam, the beamwidth, and the maximum minor-lobe level. Comparison of the simulation results obtained with the refinement method and with the plain IDFT shows that the refinement method works well for wideband two-beam formation.
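
The paper's simulations use MATLAB; the sketch below shows only the baseline IDFT weight computation (without the refinement step) in Python, for a hypothetical 16-element, half-wavelength-spaced array with two beams steered to ±30°.

```python
import numpy as np

N, d = 16, 0.5                      # elements, spacing in wavelengths

# Desired pattern sampled on the DFT grid u_k = k / (N*d):
# beams at sin(theta) = +/-0.5, i.e. DFT bins k = +/-4.
D = np.zeros(N)
D[4] = D[N - 4] = 1.0               # conjugate-symmetric -> real weights

w = np.fft.ifft(D).real             # real weighting coefficients via IDFT

# Evaluate the resulting array factor over angle.
theta = np.linspace(-np.pi / 2, np.pi / 2, 721)
n = np.arange(N)
AF = np.abs(np.exp(1j * 2 * np.pi * d * np.outer(np.sin(theta), n)) @ w)

pos = theta > 0                     # report one peak per half-space
print("beam directions (deg):",
      np.degrees(theta[~pos][np.argmax(AF[~pos])]),
      np.degrees(theta[pos][np.argmax(AF[pos])]))   # ~ -30, +30
```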

Keywords: fully spatial signal processing, beam forming, refinement method, smart antenna, weighting coefficient, wideband

Procedia PDF Downloads 226
4047 Rapid Fetal MRI Using SSFSE, FIESTA and FSPGR Techniques

Authors: Chen-Chang Lee, Po-Chou Chen, Jo-Chi Jao, Chun-Chung Lui, Leung-Chit Tsang, Lain-Chyr Hwang

Abstract:

Fetal Magnetic Resonance Imaging (MRI) is a challenging task because fetal movements can cause motion artifacts in MR images. The remedy for this problem is to use fast scanning pulse sequences. The Single-Shot Fast Spin-Echo (SSFSE) T2-weighted imaging technique is routinely performed and often used as a gold standard in clinical examinations. Fast spoiled gradient-echo (FSPGR) T1-Weighted Imaging (T1WI) is often used to identify fat, calcification, and hemorrhage. Fast Imaging Employing Steady-State Acquisition (FIESTA) is commonly used to identify fetal structures as well as the heart and vessels. The contrast of a FIESTA image is related to T1/T2 and differs from that of SSFSE. The advantages and disadvantages of these two scanning sequences for fetal imaging have not been clearly demonstrated yet. This study aimed to compare three rapid MRI techniques (SSFSE, FIESTA, and FSPGR) for fetal MRI examinations and to explore the image quality and influencing factors of each. A 1.5 T GE Discovery 450 clinical MR scanner with an eight-channel high-resolution abdominal coil was used. Twenty-five pregnant women were recruited for fetal MRI examination with SSFSE, FIESTA, and FSPGR scanning, and multi-oriented, multi-slice images were acquired. The MR images were then interpreted and scored by two senior radiologists. The results showed that both SSFSE and T2W-FIESTA provide good image quality among the three rapid imaging techniques. Vessel signals in FIESTA images are higher than those in SSFSE images. The Specific Absorption Rate (SAR) of FIESTA is lower than that of the other two techniques, but FIESTA is prone to banding artifacts. FSPGR-T1WI yields a lower Signal-to-Noise Ratio (SNR) because it suffers severely from maternal and fetal movements. The scan times for the three scanning sequences were 25 s (T2W-SSFSE), 20 s (FIESTA), and 18 s (FSPGR). In conclusion, all three rapid MR scanning sequences can produce high-contrast, high-spatial-resolution images. The scan time can be shortened by incorporating parallel imaging techniques so that motion artifacts caused by fetal movements are reduced. A good understanding of the characteristics of these three rapid MRI techniques helps technologists obtain reproducible, high-quality fetal anatomy images for prenatal diagnosis.

Keywords: fetal MRI, FIESTA, FSPGR, motion artifact, SSFSE

Procedia PDF Downloads 530
4046 Video Compression Using Contourlet Transform

Authors: Delara Kazempour, Mashallah Abasi Dezfuli, Reza Javidan

Abstract:

Video compression is used for channels with limited bandwidth and for storage devices with limited capacity. One of the most popular approaches in video compression is the use of transforms. The discrete cosine transform is one such method, but it suffers from problems such as blocking artifacts, noise, and high distortion, which adversely affect the compression ratio. The wavelet transform is another approach that balances compression and quality better than the cosine transform, but its ability to represent curved edges is limited. Given the importance of compression and the problems of the cosine and wavelet transforms, the contourlet transform has become popular in video compression. In the newly proposed method, we use the contourlet transform for video image compression. The contourlet transform preserves image details better than the previous transforms because it is multi-scale and directional, so it can capture discontinuities such as edges; with this approach we lose less data than with previous approaches. The contourlet transform captures discrete spatial structure and is well suited to representing two-dimensional smooth images, producing compressed images with a high compression ratio along with texture and edge preservation. Finally, the results show that for the majority of images, the mean square error and peak signal-to-noise ratio of the new contourlet-based method are improved compared with the wavelet transform, but for most images the cosine transform performs better on these metrics than the contourlet-based method.
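
A short sketch of the two quality metrics used in the comparison, for 8-bit frames; the arrays are placeholders for an original frame and its reconstruction after compression.

```python
import numpy as np

def mse(original, reconstructed):
    """Mean square error between two frames."""
    return np.mean((original.astype(float) - reconstructed.astype(float)) ** 2)

def psnr(original, reconstructed, peak=255.0):
    """Peak signal-to-noise ratio in dB for 8-bit data."""
    m = mse(original, reconstructed)
    return float("inf") if m == 0 else 10.0 * np.log10(peak ** 2 / m)

original = np.random.randint(0, 256, (480, 640), np.uint8)
reconstructed = np.clip(
    original + np.random.randint(-5, 6, original.shape), 0, 255)

print(f"MSE = {mse(original, reconstructed):.2f}, "
      f"PSNR = {psnr(original, reconstructed):.2f} dB")
```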

Keywords: video compression, contourlet transform, discrete cosine transform, wavelet transform

Procedia PDF Downloads 444
4045 Effect of Some Metal Ions on the Activity of Lipase Produced by Aspergillus niger Cultured on Vitellaria paradoxa Shells

Authors: Abdulhakeem Sulyman, Olukotun Zainab, Hammed Abdulquadri

Abstract:

Lipases (triacylglycerol acyl hydrolases, EC 3.1.1.3) are a class of enzymes that catalyse the hydrolysis of triglycerides to glycerol and free fatty acids. They account for up to 10% of the enzyme market and have a wide range of applications in biofuel production, detergent formulation, leather processing, and the food and feed processing industry. This research studied the effect of some metal ions on the activity of purified lipase produced by Aspergillus niger cultured on Vitellaria paradoxa shells. Purified lipase in 12.5 mM p-NPL was incubated with different metal ions (Zn²⁺, Ca²⁺, Mn²⁺, Fe²⁺, Na⁺, K⁺ and Mg²⁺) at final concentrations of 0.1 to 1.0 mM in 0.1 mM steps. The results showed that Zn²⁺, Ca²⁺, Mn²⁺, and Fe²⁺ increased lipase activity by up to 3.0-, 3.0-, 1.0-, and 26.0-fold, respectively. Lipase activity was partially inhibited by Na⁺ and Mg²⁺, with up to 88.5% and 83.7% loss of activity, respectively, and by K⁺, with up to 56.7% loss of activity, compared to that in the absence of metal ions. The study concluded that lipase produced by Aspergillus niger cultured on Vitellaria paradoxa shells is activated by Zn²⁺, Ca²⁺, Mn²⁺, and Fe²⁺ and inhibited by Na⁺, K⁺, and Mg²⁺.

Keywords: Aspergillus niger, Vitellaria paradoxa, lipase, metal ions

Procedia PDF Downloads 150
4044 Online Monitoring Rheological Property of Polymer Melt during Injection Molding

Authors: Chung-Chih Lin, Chien-Liang Wu

Abstract:

Detection of the polymer melt state during the manufacturing process is regarded as an efficient way to control the quality of the molded part in advance. Online monitoring of the rheological properties of the polymer melt during processing provides a way to understand the melt state immediately. Rheological properties reflect the melt state at different processing parameters and are especially important in injection molding. This study proposes an approach that demonstrates how to calculate the rheological properties of a polymer melt through in-process measurement, using injection molding as an example. The system consists of two sensors and a data acquisition module that processes the measured data used to calculate the rheological properties of the melt. The rheological properties discussed in this study are shear rate and viscosity, which are investigated with respect to injection speed and melt temperature. The results show that the effect of injection speed on the rheological properties is apparent, especially at high melt temperature, and should be considered in precision molding processes.
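
The abstract does not detail the sensor arrangement; as a hedged illustration, the standard capillary relations below link an assumed in-process pressure drop and flow rate to wall shear stress, apparent shear rate, and apparent viscosity. All numbers are illustrative.

```python
import math

dP = 20e6     # pressure drop across the sensing section, Pa (assumed)
Q = 5e-6      # volumetric flow rate, m^3/s (assumed)
R = 1e-3      # channel radius, m (assumed)
Lc = 0.05     # channel length, m (assumed)

tau_w = dP * R / (2 * Lc)                # wall shear stress, Pa
gamma_dot = 4 * Q / (math.pi * R ** 3)   # apparent (Newtonian) wall shear rate, 1/s
eta = tau_w / gamma_dot                  # apparent viscosity, Pa.s

print(f"shear rate = {gamma_dot:.0f} 1/s, viscosity = {eta:.1f} Pa.s")
```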

Keywords: injection molding, melt viscosity, shear rate, monitoring

Procedia PDF Downloads 381
4043 E-Learning Platform for School Kids

Authors: Gihan Thilakarathna, Fernando Ishara, Rathnayake Yasith, Bandara A. M. R. Y.

Abstract:

E-learning is a crucial component of intelligent education, and even in the midst of a pandemic it is becoming increasingly important in the educational system. Several e-learning programs are available to students; here, we decided to create an e-learning framework for children. We identified a few issues that teachers face with their online classes. When there are numerous students in an online classroom, how does a teacher recognize each student's focus on academics and below-the-surface behaviors? Some kids are not paying attention in class, and others are napping; the teacher is unable to keep track of every student. A key challenge in e-learning is online exams, because students can cheat easily during them; hence, exam proctoring is needed, and we propose an automated online exam cheating detection method using a web camera. The purpose of this project is also to present an e-learning platform for math education that includes games for kids as an alternative teaching method for math students. The game is accessible via a web browser, with imagery drawn in a cartoonish style, helping students learn math through play. Everything in this day and age is moving towards automation, yet automatic answer evaluation is only available for MCQ-based questions, so a checker has a difficult time evaluating theory answers. The current system requires more manpower and takes a long time to evaluate responses; it is also possible for two identical responses to be marked differently and receive different grades. This application therefore employs machine learning techniques to provide automatic evaluation of subjective responses, based on keywords provided to the computer alongside the student input, resulting in a fair distribution of marks; in addition, it saves time and manpower. We used deep learning, machine learning, image processing, and natural language technologies to develop these research components.
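
A minimal sketch of keyword-based subjective answer scoring using TF-IDF cosine similarity; the model answer, scoring scale, and grading rule are hypothetical, and the paper's actual model is not reproduced.

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

def grade(student_answer, model_answer, max_marks=10):
    """Score a free-text answer by TF-IDF similarity to a model answer."""
    vec = TfidfVectorizer(stop_words="english")
    tfidf = vec.fit_transform([model_answer, student_answer])
    sim = cosine_similarity(tfidf[0], tfidf[1])[0, 0]   # 0..1
    return round(sim * max_marks, 1)

model = "A prime number has exactly two divisors, one and itself."
print(grade("A prime number is divisible only by one and itself.", model))
print(grade("It is a kind of triangle.", model))   # low similarity, low marks
```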

Keywords: math, education games, e-learning platform, artificial intelligence

Procedia PDF Downloads 156
4042 An Investigation into Computer Vision Methods to Identify Material Other Than Grapes in Harvested Wine Grape Loads

Authors: Riaan Kleyn

Abstract:

Mass wine production companies across the globe are provided with grapes from winegrowers that predominantly utilize mechanical harvesting machines to harvest wine grapes. Mechanical harvesting accelerates the rate at which grapes are harvested, allowing grapes to be delivered faster to meet the demands of wine cellars. The disadvantage of the mechanical harvesting method is the inclusion of material-other-than-grapes (MOG) in the harvested wine grape loads arriving at the cellar which degrades the quality of wine that can be produced. Currently, wine cellars do not have a method to determine the amount of MOG present within wine grape loads. This paper seeks to find an optimal computer vision method capable of detecting the amount of MOG within a wine grape load. A MOG detection method will encourage winegrowers to deliver MOG-free wine grape loads to avoid penalties which will indirectly enhance the quality of the wine to be produced. Traditional image segmentation methods were compared to deep learning segmentation methods based on images of wine grape loads that were captured at a wine cellar. The Mask R-CNN model with a ResNet-50 convolutional neural network backbone emerged as the optimal method for this study to determine the amount of MOG in an image of a wine grape load. Furthermore, a statistical analysis was conducted to determine how the MOG on the surface of a grape load relates to the mass of MOG within the corresponding grape load.
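
A sketch of the inference step under stated assumptions: a Mask R-CNN with a ResNet-50 backbone (as named above) fine-tuned elsewhere on annotated grape-load images; the checkpoint path and the class indices (1 = grapes, 2 = MOG) are hypothetical.

```python
import torch
import torchvision

MOG_CLASS, SCORE_THRESH = 2, 0.5

# Hypothetical fine-tuned model: background, grapes, MOG -> 3 classes.
model = torchvision.models.detection.maskrcnn_resnet50_fpn(num_classes=3)
model.load_state_dict(torch.load("mog_maskrcnn.pth"))   # hypothetical weights
model.eval()

image = torch.rand(3, 800, 800)     # placeholder for a real load photo tensor
with torch.no_grad():
    out = model([image])[0]

# Union of confident MOG instance masks, then surface-area fraction.
keep = (out["labels"] == MOG_CLASS) & (out["scores"] > SCORE_THRESH)
mog_pixels = (out["masks"][keep, 0] > 0.5).any(dim=0).sum().item()
fraction = mog_pixels / (image.shape[1] * image.shape[2])
print(f"MOG surface fraction: {fraction:.2%}")
```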

Keywords: computer vision, wine grapes, machine learning, machine harvested grapes

Procedia PDF Downloads 94
4041 Characteristics of Himalayan Glaciers with Lakes, Kosi Sub-Basin, Ganga Basin: Based on Remote Sensing and GIS Techniques

Authors: Ram Moorat Singh, Arun Kumar Sharma, Ravi Chaurey

Abstract:

An assessment of the characteristics of Himalayan glaciers with and without glacier lakes was carried out for 1937 glaciers of the Kosi sub-basin, Ganga basin, using remote sensing and GIS techniques. Analysis of IRS-P6 AWiFS data of the 2004–07 period, the SRTM DEM, and MODIS Land Surface Temperature (LST) data (15-year mean) using image processing and GIS tools provided significant information on various glacier parameters: glacier area, length, width, ice-exposed area, debris-covered area, slope, orientation, elevation, and temperature. The 119 supraglacial lakes and 62 moraine-dammed/peri-glacial lakes (area > 0.02 km²) in the study area were examined to discern the glacier conditions favourable for lake formation. The analysis shows that glacial lakes form preferentially in association with glaciers of large dimensions (area, length, and width), glaciers with a higher percentage of ice-exposed area and a lower percentage of debris-covered area, and, in general, glaciers with a mean elevation greater than 5300 m amsl. Analysis by lake type shows that moraine-dammed lakes form in association with glaciers located at relatively higher altitudes than glaciers with supraglacial lakes. Analysis of the frequency of occurrence of lakes with respect to glacier orientation shows that more glacier lakes form in association with glaciers oriented towards the south, southeast, southwest, east, and west. Supraglacial lakes form in association with glaciers having a higher mean temperature than those with moraine-dammed lakes, as verified using 15 years of LST data (2000–2014).

Keywords: remote sensing, supra glacial lake, Himalaya, Kosi sub-basin, glaciers, moraine-dammed lake

Procedia PDF Downloads 379
4040 Efficiency of a Heat-Based Technology without Combustion for Remediation of Soil Contaminated by Petroleum

Authors: Gavin Hutama Farandiarta, Hegi Adi Prabowo, Istiara Rizqillah Hanifah, Millati Hanifah Saprudin, Raden Iqrafia Ashna

Abstract:

The increase in the rate of petroleum consumption encourages industries to optimize and expand the processing of crude oil into petroleum products. Although the results bring many benefits to people worldwide, they also have a negative impact on the environment. One such impact of processing crude oil is contamination of the soil by petroleum sewage sludge. This sludge contains hydrocarbon compounds, quantified as Total Petroleum Hydrocarbons (TPH), and is classed as hazardous and toxic waste. Soil contamination caused by petroleum sludge is very hard to remove. However, soil contaminated by petroleum sludge can be treated by using heat (thermal desorption) in the remediation process. Several factors affect the success rate of heat-assisted remediation: temperature, time, and air pressure in the desorption column. Remediation using heat is a highly effective, cheap, and environmentally friendly alternative for recovering soil from petroleum pollution, producing uncontaminated soil and petroleum that can be used again.

Keywords: petroleum sewage sludge, remediation soil, thermal desorption, total petroleum hydrocarbon (TPH)

Procedia PDF Downloads 247
4039 Artificial Intelligence and Distributed System Computing: Application and Practice in Real Life

Authors: Lai Junzhe, Wang Lihao, Burra Venkata Durga Kumar

Abstract:

In recent years, owing to global technological advances, big data and artificial intelligence technologies have been widely used in various industries and fields, playing an important role in reducing costs and increasing efficiency. Artificial intelligence, through its continuous progress and the continuous work of computing practitioners, has given rise to a further branch: distributed artificial intelligence computing systems. Distributed AI is a method for solving complex learning, decision-making, and planning problems, characterized by the ability to exploit large-scale computation and the spatial distribution of resources; accordingly, it can handle problems with large data sets. Nowadays, distributed AI is widely used in military, medical, and everyday applications, bringing great convenience and efficient operation to daily life. In this paper, we discuss three areas of distributed AI computing systems, namely vision processing, blockchain, and the smart home, to introduce the performance of distributed systems and the role of AI in them.

Keywords: distributed system, artificial intelligence, blockchain, IoT, visual information processing, smart home

Procedia PDF Downloads 113
4038 Exploring the Intersection Between the General Data Protection Regulation and the Artificial Intelligence Act

Authors: Maria Jędrzejczak, Patryk Pieniążek

Abstract:

The European legal reality is on the eve of significant change. In European Union law, there is talk of a “fourth industrial revolution”, driven by massive data resources linked to powerful algorithms and powerful computing capacity. The above is closely linked to technological developments in the area of artificial intelligence, which have prompted analysis covering the legal environment as well as the economic and social impact, also from an ethical perspective. The discussion on the regulation of artificial intelligence is one of the most serious and most widely held at both European Union and Member State level. The literature expects legal solutions to guarantee security for fundamental rights, including privacy, in artificial intelligence systems. There is no doubt that personal data have been increasingly processed in recent years; it would be impossible for artificial intelligence to function without processing large amounts of data (both personal and non-personal). The main driving force behind the current development of artificial intelligence is advances in computing, but also the increasing availability of data. High-quality data are crucial to the effectiveness of many artificial intelligence systems, particularly when using techniques involving model training. The use of computers and artificial intelligence technology allows for an increase in the speed and efficiency of the actions taken, but also creates security risks for the processed data of an unprecedented magnitude. The proposed regulation in the field of artificial intelligence requires analysis in terms of its impact on the personal data protection regulation. It is necessary to determine the mutual relationship between these regulations and which areas of personal data protection regulation are particularly important for the processing of personal data in artificial intelligence systems. The adopted axis of consideration is a preliminary assessment of two issues: 1) which principles of data protection should be applied when processing personal data in artificial intelligence systems, and 2) how liability for personal data breaches is regulated in such systems. The need to change the regulations regarding the rights and obligations of data subjects and entities processing personal data cannot be excluded. It is possible that changes will be required in the provisions regarding the assignment of liability for a breach of personal data protection processed in artificial intelligence systems. The research process in this case concerns the identification of areas in the field of personal data protection that are particularly important (and may require re-regulation) due to the introduction of the proposed legal regulation regarding artificial intelligence. The main question that the authors want to answer is how the European Union's regulation against data protection breaches in artificial intelligence systems is shaping up. The answer to this question will include examples to illustrate the practical implications of these legal regulations.

Keywords: data protection law, personal data, AI law, personal data breach

Procedia PDF Downloads 65
4037 A Neural Network Based Clustering Approach for Imputing Multivariate Values in Big Data

Authors: S. Nickolas, Shobha K.

Abstract:

The treatment of incomplete data is an important step in data pre-processing. Missing values create a noisy environment in all applications and are an unavoidable problem in big data management and analysis. Numerous techniques, like discarding rows with missing values, mean imputation, expectation maximization, neural networks with evolutionary algorithms or optimized techniques, and hot deck imputation, have been introduced by researchers for handling missing data. Among these, imputation techniques play a positive role in filling in missing values when it is necessary to use all records in the data rather than discard records with missing values. In this paper we propose a novel artificial neural network based clustering algorithm, Adaptive Resonance Theory-2 (ART2), for the imputation of missing values in mixed-attribute data sets. ART2 can recognize learned models quickly and adapt to new objects rapidly; it carries out model-based clustering using competitive learning and a self-stabilizing mechanism in a dynamic environment without supervision. The proposed approach not only imputes the missing values but also provides information about handling outliers.
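
ART2 itself uses competitive learning with a vigilance test; as a compact stand-in for the general cluster-then-impute idea, the sketch below substitutes k-means (plainly not the paper's algorithm) and fills missing entries with cluster-centroid values.

```python
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 4))
X[rng.random(X.shape) < 0.1] = np.nan         # inject 10% missing values

# Provisional mean fill so every row can be assigned to a cluster.
col_means = np.nanmean(X, axis=0)
filled = np.where(np.isnan(X), col_means, X)

km = KMeans(n_clusters=3, n_init=10, random_state=0).fit(filled)

# Replace each missing entry with its cluster's centroid value.
imputed = X.copy()
miss_r, miss_c = np.where(np.isnan(X))
imputed[miss_r, miss_c] = km.cluster_centers_[km.labels_[miss_r], miss_c]
print("remaining NaNs:", np.isnan(imputed).sum())   # -> 0
```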

Keywords: ART2, data imputation, clustering, missing data, neural network, pre-processing

Procedia PDF Downloads 274