Search results for: query processing
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 3741

2691 Digi-Buddy: A Smart Cane with Artificial Intelligence and Real-Time Assistance

Authors: Amaladhithyan Krishnamoorthy, Ruvaitha Banu

Abstract:

Vision is considered the most important human sense, without which leading a normal life can often be difficult. Many existing smart canes for the visually impaired use an ultrasonic transducer for obstacle detection to help users navigate. Although a basic smart cane increases user safety, it does not fill the void left by visual loss. This paper introduces Digi-Buddy, an evolved smart cane for the visually impaired. The cane consists of several modules: beyond basic obstacle detection, Digi-Buddy captures video/images with a wide-angled camera and streams them to a server, which detects objects using a deep convolutional neural network. In addition to identifying each object, the distance to the object is measured by the ultrasonic transducer. A sound generation application, built with the help of natural language processing, converts the detected objects into audio, and the object's name is relayed to the user through Bluetooth earphones. Object detection is extended to facial recognition, which matches the faces of people the user meets against a database of face images and alerts the user about the person. Another crucial function is an automatic intimation alarm that is triggered when the user is in an emergency. If the user recovers within a set time, a button on the cane stops the alarm; otherwise, an automatic intimation with the user's whereabouts, obtained via GPS, is sent to friends and family. Beyond the safety and security offered by existing smart canes, the proposed concept is to be implemented as a prototype that helps the visually impaired visualize their surroundings through audio in a more amicable way.
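The ultrasonic ranging step described above can be sketched as converting an echo round-trip time into an obstacle distance. This is a minimal illustration, not the paper's implementation; the speed-of-sound constant and the alert threshold are assumptions.

```python
# Illustrative sketch of ultrasonic obstacle ranging: distance is half the
# round-trip echo path. The 343 m/s value (air at ~20 C) and the 1 m alert
# threshold are assumed for the example.

SPEED_OF_SOUND_M_S = 343.0

def echo_to_distance_m(round_trip_s: float) -> float:
    """Distance to an obstacle from an ultrasonic round-trip echo time."""
    return SPEED_OF_SOUND_M_S * round_trip_s / 2.0

def is_obstacle_near(round_trip_s: float, threshold_m: float = 1.0) -> bool:
    """Flag obstacles closer than the (assumed) alert threshold."""
    return echo_to_distance_m(round_trip_s) < threshold_m
```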

Keywords: artificial intelligence, facial recognition, natural language processing, internet of things

Procedia PDF Downloads 331
2690 Audio-Visual Co-Data Processing Pipeline

Authors: Rita Chattopadhyay, Vivek Anand Thoutam

Abstract:

Speech is the most acceptable means of communication, through which we can quickly exchange feelings and thoughts. Quite often, people can communicate orally but cannot interact or work with computers or devices. It is easier and quicker to give speech commands than to type them, and likewise easier to listen to audio played from a device than to read its output. With robotics an emerging market with applications in warehouses, the hospitality industry, consumer electronics, assistive technology, etc., speech-based human-machine interaction is becoming a lucrative feature for robot manufacturers. Considering this, the objective of this paper is to design the "Audio-Visual Co-Data Processing Pipeline," an integrated pipeline of automatic speech recognition, a natural language model for text understanding, object detection, and text-to-speech modules. Many deep learning models exist for each of these modules, but OpenVINO Model Zoo models are used because the OpenVINO toolkit covers both computer vision and non-computer-vision workloads across Intel hardware, maximizes performance, and accelerates application development. A speech command is given as input containing the target objects to be detected and the start and end times of the interval to extract from the video. Speech is converted to text using the QuartzNet automatic speech recognition model, and a summary is extracted from the text using the Generative Pre-trained Transformer-3 (GPT-3) language model. Based on the summary, essential frames are extracted from the video, and the You Only Look Once (YOLO) model detects objects in these frames. Frame numbers containing the target objects (those specified in the speech command) are saved as text. Finally, this text (the frame numbers) is converted to speech using a text-to-speech model and played from the device. The project is developed for the 80 YOLO labels, and the user can extract frames based on one or two target labels; the pipeline can easily be extended to more target labels by making appropriate changes in the object detection module. Four different speech command formats are supported by including sample examples in the GPT-3 prompt, and users can introduce a new format by adding examples of it to the prompt. This pipeline can be used in many projects, such as human-machine interfaces, human-robot interaction, and surveillance through speech commands. Any object detection project can be upgraded with this pipeline so that speech commands are given and the output is played from the device.
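The stage ordering of such a pipeline can be sketched as a chain of functions. The stubs below are placeholders for illustration only; they stand in for the QuartzNet, GPT-3, YOLO, and text-to-speech models the abstract names, and the keyword set and frame encoding are invented for the example.

```python
# Hypothetical orchestration sketch of the described stages:
# ASR -> language-model summary -> object detection -> output text.
# Each stage is a stub, not the real OpenVINO / QuartzNet / GPT-3 / YOLO model.

def transcribe(audio_command: str) -> str:
    return audio_command  # stub: a real system would run QuartzNet ASR here

def summarize(text: str) -> dict:
    # stub for GPT-3: naively pick target labels out of the command
    known_labels = {"car", "person", "dog"}  # assumed toy label set
    return {"targets": [w for w in text.lower().split() if w in known_labels]}

def detect_frames(targets, frames):
    # stub for YOLO: frames maps frame number -> labels present in the frame
    return sorted(n for n, labels in frames.items()
                  if any(t in labels for t in targets))

def run_pipeline(audio_command, frames):
    summary = summarize(transcribe(audio_command))
    hits = detect_frames(summary["targets"], frames)
    return "frames with targets: " + ", ".join(map(str, hits))
```

A real deployment would replace each stub with the corresponding model's inference call, keeping the same data flow.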

Keywords: OpenVINO, automatic speech recognition, natural language processing, object detection, text to speech

Procedia PDF Downloads 60
2689 Friction Stir Processing of the AA7075T7352 Aluminum Alloy: Microstructures, Mechanical Properties and Texture Characteristics

Authors: Roopchand Tandon, Zaheer Khan Yusufzai, R. Manna, R. K. Mandal

Abstract:

The present work describes the microstructures, mechanical properties, and texture characteristics of friction stir processed AA7075T7352 aluminum alloy. Phases were analyzed with an X-ray diffractometer (XRD) and a transmission electron microscope (TEM), along with a differential scanning calorimeter (DSC). Depth-wise microstructures and dislocation characteristics in the nugget zone of the friction stir processed specimens were studied using bright-field (BF) and weak-beam dark-field (WBDF) TEM micrographs; variations in the microstructures as well as in the dislocation characteristics were the noteworthy features found. XRD analysis shows changes in the chemistry as well as the size of the phases in the nugget and heat-affected zones (HAZ), whereas the base metal (BM) microstructures remain unaffected. High-density dislocations were noticed in the nugget regions of the processed specimen, along with the formation of dislocation contours and tangles. The ɳ′ and ɳ phases, along with the GP zones, were completely dissolved and trapped by the dislocations. These observations are corroborated by the improved mechanical and stress corrosion cracking (SCC) performance. Bulk texture and residual stress measurements were made with a Panalytical Empyrean MRD system using Co-Kα radiation. The nugget zone (NZ) displays compressive residual stress compared to the thermo-mechanically affected (TM) and heat-affected zones (HAZ). Typical f.c.c. deformation texture components (e.g., Copper, Brass, and Goss) were seen; this is attributed to the enhanced hardening and other mechanical performance of the alloy. Mechanical characterization was done using tensile tests and an Anton Paar instrumented micro-hardness tester. The yield strength improved from 89 MPa to 170 MPa, and the highest hardness was found in the nugget zone of the processed specimens.

Keywords: aluminum alloy, mechanical characterization, texture characteristics, friction stir processing

Procedia PDF Downloads 80
2688 Detecting Hate Speech And Cyberbullying Using Natural Language Processing

Authors: Nádia Pereira, Paula Ferreira, Sofia Francisco, Sofia Oliveira, Sidclay Souza, Paula Paulino, Ana Margarida Veiga Simão

Abstract:

Social media has become a platform for hate speech among its users, and thus there is an increasing need for automatic detection classifiers of offense and conflict to help decrease the prevalence of such incidents. Online communication can be used to intentionally harm someone, which is why such classifiers could be essential in social networks. One possible application of these classifiers is the automatic detection of cyberbullying. Even though identifying the aggressive language used in online interactions is important for building cyberbullying datasets, other criteria must be considered: it is fundamental to capture language that indicates an intent to harm others in a specific context of online interaction. Offense and hate speech may be the foundation of online conflicts, which have become common in social media and are an emergent research focus in machine learning and natural language processing. This study presents two Portuguese-language offense-related datasets that serve as examples for future research and extend the study of the topic. The first, the Aggressiveness dataset, is similar to other offense detection datasets. The second, the Conflicts/Attacks dataset, is novel in its use of the history of the interaction between users. Both datasets were developed in phases. First, we performed a content analysis of verbal aggression witnessed by adolescents in situations of cyberbullying. Second, we computed frequency analyses from the previous phase to gather lexical and linguistic cues used to identify potentially aggressive conflicts and attacks posted on Twitter. Third, thorough annotation of real tweets was performed by independent postgraduate educational psychologists with experience in cyberbullying research. Lastly, we benchmarked these datasets with machine learning classifiers.
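The lexical-cue idea behind the second phase can be illustrated with a toy rule-based labeler: a post is flagged when it contains cues from a cue list. The English cue words below are invented placeholders, not the Portuguese lexicon the study derived from its content analysis.

```python
# Toy sketch of cue-based aggression flagging. AGGRESSION_CUES is an
# invented placeholder list; a real system would use the lexical and
# linguistic cues gathered from the study's frequency analyses.

AGGRESSION_CUES = {"idiot", "stupid", "hate", "loser"}

def cue_score(post: str) -> int:
    """Count distinct aggression cues present in a post (crude tokenization)."""
    tokens = {t.strip(".,!?").lower() for t in post.split()}
    return len(tokens & AGGRESSION_CUES)

def label_post(post: str, threshold: int = 1) -> str:
    """Label a post 'aggressive' if it reaches the cue threshold."""
    return "aggressive" if cue_score(post) >= threshold else "neutral"
```

In practice such cue labels would only seed the annotation step; the final labels in the datasets came from human annotators.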

Keywords: aggression, classifiers, cyberbullying, datasets, hate speech, machine learning

Procedia PDF Downloads 206
2687 An Approximation of Daily Rainfall by Using a Pixel Value Data Approach

Authors: Sarisa Pinkham, Kanyarat Bussaban

Abstract:

This research aims to approximate the amount of daily rainfall by using a pixel value data approach. Daily rainfall maps from the Thailand Meteorological Department for January through December 2013 were used as the data in this study. The results showed that this approach can approximate the amount of daily rainfall with an RMSE of 3.343.
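The reported error metric (RMSE) can be computed as follows; the rainfall values in the test are made-up numbers, not the Thai 2013 data.

```python
# Root-mean-square error between estimated and observed daily rainfall.
import math

def rmse(estimated, observed):
    """RMSE = sqrt(mean of squared differences)."""
    n = len(estimated)
    return math.sqrt(sum((e - o) ** 2 for e, o in zip(estimated, observed)) / n)
```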

Keywords: daily rainfall, image processing, approximation, pixel value data

Procedia PDF Downloads 370
2686 The Impact of Legislation on Waste and Losses in the Food Processing Sector in the UK/EU

Authors: David Lloyd, David Owen, Martin Jardine

Abstract:

Introduction: European weight regulations for food products require a full understanding of the regulation guidelines to assure compliance. It is suggested that the complexity of the regulation leads to practices that result in the overfilling of food packages by food processors. Purpose: To establish current practices by food processors and the financial, sustainability, and societal impacts of ineffective food production practices on the food supply chain. Methods: An analysis of food packing controls at 10 companies across varying food categories, and quantitative research with a further 15 food processors on confidence in the weight control analysis of finished food packs within their organisation. Results: A process floor analysis of manufacturing operations focussing on 10 products found package overfill ranging from 4.8% to 20.2%. Standard deviation figures for all products showed the potential to reduce the average pack weight while still retaining the legal status of the product. In 20% of cases an automatic weight analysis machine was in situ, yet packs were still significantly overweight. Collateral impacts noted included the effect of overfill on raw material purchases and added food miles, often on a global basis, with one raw material alone creating 10,000 extra food miles due to the poor weight control of the processing unit. Case studies of a meat and a bakery product will be discussed, illustrating the impact of poor controls resulting from complex legislation; they will highlight extra energy costs in production and the impact of the extra weight on fuel usage. A risk assessment model, used primarily for food safety but adapted to identify waste and sustainability risks, will also be discussed in the presentation.
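The overfill figures quoted (4.8% to 20.2%) follow from a simple calculation: the excess of the mean pack weight over the declared weight, as a percentage of the declared weight. The weights in the test are illustrative, not the study's measurements.

```python
# Overfill as a percentage of the declared pack weight.

def overfill_percent(mean_pack_weight_g: float, declared_weight_g: float) -> float:
    """Percentage by which the mean pack weight exceeds the declared weight."""
    return (mean_pack_weight_g - declared_weight_g) / declared_weight_g * 100.0
```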

Keywords: legislation, overfill, profile, waste

Procedia PDF Downloads 384
2685 Greening the Blue: Enzymatic Degradation of Commercially Important Biopolymer Dextran Using Dextranase from Bacillus Licheniformis KIBGE-IB25

Authors: Rashida Rahmat Zohra, Afsheen Aman, Shah Ali Ul Qader

Abstract:

The commercially important biopolymer dextran is enzymatically degraded into lower-molecular-weight fractions of vast industrial potential. Various organisms are associated with dextranase production, among which fungal, yeast, and bacterial sources are used for commercial production. Dextranases are used to remove contaminating dextran in the sugar processing industry and in oral care products for the efficient removal of dental plaque. Among the hydrolytic products of dextran, isomaltooligosaccharides have a prebiotic effect in humans and reduce the cariogenic effect of sucrose in the oral cavity. Dextran derivatives produced by hydrolysis of the high-molecular-weight polymer are also conjugated with other chemical and metallic compounds for use in the pharmaceutical, fine chemical, cosmetics, and food industries. Owing to the vast applications of dextran and dextranases, the current study focused on the purification and kinetic analysis of dextranase from a newly isolated strain of Bacillus licheniformis KIBGE-IB25. Dextranase was purified 35.75-fold with a specific activity of 1405 U/mg and a molecular weight of 158 kDa. Analysis of the kinetic parameters revealed that dextranase performs optimum cleavage of low-molecular-weight dextran (5000 Da, 0.5%) at 35ºC in 15 min at pH 4.5, with a Km of 0.3738 mg/ml and a Vmax of 182.0 µmol/min. Thermal stability profiling showed that the enzyme retains 80% activity for up to 6 hours at 30-35ºC and remains 90% active at pH 4.5. In short, the dextranase reported here performs rapid cleavage of the substrate under mild operational conditions, which makes it an ideal candidate for dextran removal in the sugar processing industry and for commercial production of low-molecular-weight oligosaccharides.
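The Km and Vmax values reported describe a Michaelis-Menten rate law, which can be evaluated directly; the substrate concentration in the test is arbitrary, chosen only to check that the rate at [S] = Km equals Vmax/2.

```python
# Michaelis-Menten rate law v = Vmax * [S] / (Km + [S]) using the
# parameters reported in the abstract (Km = 0.3738 mg/ml, Vmax = 182.0 umol/min).

def michaelis_menten(s_mg_per_ml: float, vmax: float = 182.0,
                     km: float = 0.3738) -> float:
    """Reaction rate at substrate concentration [S]."""
    return vmax * s_mg_per_ml / (km + s_mg_per_ml)
```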

Keywords: Bacillus licheniformis, dextranase, gel permeation chromatography, enzyme purification, enzyme kinetics

Procedia PDF Downloads 421
2684 Executive Deficits in Non-Clinical Hoarders

Authors: Thomas Heffernan, Nick Neave, Colin Hamilton, Gill Case

Abstract:

Hoarding is the acquisition of, and failure to discard, possessions, leading to excessive clutter and significant psychological and emotional distress. From a cognitive-behavioural perspective, excessive hoarding arises from information-processing deficits, as well as from problems with emotional attachment to possessions and beliefs about the nature of possessions. In terms of information processing, hoarders have shown deficits in executive functions, including working memory, planning, inhibitory control, and cognitive flexibility. However, this previous research is often confounded by co-morbid factors such as anxiety, depression, or obsessive-compulsive disorder. The current study adopted a cognitive-behavioural approach, specifically assessing executive deficits and working memory in a non-clinical sample of hoarders compared with non-hoarders. In this study, a non-clinical sample of 40 hoarders and 73 non-hoarders (defined by the Savings Inventory-Revised) completed the Adult Executive Functioning Inventory, which measures working memory and inhibition; the Dysexecutive Questionnaire-Revised, which measures general executive function; and the Hospital Anxiety and Depression Scale, which measures mood. The participant sample was made up of unpaid young adult volunteers who were undergraduate students and who completed the questionnaires on a university campus. After no differences were observed between hoarders and non-hoarders in age, sex, or mood, hoarders reported significantly more deficits in inhibitory control and general executive function than non-hoarders, with no between-group difference in general working memory. This suggests that non-clinical hoarders have a specific difficulty with inhibitory control, the ability to resist repeated, unwanted urges, which might explain a hoarder's inability to resist urges to buy and keep items that are no longer of any practical use. These deficits may be underpinned by general executive function deficiencies.

Keywords: hoarding, memory, executive, deficits

Procedia PDF Downloads 172
2683 Processing and Economic Analysis of Rain Tree (Samanea saman) Pods for Village Level Hydrous Bioethanol Production

Authors: Dharell B. Siano, Wendy C. Mateo, Victorino T. Taylan, Francisco D. Cuaresma

Abstract:

Biofuel is one of the renewable energy sources adopted by the Philippine government in order to lessen dependency on foreign fuel and to reduce carbon dioxide emissions. Rain tree pods are a promising source of bioethanol, since they contain a significant amount of fermentable sugars. The study was conducted to establish the complete procedure for processing rain tree pods for village-level hydrous bioethanol production, from collection, drying, storage, shredding, dilution, and extraction to fermentation and distillation. The feedstock was sun-dried to a moisture content of 20% to 26% prior to storage. The dilution ratio was 1:1.25 (1 kg of pods to 1.25 L of water), the dilution period was three hours, and the extraction process yielded a sugar concentration of 22 °Bx to 24 °Bx. After three hours of diluting the samples, the juice was extracted using an extractor with a capacity of 64.10 L/hour; 150 L of rain tree pod juice was extracted and subjected to fermentation in a village-level anaerobic bioreactor. Fermentation with yeast (Saccharomyces cerevisiae) speeds up the process, producing more ethanol in a shorter time; fermentation without yeast also produces ethanol, but in a lower volume and more slowly. Distillation of the 150 L of fermented broth took six hours at a feedstock temperature of 85 °C to 95 °C and a column head temperature (vapor-state ethanol) of 74 °C to 95 °C. The highest volume of ethanol recovered, 14.89 L, was obtained with yeast fermentation over a five-day duration, and the lowest, 11.63 L, was obtained without yeast over a three-day duration. In general, the results suggest that rain tree pods have very good potential as a feedstock for bioethanol production, and fermentation of rain tree pod juice can be done with or without yeast.
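The stated dilution ratio and broth volume allow some quick arithmetic checks; the helper names and example inputs below are illustrative, not part of the study's procedure.

```python
# Arithmetic implied by the abstract: water needed at the 1:1.25 dilution
# ratio, and ethanol recovery as a percentage of the 150 L fermented broth.

def water_for_pods_l(pods_kg: float, ratio_l_per_kg: float = 1.25) -> float:
    """Litres of water required for a given mass of pods at 1:1.25."""
    return pods_kg * ratio_l_per_kg

def recovery_percent(ethanol_l: float, broth_l: float = 150.0) -> float:
    """Ethanol recovered as a percentage of the fermented broth volume."""
    return ethanol_l / broth_l * 100.0
```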

Keywords: fermentation, hydrous bioethanol, rain tree pods, village level

Procedia PDF Downloads 268
2682 Bilingual Experience Influences Different Components of Cognitive Control: Evidence from fMRI Study

Authors: Xun Sun, Le Li, Ce Mo, Lei Mo, Ruiming Wang, Guosheng Ding

Abstract:

Cognitive control plays a central role in information processing and comprises various components, including response suppression and inhibitory control. Response suppression inhibits the irrelevant response during the cognitive process, while inhibitory control inhibits the irrelevant stimulus. Both serve distinct functions within cognitive control, enhancing behavioral performance. Among the numerous factors affecting cognitive control, bilingual experience is substantial and indispensable. It has been reported that bilingual experience influences the neural activity of cognitive control as a whole; however, it remains unknown how these neural influences present on the individual components of cognitive control. To explore this issue, the study applied fMRI with an anti-saccade paradigm and compared cerebral activations between high- and low-proficiency Chinese-English bilinguals. The study thereby provides experimental evidence on the brain plasticity of language and a necessary basis for understanding the interplay between language and cognitive control. The results showed that response suppression recruited the middle frontal gyrus (MFG) in low-proficiency Chinese-English bilinguals but the inferior parietal lobe in high-proficiency bilinguals. Inhibitory control engaged the superior temporal gyrus (STG) and middle temporal gyrus (MTG) in low-proficiency bilinguals, whereas the right insula cortex was more active in high-proficiency bilinguals. These findings provide insights into how bilingual experience has neural influences on different components of cognitive control: compared with low-proficiency bilinguals, high-proficiency bilinguals activate more advanced neural areas for cognitive control. In addition, with the acquisition and accumulation of language, language experience affects brain plasticity and changes the neural basis of cognitive control.

Keywords: bilingual experience, cognitive control, inhibition control, response suppression

Procedia PDF Downloads 468
2681 Fold and Thrust Belts Seismic Imaging and Interpretation

Authors: Sunjay

Abstract:

Plate tectonics is of great significance, as it represents the spatial relationships of volcanic rock suites at plate margins, the distribution in space and time of the conditions of different metamorphic facies, the scheme of deformation in mountain belts, or orogens, and the association of different types of economic deposit. Orogenic belts are characterized by extensive thrust faulting, movements along large strike-slip fault zones, and extensional deformation that occurs deep within continental interiors. Within oceanic areas there are also regions of crustal extension and accretion in the back-arc basins located on the landward sides of many destructive plate margins. Collisional orogens develop where a continent or island arc collides with a continental margin as a result of subduction; collisional and non-collisional orogens can be explained by differences in the strength and rheology of the continental lithosphere and by processes that influence these properties during orogenesis. Seismic imaging difficulties: in triangle zones, several factors reduce the effectiveness of seismic methods. The topography in the central part of the triangle zone is usually rugged and is associated with near-surface velocity inversions which degrade the quality of the seismic image. These characteristics lead to a low signal-to-noise ratio, inadequate penetration of energy through the overburden, poor geophone coupling with the surface, and wave scattering. Depth seismic imaging techniques: seismic processing alters the seismic data to suppress noise, enhance the desired signal (higher signal-to-noise ratio), and migrate seismic events to their appropriate locations in space and depth. Processing steps generally include velocity analysis, static corrections, moveout corrections, stacking, and migration. In exploration seismology, shadow zones are areas with no reflections (dead areas), common in the vicinity of faults and other discontinuous areas in the subsurface; they result when energy from a reflector is focused on receivers that produce other traces, so reflectors are not shown in their true positions. Diffractions occur at discontinuities in the subsurface, such as faults and velocity discontinuities (as at "bright spot" terminations), and the bow-tie effect is caused by deep-seated synclines. Further topics include seismic imaging of thrust faults and structural damage in deepwater thrust belts; imaging deformation in submarine thrust belts using seismic attributes; imaging thrust and fault zones using 3D seismic image processing techniques; checking balanced structural cross sections against seismic interpretation pitfalls, which can originate from any or all of the limitations of data acquisition, processing, and interpretation of the subsurface geology; and pitfalls and limitations in the seismic attribute interpretation of tectonic features. Seismic attributes are routinely used to accelerate and quantify the interpretation of tectonic features in 3D seismic data: coherence (or variance) cubes delineate the edges of megablocks and faulted strata, curvature delineates folds and flexures, while spectral components delineate lateral changes in thickness and lithology. These methods also support carbon capture and geological storage leakage surveillance, because a fault can behave as a seal or as a conduit for hydrocarbon transportation to a trap.
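One of the processing steps listed (moveout correction) can be sketched with the standard normal-moveout relation t(x) = sqrt(t0² + x²/v²) for a flat reflector. The offset and velocity values in the test are illustrative, not from any survey in the abstract.

```python
# Normal moveout (NMO) for a flat reflector: traveltime at offset x given
# zero-offset time t0 and stacking velocity v, plus the correction (time
# shift) removed to flatten the event before stacking.
import math

def nmo_traveltime(t0_s: float, offset_m: float, velocity_m_s: float) -> float:
    """Two-way traveltime t(x) = sqrt(t0^2 + (x / v)^2)."""
    return math.sqrt(t0_s ** 2 + (offset_m / velocity_m_s) ** 2)

def nmo_correction(t0_s: float, offset_m: float, velocity_m_s: float) -> float:
    """Moveout correction dt = t(x) - t0 applied before stacking."""
    return nmo_traveltime(t0_s, offset_m, velocity_m_s) - t0_s
```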

Keywords: tectonics, seismic imaging, fold and thrust belts, seismic interpretation

Procedia PDF Downloads 52
2680 Artificial Intelligence Models for Detecting Spatiotemporal Crop Water Stress in Automating Irrigation Scheduling: A Review

Authors: Elham Koohi, Silvio Jose Gumiere, Hossein Bonakdari, Saeid Homayouni

Abstract:

Water used in agricultural crops can be managed by irrigation scheduling based on soil moisture levels and plant water stress thresholds. Automated irrigation scheduling limits crop physiological damage and yield reduction. Knowledge of crop water stress monitoring approaches can be effective in optimizing the use of agricultural water, and understanding the physiological mechanisms by which crops respond and adapt to water deficit ensures sustainable agricultural management and food supply. This aim could be achieved by analyzing and diagnosing crop characteristics and their interlinkage with the surrounding environment, assessing plant functional types (e.g., leaf area and structure, tree height, rate of evapotranspiration, rate of photosynthesis), monitoring changes, and mapping irrigated areas. Calculating thresholds for soil water content parameters, crop water use efficiency, and nitrogen status makes irrigation scheduling decisions more accurate by preventing water limitations between irrigations. Combining remote sensing (RS), the Internet of Things (IoT), artificial intelligence (AI), and machine learning algorithms (MLAs) can improve measurement accuracy and automate irrigation scheduling. This paper is a review structured by surveying about 100 recent research studies to analyze varied approaches in terms of providing high spatial and temporal resolution mapping, sensor-based variable rate application (VRA) mapping, and the relation between spectral and thermal reflectance and different features of crop and soil. A further objective is to assess RS indices formed by choosing specific reflectance bands, to identify the correct spectral bands for optimizing classification techniques, and to analyze proximal optical sensors (POSs) for monitoring changes. The innovation of this paper lies in categorizing evaluation methodologies of precision irrigation (applying the right practice, at the right place, at the right time, with the right quantity), controlled by soil moisture levels and the sensitivity of crops to water stress, into pre-processing, processing (retrieval algorithms), and post-processing parts. The main idea is then to analyze the reasons for, and magnitudes of, the errors reported by recent studies in each of these three parts. Additionally, as an overall conclusion, the paper decomposes the different approaches into the optimization of indices, calibration methods for the sensors, thresholding and prediction models prone to error, and improvements in classification accuracy for mapping changes.
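The RS indices mentioned are formed from specific reflectance bands; a common example (not singled out by the abstract, used here only as an illustration) is the Normalized Difference Vegetation Index. The band reflectances in the test are invented values.

```python
# NDVI as an example of an RS index built from two reflectance bands:
# (NIR - red) / (NIR + red), ranging from -1 to 1, higher for denser
# healthy vegetation.

def ndvi(nir: float, red: float) -> float:
    """Normalized Difference Vegetation Index from NIR and red reflectance."""
    return (nir - red) / (nir + red)
```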

Keywords: agricultural crops, crop water stress detection, irrigation scheduling, precision agriculture, remote sensing

Procedia PDF Downloads 54
2679 An Improved Two-dimensional Ordered Statistical Constant False Alarm Detection

Authors: Weihao Wang, Zhulin Zong

Abstract:

Two-dimensional ordered statistical constant false alarm detection is a widely used method for detecting weak target signals in radar signal processing applications. The method is based on analyzing the statistical characteristics of the noise and clutter present in the radar signal and then using this information to set an appropriate detection threshold. In this approach, the reference cell of the unit to be detected is divided into several reference subunits. These subunits are used to estimate the noise level and adjust the detection threshold, with the aim of minimizing the false alarm rate. By using an ordered statistical approach, the method is able to effectively suppress the influence of clutter and noise, resulting in a low false alarm rate. The detection process involves a number of steps, including filtering the input radar signal to remove any noise or clutter, estimating the noise level based on the statistical characteristics of the reference subunits, and finally, setting the detection threshold based on the estimated noise level. One of the main advantages of two-dimensional ordered statistical constant false alarm detection is its ability to detect weak target signals in the presence of strong clutter and noise. This is achieved by carefully analyzing the statistical properties of the signal and using an ordered statistical approach to estimate the noise level and adjust the detection threshold. In conclusion, two-dimensional ordered statistical constant false alarm detection is a powerful technique for detecting weak target signals in radar signal processing applications. By dividing the reference cell into several subunits and using an ordered statistical approach to estimate the noise level and adjust the detection threshold, this method is able to effectively suppress the influence of clutter and noise and maintain a low false alarm rate.
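The steps above can be sketched in one dimension: sort the reference cells around the cell under test (excluding guard cells), take the k-th ordered value as the noise estimate, and scale it to form the threshold. The window size, rank k, and scale factor below are illustrative assumptions, not the paper's improved two-dimensional method.

```python
# Minimal 1-D ordered-statistic CFAR sketch. A 2-D version would draw the
# reference window from the cells surrounding the cell under test in both
# range and Doppler; the parameters here are illustrative.

def os_cfar_detect(signal, cut_index, guard=1, ref=4, k=3, scale=3.0):
    """True if the cell under test exceeds the OS-CFAR threshold."""
    # Reference cells on each side, skipping the guard cells next to the CUT.
    left = signal[max(0, cut_index - guard - ref): cut_index - guard]
    right = signal[cut_index + guard + 1: cut_index + guard + 1 + ref]
    reference = sorted(left + right)
    # k-th ordered statistic as the noise estimate (clamped for short windows).
    noise_estimate = reference[min(k, len(reference) - 1)]
    return signal[cut_index] > scale * noise_estimate
```

Choosing an ordered statistic rather than the mean is what gives the method its robustness to clutter edges and interfering targets inside the reference window.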

Keywords: two-dimensional, ordered statistical, constant false alarm, detection, weak target signals

Procedia PDF Downloads 58
2678 A Comparative Study of European Terrazzo and Tibetan Arga Floor Making Techniques

Authors: Hubert Feiglstorfer

Abstract:

The technique of making terrazzo has been known since ancient times: during the Roman Empire as opus signinum, at the time of the Renaissance as composto terrazzo marmorino, and at the turn of the 19th and 20th centuries terrazzo came into common use across Europe. In Asia, especially in the Himalayas and the Tibetan highlands, a particular floor- and roof-making technique known as arga has been in common use for about 1500 years. The research question of this contribution concerns the technical and cultural-historical synergies of these floor-making techniques. The making of an arga floor shows constructive parallels to European terrazzo; surface processing by grinding, burnishing, and sealing, in particular, reveals technological similarities. The floor structure itself, on the other hand, shows differences, such as the use of hydraulic aggregate in terrazzo, whereas the arga floor is made without hydraulic material; yet the result of both techniques is a tight, water-repellent, and shiny surface. In this comparative study, the materials, processing techniques, and quality features of the two techniques are compared, and parallels and differences are analysed. In addition to text and archive research, the methods used include material analyses and ethnographic research such as participant observation. Major findings of the study are the investigation of the mineralogical composition of arga floors and its comparison with terrazzo floors. The study of the cultural-historical context in which both techniques are embedded gives insight into technical developments in Europe and Asia, their parallels and differences. Synergies from this comparison cast possible technological developments in the production, conservation, and renovation of European terrazzo floors in a new light: by making arga-style floors without cement-based aggregates, historical floors could be renovated from purely natural products and without the energy consumption of a firing process.

Keywords: European and Asian crafts, material culture, floor making technology, terrazzo, arga, Tibetan building traditions

Procedia PDF Downloads 209
2677 Assessment of Image Databases Used for Human Skin Detection Methods

Authors: Saleh Alshehri

Abstract:

Human skin detection is a vital step in many applications, some of them critical, especially those related to security. This underscores the importance of a high-performance detection algorithm. To validate the accuracy of such algorithms, image databases are usually used. However, the suitability of these image databases is still questionable. It is suggested that suitability can be measured mainly by the span of the color space that the database covers. This research investigates the validity of three well-known image databases.
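The color-space-span criterion described above can be made concrete, for instance, by quantising RGB space into bins and counting the fraction of bins touched by the database's pixels. The following is a minimal sketch with synthetic images; the choice of 16 bins per channel is an arbitrary assumption, not something specified in the abstract.

```python
import numpy as np

def color_space_coverage(images, bins_per_channel=16):
    """Fraction of quantised RGB bins occupied by at least one pixel.

    images: iterable of uint8 arrays of shape (H, W, 3).
    """
    occupied = np.zeros((bins_per_channel,) * 3, dtype=bool)
    step = 256 // bins_per_channel
    for img in images:
        idx = img.reshape(-1, 3) // step          # quantise each pixel
        occupied[idx[:, 0], idx[:, 1], idx[:, 2]] = True
    return occupied.sum() / occupied.size

# Synthetic example: a database of uniformly random images covers far
# more of the RGB space than a database of near-constant images.
rng = np.random.default_rng(0)
wide = [rng.integers(0, 256, (64, 64, 3), dtype=np.uint8) for _ in range(5)]
narrow = [np.full((64, 64, 3), 128, dtype=np.uint8) for _ in range(5)]
print(color_space_coverage(wide) > color_space_coverage(narrow))  # True
```

A real assessment would run this over each candidate database and compare the coverage fractions directly.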

Keywords: image databases, image processing, pattern recognition, neural networks

Procedia PDF Downloads 244
2676 Classification of ECG Signal Based on Mixture of Linear and Non-Linear Features

Authors: Mohammad Karimi Moridani, Mohammad Abdi Zadeh, Zahra Shahiazar Mazraeh

Abstract:

In recent years, the use of intelligent systems in biomedical engineering has increased dramatically, especially in the diagnosis of various diseases. Because the electrocardiogram (ECG) is relatively simple to record, it is a good tool for assessing the function of the heart and the diseases associated with it. The aim of this paper is to design an intelligent system for automatically distinguishing normal electrocardiogram signals from abnormal ones. Using this diagnostic system, a person's heart condition can be identified in a very short time and with high accuracy. The data used in this article are from the PhysioNet database, released in 2016 for use by researchers seeking the best method for distinguishing normal signals from abnormal ones. The data cover both genders, recording times vary from several seconds to several minutes, and every record is labeled normal or abnormal. Because of the limited duration of the ECG recordings and the similarity of the signal in some diseases to the normal signal, the heart rate variability (HRV) signal was used. Measuring and analyzing heart rate variability over time to evaluate the activity of the heart, and differentiating the types of heart failure from one another, is of interest to experts. In the preprocessing stage, after noise cancellation by an adaptive Kalman filter and extraction of the R wave by the Pan-Tompkins algorithm, R-R intervals were extracted and the HRV signal was generated. A new idea presented in this paper is to use, in addition to the statistical characteristics of the signal, a return map to extract nonlinear characteristics of the HRV signal, motivated by the nonlinear nature of the signal. Finally, artificial neural networks, widely used in the field of ECG signal processing, together with the distinctive features were used to classify the normal signals from the abnormal ones. 
To evaluate the efficiency of the proposed classifiers, the area under the ROC curve (AUC) was used. The results of the simulation in the MATLAB environment showed that the AUC of the MLP neural network and of the SVM was 0.893 and 0.947, respectively. The results also indicated that greater use of nonlinear characteristics in classification yields better performance. Today, research aims at quantitatively analyzing the linear and nonlinear, or deterministic and stochastic, nature of the heart rate variability signal, because it has been shown that these properties can indicate the health status of an individual's heart. The study of the nonlinear behavior and dynamics of the heart's neural control system in the short and long term provides new information on how the cardiovascular system functions and has driven further research in this field. Given that the ECG signal contains important information and is one of the common tools used by physicians to diagnose heart disease, but that some of this information remains hidden from the physician's viewpoint, the intelligent system proposed in this paper can help physicians diagnose normal and patient individuals with greater speed and accuracy, and can be used as a complementary system in treatment centers.
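As an illustration of the processing chain described above (R-R intervals to statistical and return-map features), the sketch below computes common time-domain HRV statistics and the Poincaré-map descriptors SD1/SD2 from a synthetic R-R series. The feature choice is an assumption for illustration, not the authors' exact feature set.

```python
import numpy as np

def hrv_features(rr_ms):
    """Time-domain and Poincare (return-map) features from R-R intervals (ms)."""
    rr = np.asarray(rr_ms, dtype=float)
    diff = np.diff(rr)
    sdnn = rr.std(ddof=1)                       # overall variability
    rmssd = np.sqrt(np.mean(diff ** 2))         # short-term variability
    # Return map: plot rr[n+1] against rr[n]; SD1/SD2 are the dispersions
    # across and along the identity line of that scatter.
    sd1 = np.sqrt(np.var(diff, ddof=1) / 2.0)
    sd2 = np.sqrt(2.0 * np.var(rr, ddof=1) - np.var(diff, ddof=1) / 2.0)
    return {"SDNN": sdnn, "RMSSD": rmssd, "SD1": sd1, "SD2": sd2}

# Synthetic R-R series around 800 ms (75 bpm) with mild variability.
rng = np.random.default_rng(1)
rr = 800 + rng.normal(0, 30, size=300)
feats = hrv_features(rr)
print({k: round(v, 1) for k, v in feats.items()})
```

In the paper's setting, vectors like these would be fed, together with the statistical features, to the MLP or SVM classifier.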

Keywords: heart rate variability, signal processing, linear and non-linear features, classification methods, ROC curve

Procedia PDF Downloads 239
2675 Reduction of Fermentation Duration of Cassava to Remove Hydrogen Cyanide

Authors: Jean Paul Hategekimana, Josiane Irakoze, Eugene Niyonzima, Annick Ndekezi

Abstract:

Cassava (Manihot esculenta Crantz) is a root crop containing an anti-nutritive factor known as cyanide. The compound can be removed by numerous processing methods, such as boiling, fermentation, blanching, and sun drying, to avoid the possibility of cyanide poisoning; inappropriate processing can lead to disease and death. Cassava-based dishes are consumed in different ways where cassava is cultivated, according to culture and preference, but they have been shown to be unsafe at high cyanide levels. The current study aimed to address the problem of high cyanide in cassava consumed in Rwanda. The study determined the effect of slicing, blanching, and soaking time on the fermentation duration required for hydrogen cyanide (HCN, in mg/g) removal. Cassava was sliced into three different portions (1 cm, 2 cm, and 5 cm). The first portions were naturally fermented for seven days, with each portion removed every 24 hours from the soaking tanks, oven dried at a temperature of 60°C, and milled to obtain naturally fermented cassava flours. Other portions of 1 cm, 2 cm, and 5 cm were blanched for 2, 5, and 10 min, respectively, and each was similarly dried at 60°C and milled to produce blanched cassava flour. Other blanched portions followed the previous fermentation steps. The last portions, which formed the control, were simply chopped. Cyanide content and starch content in mg/100g were investigated. 
The analysis of the different detoxification treatments showed that conventional fermentation can be used, with slicing into smaller portions easing the diffusion of hydrogen cyanide out of the root; fermentation was then complete in four days and reduced total hydrogen cyanide by a significant (p<0.05) 94.44%, to a level safe for consumption. More effective, and therefore recommended, is blanching combined with fermentation, which completed hydrogen cyanide removal in three days with a significant (p<0.05) reduction of 95.56%, again to a safe level of consumption.

Keywords: cassava, cyanide, blanching, drying, fermentation

Procedia PDF Downloads 38
2674 The Role of Artificial Intelligence Algorithms in Psychiatry: Advancing Diagnosis and Treatment

Authors: Netanel Stern

Abstract:

Artificial intelligence (AI) algorithms have emerged as powerful tools in the field of psychiatry, offering new possibilities for enhancing diagnosis and treatment outcomes. This article explores the utilization of AI algorithms in psychiatry, highlighting their potential to revolutionize patient care. Various AI algorithms, including machine learning, natural language processing (NLP), reinforcement learning, clustering, and Bayesian networks, are discussed in detail. Moreover, ethical considerations and future directions for research and implementation are addressed.

Keywords: AI, software engineering, psychiatry, neuroimaging

Procedia PDF Downloads 83
2673 High-Frequency Acoustic Microscopy Imaging of Pellet/Cladding Interface in Nuclear Fuel Rods

Authors: H. Saikouk, D. Laux, Emmanuel Le Clézio, B. Lacroix, K. Audic, R. Largenton, E. Federici, G. Despaux

Abstract:

Pressurized Water Reactor (PWR) fuel rods are made of ceramic pellets (e.g. UO2 or (U,Pu)O2) assembled in a zirconium cladding tube. By design, an initial gap exists between these two elements. During irradiation, both undergo transformations leading progressively to the closure of this gap. A local and non-destructive examination of the pellet/cladding interface could help identify the zones where the two materials are in contact, particularly at high burnups, when a strong chemical bonding occurs under nominal operating conditions in PWR fuel rods. The evolution of the pellet/cladding bonding during irradiation is also an area of interest. In this context, the Institute of Electronics and Systems (IES, UMR CNRS 5214), in collaboration with the Alternative Energies and Atomic Energy Commission (CEA), is developing a high-frequency acoustic microscope adapted to the control and imaging of the pellet/cladding interface with high resolution. Because the geometrical, chemical and mechanical nature of the contact interface is neither axially nor radially homogeneous, 2D images of this interface need to be acquired by this ultrasonic system with high-performance signal processing and by means of controlled displacement of the sample rod along both its axis and its circumference. Modeling the multi-layer system (water, cladding, fuel, etc.) is necessary in the present study in order to take into account all the parameters that influence the resolution of the acquired images. The first prototype of this microscope and the first results of the visualization of the inner face of the cladding will be presented in a poster in order to highlight the potential of the system, whose final objective is to be introduced into the existing bench MEGAFOX, dedicated to the non-destructive examination of irradiated fuel rods at the LECA-STAR facility in CEA-Cadarache.
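The influence of the interface state on the acoustic response can be illustrated with the normal-incidence pressure reflection coefficient computed from acoustic impedances (Z = density x sound speed). The sketch below uses rough, assumed impedance values, not figures from the paper: a closed pellet/cladding contact reflects only part of the wave, while an open gas-filled gap reflects almost totally, which is what makes the interface state observable.

```python
def reflection_coefficient(z1, z2):
    """Pressure reflection coefficient at a plane interface, normal incidence."""
    return (z2 - z1) / (z2 + z1)

# Indicative impedances in MRayl; rough literature-style values used
# purely for illustration (assumptions, not the study's data).
Z_WATER = 1.48
Z_CLADDING = 30.0    # assumed value for the zirconium-alloy cladding
Z_FUEL = 56.0        # assumed value for the oxide fuel pellet
Z_GAS = 0.0004       # assumed value for a gas-filled gap

r_water_clad = reflection_coefficient(Z_WATER, Z_CLADDING)
r_clad_fuel = reflection_coefficient(Z_CLADDING, Z_FUEL)   # closed gap
r_clad_gap = reflection_coefficient(Z_CLADDING, Z_GAS)     # open gap

print(round(r_water_clad, 3), round(r_clad_fuel, 3), round(r_clad_gap, 3))
```

The large contrast between the closed-gap and open-gap coefficients is the physical basis for imaging where pellet and cladding are in contact.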

Keywords: high-frequency acoustic microscopy, multi-layer model, non-destructive testing, nuclear fuel rod, pellet/cladding interface, signal processing

Procedia PDF Downloads 171
2672 Geological Mapping of Gabel Humr Akarim Area, Southern Eastern Desert, Egypt: Constrain from Remote Sensing Data, Petrographic Description and Field Investigation

Authors: Doaa Hamdi, Ahmed Hashem

Abstract:

The present study aims at integrating ASTER data and Landsat 8 data to discriminate and map alteration and/or mineralization zones, in addition to delineating the different lithological units of the Humr Akarim Granites area. The study area is located at 24°9' to 24°13'N and 34°1' to 34°2'45"E, covering a total exposed surface area of about 17 km². The area is characterized by rugged topography with low to moderate relief. Geologic fieldwork and petrographic investigations revealed that the basement complex of the study area is composed of metasediments, mafic dikes, older granitoids, and alkali-feldspar granites. Petrographic investigations revealed that the secondary minerals in the study area are mainly represented by chlorite, epidote, clay minerals and iron oxides. These minerals have specific spectral signatures in the visible near-infrared and short-wave infrared region (0.4 to 2.5 µm). The ASTER imagery processing was therefore concentrated on VNIR-SWIR spectrometric data in order to achieve the purposes of this study (geologic mapping of hydrothermal alteration zones and delineation of possible radioactive potentialities). Mapping of the hydrothermal alteration zones, in addition to discriminating the lithological units in the study area, is achieved through several image processing techniques, including color band composites (CBC) and data transformation techniques such as band ratios (BR), band ratio codes (BRC), principal component analysis (PCA), the Crosta technique, and minimum noise fraction (MNF). The field verification and petrographic investigation confirm the results of the ASTER imagery and Landsat 8 data, and a geological map (scale 1:50,000) is proposed.
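Two of the transformation techniques named above, band ratios and PCA, can be sketched in a few lines on a synthetic band cube. The 4-band cube and all implementation details below are illustrative assumptions, not the processing chain actually used in the study.

```python
import numpy as np

def band_ratio(num, den, eps=1e-6):
    """Per-pixel ratio of two spectral bands (used to highlight alteration)."""
    return num.astype(float) / (den.astype(float) + eps)

def principal_components(bands):
    """PCA over an (n_bands, H, W) image cube via eigen-decomposition."""
    n, h, w = bands.shape
    x = bands.reshape(n, -1).astype(float)
    x -= x.mean(axis=1, keepdims=True)
    cov = np.cov(x)                              # n x n band covariance
    vals, vecs = np.linalg.eigh(cov)
    order = np.argsort(vals)[::-1]               # strongest component first
    pcs = vecs[:, order].T @ x
    return pcs.reshape(n, h, w), vals[order]

# Synthetic 4-band cube standing in for ASTER VNIR-SWIR bands.
rng = np.random.default_rng(2)
cube = rng.random((4, 32, 32))
ratio = band_ratio(cube[1], cube[0])
pcs, variances = principal_components(cube)
print(ratio.shape, pcs.shape, variances[0] >= variances[-1])
```

In practice the band pairs for the ratios and the components retained would be chosen from the known spectral signatures of chlorite, epidote, clays and iron oxides.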

Keywords: remote sensing, petrography, mineralization, alteration detection

Procedia PDF Downloads 141
2671 High Performance Computing and Big Data Analytics

Authors: Branci Sarra, Branci Saadia

Abstract:

Because of the massive growth of data, many computer science tools have been developed to process and analyze these Big Data. High-performance computing architectures have been designed to meet the processing needs of Big Data, from the standpoint of transaction processing as well as strategic and tactical analytics. The purpose of this article is to provide a historical and global perspective on the recent trend of high-performance computing architectures, especially in relation to analytics and data mining.

Keywords: high performance computing, HPC, big data, data analysis

Procedia PDF Downloads 498
2670 Image Based Landing Solutions for Large Passenger Aircraft

Authors: Thierry Sammour Sawaya, Heikki Deschacht

Abstract:

In commercial aircraft operations, almost half of all accidents happen during the approach or landing phases. Automatic guidance and automatic landing have proven to add significant safety value in this challenging phase. This is why Airbus and ScioTeq have decided to work together to explore the capability of image-based landing solutions as additional landing aids, to further expand the possibility of performing automatic approach and landing to runways where the current guidance systems are either not fitted or not optimal. Current systems for automated landing often depend on radio signals provided by ground infrastructure at the airport or on satellite coverage. In addition, these radio signals may not always be available with the integrity and performance required for safe automatic landing. Being independent of these radio signals would widen the operational possibilities and increase the number of automated landings. Airbus and ScioTeq are joining their expertise in the field of computer vision in the European programme Clean Sky 2 Large Passenger Aircraft, in which they lead the IMBALS (IMage BAsed Landing Solutions) project. The ultimate goal of this project is to demonstrate, develop, validate and verify a certifiable automatic landing system guiding an airplane during the approach and landing phases based on an onboard camera system capturing images, enabling automatic landing independent of radio signals and without a precision landing instrument. In the frame of this project, ScioTeq is responsible for the development of the Image Processing Platform (IPP), while Airbus is responsible for defining the functional and system requirements as well as for the testing and integration of the developed equipment in a Large Passenger Aircraft representative environment. The aim of this paper is to describe the system as well as the associated methods and tools developed for its validation and verification.

Keywords: aircraft landing system, aircraft safety, autoland, avionic system, computer vision, image processing

Procedia PDF Downloads 75
2669 Optimal MRO Process Scheduling with Rotable Inventory to Minimize Total Earliness

Authors: Murat Erkoc, Kadir Ertogral

Abstract:

Maintenance, repair and overhaul (MRO) of high-cost equipment used in many industries, such as transportation, military and construction, is typically subject to regulations set by local governments or international agencies; aircraft are prime examples of this kind of equipment. Such equipment must be overhauled at certain intervals for continued permission of use, so the overhaul must be completed by strict deadlines, which often cannot be exceeded. Because the overhaul is typically a long process, MRO companies carry so-called rotable inventory for the exchange of expensive modules during the overhaul of the equipment, so that the equipment continues its service with minimal interruption. The extracted module is overhauled and returned to the inventory for a future exchange, hence the name rotable inventory. However, since the rotable inventory and overhaul capacity are limited, it may be necessary to carry out some of the exchanges earlier than their deadlines in order to produce a feasible overhaul schedule. An early exchange results in a decrease in the equipment's cycle time between overhauls and, as such, is not desired by the equipment operators. This study introduces an integer programming model for optimal overhaul and exchange scheduling. We assume that a certain number of rotables of a single module type is at hand at the beginning of the planning horizon and that there are multiple demands with known deadlines for the exchange of modules. We consider an MRO system with identical parallel processing lines. The model minimizes total earliness by generating optimal overhaul start times for rotables on the parallel processing lines and exchange timetables for the orders. We develop a fast exact solution algorithm for the model. The algorithm employs a full-delay scheduling approach with backward allocation and can easily be used for overhaul scheduling problems in various MRO settings with modular rotable items. 
The proposed procedure is demonstrated by a case study from the aerospace industry.
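The full-delay idea, allocating each overhaul backwards from its deadline on identical parallel lines, can be sketched as a simple greedy. Note that this omits the rotable-inventory constraint entirely and is only a simplified illustration of the scheduling logic, not the paper's exact algorithm.

```python
def total_earliness(deadlines, proc_time, n_lines):
    """Greedy full-delay schedule on identical parallel lines.

    Each exchange is placed as late as possible; earliness is the gap
    between its deadline and its scheduled completion time.
    """
    # Latest time at which each line can still finish a job
    # (backward allocation starts unconstrained).
    latest_finish = [float("inf")] * n_lines
    earliness = 0.0
    for d in sorted(deadlines, reverse=True):    # latest deadlines first
        i = max(range(n_lines), key=lambda k: latest_finish[k])
        completion = min(d, latest_finish[i])
        earliness += d - completion
        latest_finish[i] = completion - proc_time
    return earliness

# Three exchanges, 5-day overhauls, one processing line: the job due at
# day 8 must finish by day 5 (3 days early) and the job due at day 3 by
# day 0 (another 3 days early).
print(total_earliness([10, 8, 3], 5, 1))  # 6.0
```

Adding a second line in this toy example removes all earliness, which mirrors the capacity argument in the abstract.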

Keywords: rotable inventory, full-delay scheduling, maintenance, overhaul, total earliness

Procedia PDF Downloads 524
2668 Development of a Computer Aided Diagnosis Tool for Brain Tumor Extraction and Classification

Authors: Fathi Kallel, Abdulelah Alabd Uljabbar, Abdulrahman Aldukhail, Abdulaziz Alomran

Abstract:

The brain is an important organ in our body, responsible for most functions such as vision and memory. However, different diseases, such as Alzheimer's disease and tumors, can affect the brain and lead to partial or complete impairment. Regular diagnosis is necessary as a preventive measure and can help doctors detect possible trouble early and take the appropriate treatment, especially in the case of brain tumors. Different imaging modalities are proposed for the diagnosis of brain tumors; the most powerful and widely used modality is Magnetic Resonance Imaging (MRI). MRI images are analyzed by doctors in order to locate an eventual tumor in the brain and prescribe the needed treatment. Diverse image processing methods have also been proposed to help doctors identify and analyze the tumor. In fact, a large number of Computer Aided Diagnosis (CAD) tools, embedding dedicated image processing algorithms, have been proposed and are exploited by doctors as a second opinion to analyze and identify brain tumors. In this paper, we propose a new advanced CAD for brain tumor identification, classification and feature extraction. Our proposed CAD includes three main parts. First, the brain MRI is loaded. Second, a robust technique for brain tumor extraction is proposed, based on both the Discrete Wavelet Transform (DWT) and Principal Component Analysis (PCA). The DWT is characterized by its multiresolution analytic property, which is why it was applied to the MRI images at different decomposition levels for feature extraction. Nevertheless, this technique suffers from a main drawback, since it requires large storage and is computationally expensive. To decrease the dimension of the feature vector and the computing time, the PCA technique is applied. In the last stage, according to the extracted features, the brain tumor is classified as either benign or malignant using the Support Vector Machine (SVM) algorithm. 
A CAD tool for brain tumor detection and classification, including all the above-mentioned stages, is designed and developed using the MATLAB GUIDE user interface.
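The three-stage pipeline (DWT features, PCA reduction, classification) can be sketched as follows. A hand-rolled single-level Haar DWT and an SVD-based PCA stand in for the toolbox routines, the 1-D toy "scans" stand in for MRI data, and the final SVM step is left to a library, so all details here are illustrative assumptions rather than the paper's implementation.

```python
import numpy as np

def haar_dwt(signal):
    """One level of the Haar discrete wavelet transform."""
    x = np.asarray(signal, dtype=float)
    x = x[: len(x) // 2 * 2]                     # truncate to even length
    approx = (x[0::2] + x[1::2]) / np.sqrt(2)    # low-pass coefficients
    detail = (x[0::2] - x[1::2]) / np.sqrt(2)    # high-pass coefficients
    return approx, detail

def dwt_features(signal, levels=3):
    """Energy of the detail coefficients at each decomposition level."""
    feats, approx = [], np.asarray(signal, dtype=float)
    for _ in range(levels):
        approx, detail = haar_dwt(approx)
        feats.append(np.sum(detail ** 2))
    return np.array(feats)

def pca_reduce(features, k):
    """Project feature vectors (rows) onto the top-k principal axes."""
    x = features - features.mean(axis=0)
    _, _, vt = np.linalg.svd(x, full_matrices=False)
    return x @ vt[:k].T

# Toy data: rows would normally come from MRI images; the reduced
# vectors would then be fed to an SVM classifier.
rng = np.random.default_rng(3)
scans = rng.random((10, 256))
features = np.stack([dwt_features(s) for s in scans])
reduced = pca_reduce(features, k=2)
print(features.shape, reduced.shape)
```

The PCA step shows the abstract's point about dimensionality: 3 wavelet-energy features are compressed to 2 components before classification.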

Keywords: MRI, brain tumor, CAD, feature extraction, DWT, PCA, classification, SVM

Procedia PDF Downloads 229
2667 Strategies of Risk Management for Smallholder Farmers in South Africa: A Case Study on Pigeonpea (Cajanus cajan) Production

Authors: Sanari Chalin Moriri, Kwabena Kingsley Ayisi, Alina Mofokeng

Abstract:

Dryland smallholder farmers in South Africa are vulnerable to all kinds of risks, which negatively affect crop productivity and profit. Pigeonpea is a leguminous and multipurpose crop that provides food, fodder, and wood for smallholder farmers. The majority of these farmers are still growing pigeonpea from traditional unimproved seeds, which comprise a mixture of genotypes. The objectives of the study were to identify the key risk factors that affect pigeonpea productivity and to develop management strategies for alleviating these risk factors in pigeonpea production. The study was conducted in two provinces of South Africa (Limpopo and Mpumalanga), in six municipalities, during the 2020/2021 growing season. A non-probability sampling method using purposive and snowball sampling techniques was used to collect data from the farmers through a structured questionnaire. A total of 114 pigeonpea producers were interviewed individually. Key stakeholders in each municipality were also identified, invited, and interviewed to verify the information given by the farmers. The collected data were analyzed using SPSS version 25. The study found that the majority of the farmers affected by risk factors were women, subsistence farmers, and elderly farmers, resulting in low food production. Drought, unavailability of improved pigeonpea seeds for planting, limited access to information, and lack of processing equipment were found to be the main risk factors contributing to low crop productivity in farmers' fields. Over 80% of the farmers lack knowledge of crop improvement and of the processing techniques needed to secure high prices during the crop off-season. Market availability, pricing, and the incidence of pests and diseases were found to be minor risk factors, triggered by the major risk factors; the minor risk factors can be corrected only if the major risk factors are first given the necessary attention. 
About 10% of the farmers were found to use the crop as a mulch to reduce soil temperature and improve soil fertility. The study revealed that most of the farmers were unaware of its uses as fodder, mulch, medicine, nitrogen fixation, and more. The risk of frequent drought in the dry areas of South Africa, where farmers depend solely on rainfall, poses a serious threat to crop productivity. The majority of these risk factors are driven by climate change, as erratic, low rainfall combined with extreme temperatures threatens food security, water, and the environment. The use of drought-tolerant, multipurpose legume crops such as pigeonpea, access to new information, provision of processing equipment, and support from all stakeholders will help in addressing food security for smallholder farmers. Policies should be revisited to address the prevailing risk factors faced by farmers and involve them in addressing these factors. Awareness campaigns should be prioritized to promote the crop and improve its production and commercialization in the dryland farming systems of South Africa.

Keywords: management strategies, pigeonpea, risk factors, smallholder farmers

Procedia PDF Downloads 193
2666 Subdued Electrodermal Response to Empathic Induction Task in Intimate Partner Violence (IPV) Perpetrators

Authors: Javier Comes Fayos, Isabel Rodríguez Moreno, Sara Bressanutti, Marisol Lila, Angel Romero Martínez, Luis Moya Albiol

Abstract:

Empathy is a cognitive-affective capacity whose deterioration is associated with aggressive behaviour. Deficient affective processing is one of the predominant risk factors in men convicted of intimate partner violence (IPV perpetrators), since it makes their capacity to empathize very difficult. The objective of this study is to compare the response of electrodermal activity (EDA), as an indicator of emotionality, to an empathic induction task between IPV perpetrators and men without a history of violence. The sample was composed of 51 men who attended the CONTEXTO program, with penalties for gender violence under two years, and 47 men with no history of violence. Empathic induction was achieved through the visualization of 4 negative emotion-eliciting videos taken from a battery of emotional induction videos validated for the Spanish population. The participants were asked to actively empathize with the video characters (previously pointed out). The psychophysiological recording of the EDA was accomplished with the Vrije Universiteit Ambulatory Monitoring System (VU-AMS). A repeated-measures analysis was carried out with 10 within-subject measurements (time) and group (IPV perpetrators and non-violent men) as the between-subject factor. First, there were no significant differences between the groups in baseline EDA levels. However, a significant interaction between time and group was found, with IPV perpetrators exhibiting a lower EDA response than controls after the empathic induction task. These findings provide evidence of a subdued EDA response to an empathic induction task in IPV perpetrators with respect to men without a history of violence. The lower psychophysiological activation would therefore be indicative of difficulties in emotional processing and response, functions that are necessary for empathy. 
Consequently, the importance of addressing possible empathic difficulties in psycho-educational programs for IPV perpetrators is reinforced, with special emphasis on the affective dimension that could hinder the empathic function.

Keywords: electrodermal activity, emotional induction, empathy, intimate partner violence

Procedia PDF Downloads 176
2665 Time Temperature Dependence of Long Fiber Reinforced Polypropylene Manufactured by Direct Long Fiber Thermoplastic Process

Authors: K. A. Weidenmann, M. Grigo, B. Brylka, P. Elsner, T. Böhlke

Abstract:

In order to reduce fuel consumption, the weight of automobiles has to be reduced. Fiber reinforced polymers offer the potential to reach this aim because of their high stiffness-to-weight ratio. Additionally, the use of fiber reinforced polymers in automotive applications has to allow for economical large-scale production. In this regard, long fiber reinforced thermoplastics made by direct processing offer both mechanical performance and processability in injection moulding and compression moulding. The work presented in this contribution deals with long glass fiber reinforced polypropylene directly processed in compression moulding (D-LFT). For use in automotive applications, both the temperature and the time dependency of the material's properties have to be investigated, to fulfill performance requirements during crash as well as the demands of service temperatures ranging from -40 °C to 80 °C. To consider the influence of both temperature and time, quasistatic tensile tests have been carried out at different temperatures. These tests have been complemented by high-speed tensile tests at different strain rates. As expected, an increase in strain rate results in an increase of the elastic modulus, which correlates with an increase of the stiffness with decreasing service temperature. The results are in good accordance with results determined by dynamic mechanical analysis within the range of 0.1 to 100 Hz. The experimental results from the different testing methods were grouped and interpreted using different time-temperature shift approaches. In this regard, the Williams-Landel-Ferry approach and the kinetics-based Arrhenius approach have been used. As the theoretical shift factor follows an arctan function, an empirical approach was also taken into consideration. It could be shown that this approach best describes the time-temperature superposition for glass fiber reinforced polypropylene manufactured by D-LFT processing.
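The two classical shift approaches mentioned, Williams-Landel-Ferry and Arrhenius, can be written out directly. The sketch below uses the "universal" WLF constants and an assumed activation energy and reference temperature; in practice all of these would be fitted to the measured moduli, so the numbers are purely illustrative.

```python
import math

def wlf_shift_factor(T, T_ref, C1=17.44, C2=51.6):
    """log10 of the WLF time-temperature shift factor a_T (T in K).

    C1, C2 here are the 'universal' WLF constants; material-specific
    values would be fitted. Only meaningful for T - T_ref > -C2.
    """
    return -C1 * (T - T_ref) / (C2 + (T - T_ref))

def arrhenius_shift_factor(T, T_ref, Ea=200e3, R=8.314):
    """log10 a_T from an assumed Arrhenius activation energy Ea (J/mol)."""
    return (Ea / (R * math.log(10))) * (1.0 / T - 1.0 / T_ref)

T_REF = 296.15  # 23 degC reference temperature (an assumption)
for T_c in (0.0, 23.0, 80.0):
    T = T_c + 273.15
    print(T_c, round(wlf_shift_factor(T, T_REF), 2),
          round(arrhenius_shift_factor(T, T_REF), 2))
```

A master curve is then built by multiplying each isotherm's frequency (or time) axis by the corresponding a_T before superposing the data.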

Keywords: composite, dynamic mechanical analysis, long fibre reinforced thermoplastics, mechanical properties, time temperature superposition

Procedia PDF Downloads 181
2664 The Impact of a Prior Haemophilus influenzae Infection in the Incidence of Prostate Cancer

Authors: Maximiliano Guerra, Lexi Frankel, Amalia D. Ardeljan, Sarah Ghali, Diya Kohli, Omar M. Rashid

Abstract:

Introduction/Background: Haemophilus influenzae is present as a commensal organism in the nasopharynx of most healthy adults, from where it can spread to cause both systemic and respiratory tract infection. Pathogenic properties of this bacterium, as well as defects in host defense, may result in the spread of these bacteria throughout the body. This can result in a proinflammatory state and colonization, particularly in the lungs. Recent studies have failed to determine a link between H. influenzae colonization and prostate cancer, despite previous research demonstrating the presence of proinflammatory states in preneoplastic and neoplastic prostate lesions. Given these contradictory findings, the primary goal of this study was to evaluate the correlation between H. influenzae infection and the incidence of prostate cancer. Methods: To evaluate the association between Haemophilus influenzae infection and the later development of prostate cancer, we used data provided by a Health Insurance Portability and Accountability Act (HIPAA) compliant national database. We were afforded access to this database by Holy Cross Health, Fort Lauderdale, for the express purpose of academic research. Standard statistical methods were employed in this study, including Pearson's chi-square tests. Results: Between January 2010 and December 2019, the query resulted in 13,691 patients in each of the control and H. influenzae infected groups. The two groups were matched by age range and CCI score. In the Haemophilus influenzae infected group, the incidence of prostate cancer was 1.46%, while the incidence in the control group was 4.56%. The observed difference in cancer incidence was statistically significant (p < 2.2x10^-16). This suggests that patients with a history of H. influenzae infection have a lower risk of developing prostate cancer (OR 0.425, 95% CI: 0.382-0.472). 
Treatment bias was considered: the data were analyzed for two matched groups of 3,208 patients, an H. influenzae infected and treated group and a control group that used the same medications for a different cause. Patients infected with H. influenzae and treated had a prostate cancer incidence of 2.49%, whereas the control group incidence was 4.92%, with p < 2.2x10^-16 (OR 0.455, 95% CI: 0.526-0.754), indicating that the initial results were not due to the use of medications. Conclusion: The findings of our study reveal a statistically significant correlation between H. influenzae infection and a decreased incidence of prostate cancer. Our findings suggest that prior infection with H. influenzae may confer some degree of protection and reduce the risk of developing prostate cancer. Future research is recommended to further characterize the potential role of Haemophilus influenzae in the pathogenesis of prostate cancer.
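For readers wanting to reproduce the kind of effect measure quoted above, the sketch below computes an odds ratio with a Wald 95% confidence interval from a 2x2 contingency table. The counts are hypothetical illustrations, not the study's data.

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Odds ratio and Wald 95% CI from a 2x2 table.

    a: exposed cases, b: exposed non-cases,
    c: unexposed cases, d: unexposed non-cases.
    """
    or_ = (a * d) / (b * c)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)   # SE of log(OR)
    lo = math.exp(math.log(or_) - z * se)
    hi = math.exp(math.log(or_) + z * se)
    return or_, (lo, hi)

# Hypothetical counts (NOT the study's data): 150 cancers among 10,000
# previously infected patients vs 450 among 10,000 matched controls.
or_, (lo, hi) = odds_ratio_ci(150, 9850, 450, 9550)
print(round(or_, 3), round(lo, 3), round(hi, 3))
```

An OR below 1 with a confidence interval excluding 1, as in this toy table, is the pattern the study reports for prior H. influenzae infection.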

Keywords: Haemophilus influenzae, incidence, prostate cancer, risk

Procedia PDF Downloads 183
2663 A CORDIC Based Design Technique for Efficient Computation of DCT

Authors: Deboraj Muchahary, Amlan Deep Borah, Abir J. Mondal, Alak Majumder

Abstract:

A discrete cosine transform (DCT) is described, and a technique to compute it using the fast Fourier transform (FFT) is developed. In this work, the DCT of a finite-length sequence is obtained by incorporating the CORDIC methodology into a radix-2 FFT algorithm. The proposed methodology is simple to comprehend and maintains a regular structure, thereby reducing computational complexity. DCTs are used extensively in the area of digital signal processing for the purpose of pattern recognition, so the efficient computation of the DCT with a transparent design flow is highly desirable.
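The FFT route to the DCT can be sketched with the classic even/odd input reordering due to Makhoul. The version below uses complex twiddle multiplies at the point where a hardware design such as the one proposed here would substitute CORDIC rotations; it is a reference sketch, not the authors' architecture.

```python
import numpy as np

def dct2_via_fft(x):
    """Unnormalized DCT-II of a real sequence via a single N-point FFT."""
    x = np.asarray(x, dtype=float)
    n = len(x)
    # Makhoul reordering: even-index samples, then odd-index reversed.
    v = np.concatenate([x[0::2], x[1::2][::-1]])
    V = np.fft.fft(v)
    k = np.arange(n)
    # Twiddle rotation e^{-j*pi*k/(2N)}: the multiply a CORDIC stage
    # would implement as an iterative vector rotation.
    return 2.0 * np.real(np.exp(-1j * np.pi * k / (2 * n)) * V)

def dct2_direct(x):
    """Reference definition: X[k] = 2 * sum_n x[n] cos(pi*k*(2n+1)/(2N))."""
    x = np.asarray(x, dtype=float)
    n = len(x)
    k = np.arange(n)[:, None]
    m = np.arange(n)[None, :]
    return 2.0 * (x * np.cos(np.pi * k * (2 * m + 1) / (2 * n))).sum(axis=1)

x = np.array([1.0, 2.0, 3.0, 4.0, 5.0, 6.0, 7.0, 8.0])
print(np.allclose(dct2_via_fft(x), dct2_direct(x)))  # True
```

Replacing the direct O(N^2) cosine sum with the FFT path is what gives the O(N log N) complexity the abstract refers to.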

Keywords: DCT, DFT, CORDIC, FFT

Procedia PDF Downloads 454
2662 Operating Parameters and Costs Assessments of a Real Fishery Wastewater Effluent Treated by Electrocoagulation Process

Authors: Mirian Graciella Dalla Porta, Humberto Jorge José, Danielle de Bem Luiz, Regina de F. P. M. Moreira

Abstract:

Similar to most processing industries, fish processing produces large volumes of wastewater, which contains especially organic contaminants, salts, and oils dispersed therein. Different processes have been used for the treatment of fishery wastewaters, but the most commonly used are chemical coagulation and flotation. These techniques are well known, but sometimes the characteristics of the treated effluent do not comply with legal standards for discharge. Electrocoagulation (EC) is an electrochemical process that can be used to treat wastewaters in terms of both organic matter and nutrient removal. The process is based on the use of sacrificial electrodes, such as aluminum, iron, or zinc, that are oxidized to produce metal ions that coagulate and react with organic matter and nutrients in the wastewater. While EC processes are effective for the treatment of several types of wastewaters, applications have been limited by high energy demands and high current densities. Generally, the EC process can be performed without additional chemicals or pre-treatment, but its costs should be reduced for it to become more widely applicable. In this work, we studied the treatment of a real wastewater from the fishmeal industry by the electrocoagulation process. Removal efficiencies for chemical oxygen demand (COD), total organic carbon (TOC), turbidity, phosphorous concentration, and nitrogen concentration were determined as a function of the operating conditions, such as pH, current density, and operating time. The optimum operating conditions were determined to be an operating time of 10 minutes, a current density of 100 A/m², and an initial pH of 4.0. COD, TOC, phosphorous concentration, and turbidity removal efficiencies at the optimum operating conditions were higher than 90% for the aluminum electrode. Operating costs at the optimum conditions were calculated as US$ 0.37/m³ (US$ 0.038/kg COD) for the Al electrode.
These results demonstrate that the EC process is a promising technology for removing nutrients from fishery wastewaters, combining high nutrient-removal efficiency with acceptable operating costs.
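An operating-cost figure like the US$ 0.37/m³ quoted above typically combines electrical energy consumption with sacrificial-electrode consumption given by Faraday's law. A minimal sketch of that calculation; the cell voltage, electrode area, and unit prices below are illustrative assumptions, not values from the study:

```python
F = 96485.0      # Faraday constant, C/mol
M_AL = 26.98e-3  # molar mass of aluminium, kg/mol
Z_AL = 3         # electrons transferred per Al3+ ion

def ec_operating_cost(current_density, area, voltage, time_s, volume_m3,
                      energy_price, metal_price):
    """Estimated EC operating cost per m^3 of treated wastewater:
    electrical energy (kWh) plus Al electrode mass dissolved (Faraday's law).

    current_density in A/m^2, area in m^2, voltage in V, time_s in s,
    volume_m3 in m^3, energy_price per kWh, metal_price per kg.
    """
    current = current_density * area                  # A
    energy_kwh = voltage * current * time_s / 3.6e6   # J -> kWh
    metal_kg = current * time_s * M_AL / (Z_AL * F)   # Faraday's law
    return (energy_kwh * energy_price + metal_kg * metal_price) / volume_m3

# e.g. 100 A/m^2 on a hypothetical 0.01 m^2 electrode, 5 V cell,
# 10 min run, 1 L batch, illustrative prices
cost = ec_operating_cost(100, 0.01, 5.0, 600, 0.001, 0.10, 2.0)
```

Varying the current density and operating time in such a model shows the trade-off the abstract alludes to: higher current densities remove pollutants faster but raise both the energy and electrode terms of the cost.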

Keywords: electrocoagulation, fish, food industry, wastewater

Procedia PDF Downloads 225