Search results for: Processing
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 3582

1152 Comparison of Central Light Reflex Width-to-Retinal Vessel Diameter Ratio between Glaucoma and Normal Eyes by Using Edge Detection Technique

Authors: P. Siriarchawatana, K. Leungchavaphongse, N. Covavisaruch, K. Rojananuangnit, P. Boondaeng, N. Panyayingyong

Abstract:

Glaucoma is a disease that causes visual loss in adults. It damages the optic nerve, and its overall pathophysiology is still not fully understood. Vasculopathy may be one of the possible causes of nerve damage. Photographic imaging of retinal vessels by fundus camera during eye examination may complement clinical management. This paper presents an innovation for measuring the central light reflex width-to-retinal vessel diameter ratio (CRR) from digital retinal photographs. Using our edge detection technique, CRRs from glaucoma and normal eyes were compared to examine differences and associations. CRRs were evaluated on fundus photographs of participants from Mettapracharak (Wat Raikhing) Hospital in Nakhon Pathom, Thailand. Fifty-five photographs from normal eyes and twenty-one photographs from glaucoma eyes were included; participants with hypertension were excluded. In each photograph, CRRs from four retinal vessels, including arteries and veins in the inferotemporal and superotemporal regions, were quantified using the edge detection technique. Mean CRRs of all four retinal arteries and veins were significantly higher in persons with glaucoma than in those without (0.34 vs. 0.32, p < 0.05 for the inferotemporal vein; 0.33 vs. 0.30, p < 0.01 for the inferotemporal artery; 0.34 vs. 0.31, p < 0.01 for the superotemporal vein; and 0.33 vs. 0.30, p < 0.05 for the superotemporal artery). These results suggest that an increase in the CRRs of retinal vessels, as quantitatively measured from fundus photographs, could be associated with glaucoma.
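The paper does not detail its exact edge detection pipeline; as an illustration only, a minimal sketch of the CRR measurement on a synthetic 1-D intensity profile across a vessel (all intensity values and thresholds below are hypothetical) might look like:

```python
import numpy as np

def crr_from_profile(profile, grad_thresh=10.0):
    """Estimate the central light reflex width-to-vessel diameter ratio
    (CRR) from a 1-D intensity profile sampled across a retinal vessel."""
    g = np.gradient(profile.astype(float))
    falling = np.where(g < -grad_thresh)[0]   # vessel wall: bright -> dark
    rising = np.where(g > grad_thresh)[0]     # vessel wall: dark -> bright
    left, right = falling[0], rising[-1]      # outermost vessel edges
    diameter = right - left
    inner = profile[left + 1:right]
    # central reflex: pixels clearly brighter than the vessel interior
    reflex_width = (inner > inner.mean() + 0.5 * inner.std()).sum()
    return reflex_width / diameter

# synthetic cross-section: bright fundus background, a 40 px dark vessel
# with a 10 px bright central reflex (true ratio is 10/40 = 0.25)
profile = np.full(100, 200.0)
profile[30:70] = 100.0
profile[45:55] = 180.0
crr = crr_from_profile(profile)
```

On real fundus photographs, the profile would be sampled perpendicular to the vessel centerline, and the gradient threshold would need tuning to the image contrast.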

Keywords: glaucoma, retinal vessel, central light reflex, image processing, fundus photograph, edge detection

Procedia PDF Downloads 305
1151 Effect of Cellular Water Transport on Deformation of Food Material during Drying

Authors: M. Imran Hossen Khan, M. Mahiuddin, M. A. Karim

Abstract:

Drying is a food processing technique in which simultaneous heat and mass transfer take place from the surface to the center of the sample. Deformation of food materials during drying is a common physical phenomenon that affects the textural quality and taste of the dried product. Most plant-based food materials are porous and hygroscopic in nature and contain about 80-90% water distributed between two cellular environments: the intercellular and the intracellular environment. Transport of this cellular water has a significant effect on material deformation during drying. However, understanding the scale of deformation is very complex due to the diverse nature and structural heterogeneity of food materials. Knowledge of the effect of cellular water transport on material deformation during drying is crucial for increasing energy efficiency and obtaining better-quality dried foods. Therefore, the primary aim of this work is to investigate the effect of intracellular water transport on material deformation during drying. In this study, apple tissue was chosen for the investigation. The experiment was carried out using 1H-NMR T2 relaxometry together with a conventional dryer. The experimental results are consistent with the understanding that transport of intracellular water causes cellular shrinkage associated with anisotropic deformation of the whole apple tissue. Interestingly, it was found that the deformation of apple tissue takes place at different stages of drying rather than all at once. Moreover, it was found that the penetration rate of heat energy, together with the pressure gradient between the intracellular and intercellular environments, is the force responsible for rupturing the cell membrane.

Keywords: heat and mass transfer, food material, intracellular water, cell rupture, deformation

Procedia PDF Downloads 202
1150 Linux Security Management: Research and Discussion on Problems Caused by Different Aspects

Authors: Ma Yuzhe, Burra Venkata Durga Kumar

Abstract:

The computer is a great invention. As people use computers more and more frequently, the demand for PCs is growing, and the performance of computer hardware is rising to handle more complex processing and operations. However, the operating system, which provides the soul of the computer, stagnated for a time. Faced with the high price of UNIX (Uniplexed Information and Computing System), batch after batch of personal computer owners could only give up. The Disk Operating System was too simple to support much innovation, so it was not a good choice, and macOS is a special operating system for Apple computers that cannot be widely used on other personal computers. In this environment, Linux, based on the UNIX system, was born. Linux combines the advantages of earlier operating systems and has a relatively powerful core architecture. The Linux system supports all major Internet protocols, so it has very good networking capability. Linux supports multiple users, and each user's files are isolated from those of other users. Linux can also multitask, running different programs independently at the same time. Linux is a completely open-source operating system, so users can obtain and modify the source code for free. Because of these advantages, Linux has attracted a large number of users and programmers, and the system is constantly upgraded and improved. Many different versions have been issued, suitable for both community and commercial use. The Linux system has good security in part because it relies on a file permission system. However, as vulnerabilities and threats are constantly discovered, the security of the operating system needs continued attention. This article focuses on the analysis and discussion of Linux security issues.

Keywords: Linux, operating system, system management, security

Procedia PDF Downloads 90
1149 Classifier for Liver Ultrasound Images

Authors: Soumya Sajjan

Abstract:

Liver cancer is among the most common cancers worldwide in men and women, and is one of the few cancers still on the rise; liver disease is the fourth leading cause of death. According to new NHS (National Health Service) figures, deaths from liver diseases have reached record levels, rising by 25% in less than a decade; heavy drinking, obesity, and hepatitis are believed to be behind the rise. In this study, we focus on the development of a diagnostic classifier for ultrasound liver lesions. Ultrasound (US) sonography is an easy-to-use and widely popular imaging modality because of its ability to visualize many human soft tissues and organs without any harmful effect. This paper provides an overview of the underlying concepts, along with algorithms for processing liver ultrasound images. Ultrasound liver lesion images naturally contain considerable speckle noise, which makes developing a classifier for them a challenging task. We adopt a fully automatic machine learning approach for developing this classifier. First, we segment the liver image and calculate textural features from the co-occurrence matrix and the run-length method. For classification, a Support Vector Machine (SVM) is used, based on the risk bounds of statistical learning theory. The textural features from the different feature methods are given as input to the SVM individually. Performance analysis on the training and test datasets is carried out separately using the SVM model. Whenever an ultrasonic liver lesion image is given to the SVM classifier system, its features are calculated and the lesion is classified as normal or diseased. We hope the result will help physicians identify liver cancer non-invasively.
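The pipeline described above (co-occurrence-matrix texture features fed to an SVM) can be sketched as follows. This is a toy illustration, not the paper's implementation: the synthetic "smooth" and "rough" patches merely stand in for normal and diseased liver textures, and the feature set is reduced to two classic Haralick-style measures.

```python
import numpy as np
from sklearn.svm import SVC

def glcm_features(img, levels=8):
    """Gray-level co-occurrence matrix (horizontal neighbor, distance 1)
    reduced to two texture features: contrast and energy."""
    q = (img * (levels - 1)).astype(int)          # quantize gray levels
    glcm = np.zeros((levels, levels))
    for i, j in zip(q[:, :-1].ravel(), q[:, 1:].ravel()):
        glcm[i, j] += 1
    glcm /= glcm.sum()
    idx = np.arange(levels)
    contrast = ((idx[:, None] - idx[None, :]) ** 2 * glcm).sum()
    energy = (glcm ** 2).sum()
    return [contrast, energy]

rng = np.random.default_rng(0)
# hypothetical stand-ins for normal (smooth) vs diseased (rough) patches
smooth = [np.clip(rng.normal(0.5, 0.05, (32, 32)), 0, 1) for _ in range(20)]
rough = [rng.uniform(0, 1, (32, 32)) for _ in range(20)]
X = np.array([glcm_features(p) for p in smooth + rough])
y = np.array([0] * 20 + [1] * 20)
clf = SVC(kernel="rbf").fit(X[::2], y[::2])   # train on every other patch
acc = clf.score(X[1::2], y[1::2])             # evaluate on the rest
```

On real data, the run-length features the abstract mentions would be computed analogously and fed to the SVM as a separate feature set.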

Keywords: segmentation, support vector machine, ultrasound liver lesion, co-occurrence matrix

Procedia PDF Downloads 388
1148 Histone Deacetylase Inhibitor Valproic Acid Sensitizes Human Melanoma Cells to an Alkylating Agent and a PARP Inhibitor

Authors: Małgorzata Drzewiecka, Tomasz Śliwiński, Maciej Radek

Abstract:

The inhibition of histone deacetylases (HDACs) holds promise as a potential anti-cancer therapy because histone and non-histone protein acetylation is frequently disrupted in cancer, leading to cancer initiation and progression. Additionally, histone deacetylase inhibitors (HDACi) such as the class I HDAC inhibitor valproic acid (VPA) have been shown to enhance the effectiveness of DNA-damaging factors, such as cisplatin or radiation. In this study, we found that the use of VPA in combination with talazoparib (BMN-673, a PARP1 inhibitor, PARPi) and/or dacarbazine (DTIC, an alkylating agent) resulted in increased DNA double-strand breaks (DSBs) and reduced survival and proliferation of melanoma cells, while not affecting primary melanocytes. Furthermore, pharmacologic inhibition of class I HDACs sensitizes melanoma cells to apoptosis following exposure to DTIC and BMN-673. In addition, HDAC inhibition sensitized melanoma cells to dacarbazine and BMN-673 in melanoma xenografts in vivo. At the mRNA and protein levels, the histone deacetylase inhibitor downregulated RAD51 and FANCD2. This study shows that combining an HDACi, an alkylating agent, and a PARPi could potentially enhance the treatment of melanoma, which is known as one of the most aggressive malignant tumors. The findings presented here point to a scenario in which HDACs, by enhancing the HR-dependent repair of DSBs created during the processing of DNA lesions, are essential nodes in the resistance of malignant melanoma cells to methylating agent-based therapies.

Keywords: melanoma, HDAC, PARP inhibitor, valproic acid

Procedia PDF Downloads 58
1147 Assessment of Heavy Metals and Radionuclide Concentrations in Mafikeng Waste Water Treatment Plant

Authors: M. Mathuthu, N. N. Gaxela, R. Y. Olobatoke

Abstract:

A study was carried out to assess the heavy metal and radionuclide concentrations of water from the waste water treatment plant in Mafikeng Local Municipality in order to evaluate treatment efficiency. Ten water samples were collected from various stages of treatment, including the sewage delivered to the plant, the two treatment stages, the effluent, and the community supply. The samples were analyzed for heavy metal content using Inductively Coupled Plasma Mass Spectrometry. Gross α/β activity concentration in the water samples was evaluated by liquid scintillation counting, whereas the concentrations of individual radionuclides were measured by gamma spectroscopy. The results showed a marked reduction in heavy metal concentrations, from 3 µg/L (As)–670 µg/L (Na) in the sewage entering the plant to 2 µg/L (As)–170 µg/L (Fe) in the effluent. Beta activity was not detected in the water samples except in the incoming sewage, where its concentration was within reference limits. However, the gross α activity in all water samples (7.7-8.02 Bq/L) exceeded the 0.1 Bq/L limit set by the World Health Organization (WHO). Gamma spectroscopy revealed very high concentrations of 235U and 226Ra in the water samples, with the lowest concentrations (9.35 and 5.44 Bq/L, respectively) in the incoming sewage and the highest (73.8 and 47 Bq/L, respectively) in the community water, suggesting contamination along the water processing line. All values were considerably higher than the limits of the South African Target Water Quality Range and the WHO. However, the estimated total doses from the two radionuclides for the analyzed water samples (10.62-45.40 µSv yr-1) were all well below the WHO-recommended reference level of committed effective dose of 100 µSv yr-1.

Keywords: gross α/β activity, heavy metals, radionuclides, 235U, 226Ra, water sample

Procedia PDF Downloads 420
1146 Energy Efficient Massive Data Dissemination Through Vehicle Mobility in Smart Cities

Authors: Salman Naseer

Abstract:

One of the main challenges of operating a smart city (SC) is collecting the massive data generated from multiple data sources (DS) and transmitting them to the control units (CU) for further processing and analysis. These ever-increasing data demands not only require more and more capacity in the transmission channels but also lead to resource over-provisioning to meet resilience requirements, and thus to unavoidable waste because of data fluctuations throughout the day. In addition, the high energy consumption (EC) and carbon discharges from these data transmissions pose serious issues for the environment we live in. Therefore, to overcome the intensive EC and carbon emissions (CE) of massive data dissemination in smart cities, we propose an energy-efficient and carbon-reducing approach that utilizes the daily mobility of existing vehicles as an alternative communication channel to accommodate data dissemination in smart cities. To illustrate the effectiveness and efficiency of our approach, we take Auckland City in New Zealand as an example, assuming massive data generated by various sources geographically scattered throughout the Auckland region must be delivered to control centres located in the city centre. The numerical results show that our proposed approach can provide up to 5 times lower delay when transferring large volumes of data via the existing daily mobility of vehicles than the conventional transmission network. Moreover, our proposed approach offers about 30% less EC and CE than the conventional network transmission approach.

Keywords: smart city, delay tolerant network, infrastructure offloading, opportunistic network, vehicular mobility, energy consumption, carbon emission

Procedia PDF Downloads 120
1145 Early Depression Detection for Young Adults with a Psychiatric and AI Interdisciplinary Multimodal Framework

Authors: Raymond Xu, Ashley Hua, Andrew Wang, Yuru Lin

Abstract:

During COVID-19, the depression rate increased dramatically, and young adults are the most vulnerable to the mental health effects of the pandemic. Lower-income families have a higher rate of depression diagnosis than the general population but less access to clinics. This research aims to achieve early depression detection at low cost, large scale, and high accuracy with an interdisciplinary approach, incorporating clinical practices defined by the American Psychiatric Association (APA) as well as a multimodal AI framework. The proposed approach detects the nine depression symptoms with Natural Language Processing sentiment analysis and a symptom-based lexicon uniquely designed for young adults. The experiments were conducted on multimedia survey results from adolescents and young adults and on unbiased Twitter communications. The result was further aggregated with the facial emotional cues analyzed by a Convolutional Neural Network on the multimedia survey videos. Five experiments, each conducted on 10k data entries, reached consistent results with an average accuracy of 88.31%, higher than existing natural language analysis models. This approach can reach the 300+ million daily active Twitter users and is highly accessible to low-income populations, promoting early depression detection, raising awareness among adolescents and young adults, and revealing complementary cues to assist clinical depression diagnosis.
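The symptom-based lexicon step can be illustrated with a toy dictionary; the cue phrases and category names below are invented for the sketch, since the paper's actual young-adult lexicon is not reproduced here:

```python
# hypothetical mini-lexicon mapping symptom categories to cue phrases
LEXICON = {
    "depressed_mood": {"sad", "hopeless", "empty"},
    "anhedonia": {"no interest", "nothing is fun"},
    "sleep": {"insomnia", "can't sleep"},
    "fatigue": {"tired", "exhausted"},
    "worthlessness": {"worthless", "guilty"},
}

def symptoms_present(text):
    """Return the set of symptom categories whose cue phrases
    appear in the (lower-cased) text."""
    text = text.lower()
    return {s for s, cues in LEXICON.items() if any(c in text for c in cues)}

post = "I feel so hopeless and tired, I can't sleep at night."
hits = symptoms_present(post)
```

In the full framework, counts like these would be combined with sentiment scores and the CNN's facial-cue output before the final depression decision.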

Keywords: artificial intelligence, COVID-19, depression detection, psychiatric disorder

Procedia PDF Downloads 110
1144 Geographical Indication Protection for Agricultural Products: Contribution for Achieving Food Security in Indonesia

Authors: Mas Rahmah

Abstract:

Indonesia is the most populous Southeast Asian nation, and as its population is constantly growing, food security has become a crucial trending issue. Although Indonesia has more than enough natural resources and agricultural products to ensure food security for all, it still faces food security problems because of adverse weather conditions, an increasing population, political instability, economic factors (unemployment, rising food prices), and its dependent system of agriculture. This paper analyzes how Geographical Indication (GI) can aid in transforming Indonesia's agriculture-dependent system by tapping the unique attributes of its quality products, since Indonesia has many agricultural products with unique quality and special characteristics associated with geographical factors, such as Toraja Coffee, Alor Vanilla, Banda Nutmeg, Java Tea, Deli Tobacco, Cianjur Rice, etc. This paper argues that the reputation of agricultural products and their intrinsic quality should be protected under GI because GI provides benefits supporting the food security program. Therefore, this paper exposes the benefits of GI protection, such as increasing productivity, improving exports of GI products, creating employment, adding economic value to products, and increasing the diversity of supply of natural and unique quality products, all of which can contribute to food security. The analysis concludes that promoting GI may indirectly contribute to food security by adding value through incorporating territory-specific cultural, environmental, and social qualities into the production, processing, and development of unique local, niche, and special agricultural products.

Keywords: geographical indication, food security, agricultural product, Indonesia

Procedia PDF Downloads 351
1143 Evaluation of the Safety Status of Beef Meat During Processing at a Slaughterhouse in Bouira, Algeria

Authors: A. Ameur Ameur, H. Boukherrouba

Abstract:

In red meat slaughterhouses, a significant number of organs and carcasses are seized because of the presence of lesions of various origins. The objective of this study is to characterize and evaluate the frequency of these lesions in the slaughterhouse of the Wilaya of Bouira. Of the 2,646 cattle slaughtered and inspected, 72% of carcasses underwent no seizure, while 28% underwent at least one. The main organs seized were 325 lungs (44%), 164 livers (22%), and 149 hearts (21%); 38 kidneys (5%), 33 udders (4%), and 16 whole carcasses (2%) were seized less often. The main seizure cause was hydatid cyst for most of the seized organs, such as the lungs (64.5%), livers (51.8%), and hearts (23.2%), along with hydronephrosis for the kidneys (39.4%) and chronic mastitis (54%) for the udders. In second place, we recorded pneumonia (16%) for the lungs and chronic fascioliasis (25%) for the livers. A significant difference (p < 0.0001) was observed by sex, breed, origin, and age across all cattle that underwent a seizure, and specific seizure patterns and pathologies were recorded by breed. The local breed presented hydatid cyst (75.2%), chronic fascioliasis (95%), and pyelonephritis (60%), whereas the improved breed presented mainly respiratory lesions, including pneumonia (64%) and chronic tuberculosis (64%), as well as mastitis (76%). These results are an important step in implementing the concept of risk assessment as the scientific basis of food legislation, through the identification and characterization of the macroscopic lesions leading to withdrawals of meat, and in establishing the level of inclusion of these lesions within recommended risk assessment systems (HACCP).

Keywords: slaughterhouses, meat safety, seizure patterns, HACCP

Procedia PDF Downloads 434
1142 An Adaptive Back-Propagation Network and Kalman Filter Based Multi-Sensor Fusion Method for Train Location System

Authors: Yu-ding Du, Qi-lian Bao, Nassim Bessaad, Lin Liu

Abstract:

The Global Navigation Satellite System (GNSS) is regarded as an effective approach for replacing the large number of track-side balises used in modern train localization systems. This paper describes a method based on the fusion of data from a GNSS receiver and an odometer that can significantly improve positioning accuracy. A digital track map is needed as another sensor to project the two-dimensional GNSS position onto a one-dimensional along-track distance, since the train's position is constrained to the track. A model trained by a BP neural network is used to estimate the trend of the positioning error, which is related to the specific location and the processing of the digital track map. Considering that in some conditions satellite signal failure will increase the GNSS positioning error, a GNSS signal detection step is applied. An adaptive weighted fusion algorithm is presented to reduce the standard deviation of the train speed measurement. Finally, an Extended Kalman Filter (EKF) is used to fuse the projected 1-D GNSS positioning data and the 1-D train speed data to obtain the position estimate. Experimental results suggest that the proposed method performs well and can reduce the positioning error notably.
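A stripped-down version of the fusion idea, with the EKF reduced to a linear 1-D Kalman filter over along-track position (all noise levels and the constant-speed scenario below are made-up), might look like:

```python
import numpy as np

def fuse(gnss_pos, odo_speed, dt=1.0, q=0.1, r=4.0):
    """1-D along-track Kalman filter: the odometer speed drives the
    predict step, the (already map-projected) GNSS position drives the
    update step. q is process variance, r is GNSS measurement variance."""
    x, p = gnss_pos[0], r                       # initial state and variance
    est = [x]
    for z, v in zip(gnss_pos[1:], odo_speed[1:]):
        x, p = x + v * dt, p + q                # predict with odometer
        k = p / (p + r)                         # Kalman gain
        x, p = x + k * (z - x), (1 - k) * p     # update with GNSS fix
        est.append(x)
    return np.array(est)

rng = np.random.default_rng(1)
truth = np.cumsum(np.full(50, 10.0))            # 10 m/s constant speed
gnss = truth + rng.normal(0, 2.0, 50)           # noisy GNSS fixes (m)
odo = np.full(50, 10.0) + rng.normal(0, 0.1, 50)  # noisy odometer speed
est = fuse(gnss, odo)
```

The filtered track should sit much closer to the truth than the raw GNSS fixes; the paper's full method additionally weights the sensors adaptively and uses an EKF to handle the nonlinear map projection.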

Keywords: multi-sensor data fusion, train positioning, GNSS, odometer, digital track map, map matching, BP neural network, adaptive weighted fusion, Kalman filter

Procedia PDF Downloads 229
1141 Application of Interferometric Techniques for Quality Control of Oils Used in the Food Industry

Authors: Andres Piña, Amy Meléndez, Pablo Cano, Tomas Cahuich

Abstract:

The purpose of this project is to propose a quick and environmentally friendly alternative for measuring the quality of oils used in the food industry. There is evidence that repeated and indiscriminate use of oils in food processing causes physicochemical changes, with the formation of potentially toxic compounds that can affect the health of consumers and cause organoleptic changes. To assess the quality of oils, non-destructive optical techniques such as interferometry offer a rapid alternative to the use of reagents, relying only on the interaction of light with the oil. In this project, we used interferograms of oil samples placed under different heating conditions to establish the changes in their quality. These interferograms were obtained by means of a Mach-Zehnder interferometer using a beam from a 10 mW HeNe laser at 632.8 nm. Each interferogram was captured and analyzed, and its full width at half maximum (FWHM) was measured using the Amcap and ImageJ software. The FWHM values were organized into three groups. The average of the FWHMs of group A shows almost linear behavior; therefore, exposure time is probably not relevant when the oil is kept at constant temperature. Group B follows a slight exponential model as temperature rises between 373 K and 393 K. Student's t-test results show, at the 95% confidence level (α = 0.05), the existence of variation in the molecular composition of the two samples. Furthermore, we found a correlation between the iodine indexes (physicochemical analysis) and the interferograms (optical analysis) of group C. Based on these results, this project highlights the importance of the quality of the oils used in the food industry and shows how interferometry can be a useful tool for this purpose.
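The FWHM measurement on an intensity profile can be sketched as follows, using a synthetic Gaussian fringe in place of a captured interferogram (the profile shape and width are illustrative, not the paper's data):

```python
import numpy as np

def fwhm(x, y):
    """Full width at half maximum of a single-peaked profile,
    with linear interpolation at the two half-maximum crossings."""
    half = (y.max() + y.min()) / 2.0
    above = np.where(y >= half)[0]
    i, j = above[0], above[-1]
    # interpolate the left and right crossings of the half level
    left = x[i - 1] + (half - y[i - 1]) * (x[i] - x[i - 1]) / (y[i] - y[i - 1])
    right = x[j] + (half - y[j]) * (x[j + 1] - x[j]) / (y[j + 1] - y[j])
    return right - left

x = np.linspace(-5, 5, 1001)
y = np.exp(-x ** 2 / (2 * 1.2 ** 2))   # Gaussian fringe, sigma = 1.2
w = fwhm(x, y)                          # expect 2*sqrt(2*ln 2)*sigma
```

For a Gaussian, FWHM = 2√(2 ln 2)·σ ≈ 2.355·σ, which gives a quick sanity check on the measurement routine before applying it to real interferogram profiles.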

Keywords: food industry, interferometric, oils, quality control

Procedia PDF Downloads 354
1140 Q-Map: Clinical Concept Mining from Clinical Documents

Authors: Sheikh Shams Azam, Manoj Raju, Venkatesh Pagidimarri, Vamsi Kasivajjala

Abstract:

Over the past decade, there has been a steep rise in data-driven analysis in major areas of medicine, such as clinical decision support systems, survival analysis, patient similarity analysis, image analytics, etc. Most of the data in the field are well structured and available in numerical or categorical formats which can be used for experiments directly. But on the opposite end of the spectrum, there exists a wide expanse of data that is intractable for direct analysis owing to its unstructured nature, found in the form of discharge summaries, clinical notes, and procedural notes, which are in human-written narrative format and have neither a relational model nor a standard grammatical structure. An important step in utilizing these texts for such studies is to transform and process the data to retrieve structured information from the haystack of irrelevant data using information retrieval and data mining techniques. To address this problem, the authors present Q-Map, a simple yet robust system that can sift through massive datasets with unregulated formats to retrieve structured information aggressively and efficiently. It is backed by an effective mining technique based on a string matching algorithm, indexed on curated knowledge sources, that is both fast and configurable. The authors also briefly examine its comparative performance with MetaMap, one of the most reputed tools for medical concept retrieval, and present the advantages the former displays over the latter.
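The indexed string matching idea can be sketched with a toy vocabulary. The concept codes below are UMLS-style CUIs used for illustration; the real system indexes full curated knowledge sources, and this sketch is not the Q-Map implementation:

```python
import re

# toy excerpt of a curated vocabulary: surface form -> concept code
KNOWLEDGE = {
    "myocardial infarction": "C0027051",
    "diabetes mellitus": "C0011849",
    "hypertension": "C0020538",
}

def mine_concepts(note, max_ngram=3):
    """Slide 1..max_ngram word windows over the note and look each
    candidate phrase up in the indexed vocabulary (dict lookup is O(1))."""
    words = re.findall(r"[a-z']+", note.lower())
    found = {}
    for n in range(max_ngram, 0, -1):           # prefer longer matches
        for i in range(len(words) - n + 1):
            cand = " ".join(words[i:i + n])
            if cand in KNOWLEDGE:
                found[cand] = KNOWLEDGE[cand]
    return found

note = "History of hypertension and diabetes mellitus, no myocardial infarction."
codes = mine_concepts(note)
```

A production system would also need negation handling (note the "no" before "myocardial infarction"), abbreviation expansion, and fuzzy matching, which is where configurability matters.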

Keywords: information retrieval, unified medical language system, syntax based analysis, natural language processing, medical informatics

Procedia PDF Downloads 111
1139 Using Non-Negative Matrix Factorization Based on Satellite Imagery for the Collection of Agricultural Statistics

Authors: Benyelles Zakaria, Yousfi Djaafar, Karoui Moussa Sofiane

Abstract:

Agriculture is fundamental and remains an important sector of the Algerian economy; based on traditional techniques and structures, it is generally oriented toward consumption. The collection of agricultural statistics in Algeria is done using traditional methods, which consist of investigating land use through surveys and field investigation. These statistics suffer from problems such as poor data quality, the long delay between collection and final availability, and high cost compared to their limited use. The objective of this work is to develop a processing chain for a reliable inventory of agricultural land by developing and implementing a new method of extracting information. Indeed, this methodology allowed us to combine remote sensing data and field data to collect statistics on different land areas. The contribution of remote sensing to the improvement of agricultural statistics, in terms of area, was studied in the wilaya of Sidi Bel Abbes. It is in this context that we applied a method for extracting information from satellite images called non-negative matrix factorization (NMF), which does not consider the pixel as a single entity but looks for the components within the pixel itself. The results obtained by applying NMF were compared with field data and with the results obtained by the maximum likelihood method. The most important NMF results were close to the field data. We believe that this method of extracting information from satellite data leads to interesting results for different types of land use.
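The core decomposition can be sketched with classic Lee-Seung multiplicative updates on a synthetic two-endmember scene; the spectra and abundances below are invented, and real hyperspectral unmixing would involve many more bands and endmembers:

```python
import numpy as np

def nmf(V, k, iters=200, eps=1e-9):
    """Lee-Seung multiplicative updates: factor V ≈ W @ H with W, H ≥ 0.
    For hyperspectral unmixing, rows of H play the role of endmember
    spectra and W holds the per-pixel abundances."""
    rng = np.random.default_rng(0)
    m, n = V.shape
    W, H = rng.random((m, k)), rng.random((k, n))
    for _ in range(iters):
        H *= (W.T @ V) / (W.T @ W @ H + eps)
        W *= (V @ H.T) / (W @ H @ H.T + eps)
    return W, H

# synthetic scene: 100 mixed pixels of 2 hypothetical endmember spectra
rng = np.random.default_rng(1)
S = np.array([[1.0, 0.8, 0.1, 0.05],
              [0.1, 0.2, 0.9, 1.0]])            # 2 endmembers x 4 bands
A = rng.dirichlet([1, 1], size=100)             # per-pixel abundances
V = A @ S                                        # observed pixel spectra
W, H = nmf(V, k=2)
err = np.linalg.norm(V - W @ H) / np.linalg.norm(V)
```

The multiplicative updates keep both factors non-negative by construction, which is what lets the decomposition be read physically as abundances times spectra.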

Keywords: blind source separation, hyper-spectral image, non-negative matrix factorization, remote sensing

Procedia PDF Downloads 400
1138 Integrated Free Space Optical Communication and Optical Sensor Network System with Artificial Intelligence Techniques

Authors: Yibeltal Chanie Manie, Zebider Asire Munyelet

Abstract:

5G and 6G technologies offer enhanced quality of service with high data transmission rates, which necessitates the implementation of the Internet of Things (IoT) in 5G/6G architecture. In this paper, we propose the integration of free-space optical communication (FSO) with fiber sensor networks for IoT applications. Recently, free-space optical communication has been gaining popularity as an effective alternative to radio frequency (RF) links, whose spectrum availability is limited. FSO is attractive due to its flexibility, high achievable optical bandwidth, and low power consumption in several communication applications, such as disaster recovery, last-mile connectivity, drones, surveillance, backhaul, and satellite communications. Hence, high-speed FSO is an optimal choice for wireless networks to satisfy the full potential of 5G/6G technology, offering 100 Gbit/s or more in IoT applications. Moreover, machine learning must be integrated into the design, planning, and optimization of future optical wireless communication networks in order to realize this vision of intelligent processing and operation. In addition, fiber sensors are important for achieving real-time, accurate, and smart monitoring in IoT applications. Accordingly, we propose deep learning techniques to estimate the strain changes and peak wavelengths of multiple Fiber Bragg grating (FBG) sensors using only the FBG spectrum obtained from a real experiment.

Keywords: optical sensor, artificial Intelligence, Internet of Things, free-space optics

Procedia PDF Downloads 38
1137 Fast Return Path Planning for Agricultural Autonomous Terrestrial Robot in a Known Field

Authors: Carlo Cernicchiaro, Pedro D. Gaspar, Martim L. Aguiar

Abstract:

The agricultural sector is becoming more critical than ever in view of the expected overpopulation of the Earth. The introduction of robotic solutions in this field is an increasingly researched topic, aiming to make the most of the Earth's resources, avoid the wear and tear on the human body caused by harsh agricultural work, and open the possibility of constant, careful processing 24 hours a day. This project is realized for a terrestrial autonomous robot aimed at navigating in an orchard, collecting fallen peaches below the trees. When it receives the signal indicating low battery, it has to return to the docking station, where it replaces its battery, and then return to the last work point and resume its routine. The robot iteratively follows a preset path through orchard tree rows of variable length using the D* algorithm. In case of low battery, the D* algorithm is also used to determine the fastest return path to the docking station, as well as the way back from the docking station to the last work point. MATLAB simulations were performed to analyze the flexibility and adaptability of the developed algorithm. The simulation results show an enormous potential for adaptability, particularly in view of the irregularity of the orchard field, since it is not flat and undergoes modifications over time from fallen branches as well as other obstacles and constraints. The D* algorithm determines the best route in spite of the irregularity of the terrain. Moreover, this work shows a possible solution to improve the tracking of initial points and reduce the time between movements.
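For a static, fully known field, D* reduces to an ordinary shortest-path search; a minimal stand-in using Dijkstra on a toy 4-connected orchard grid (the layout below is invented) might look like this. D* proper additionally repairs the path incrementally when edge costs change on-line, which matters for an orchard that accumulates fallen branches.

```python
import heapq

def shortest_path(grid, start, goal):
    """Dijkstra on a 4-connected grid (0 = free, 1 = obstacle)."""
    rows, cols = len(grid), len(grid[0])
    dist, prev = {start: 0}, {}
    pq = [(0, start)]
    while pq:
        d, u = heapq.heappop(pq)
        if u == goal:
            break
        if d > dist.get(u, float("inf")):
            continue                     # stale queue entry
        r, c = u
        for v in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            if 0 <= v[0] < rows and 0 <= v[1] < cols and grid[v[0]][v[1]] == 0:
                nd = d + 1
                if nd < dist.get(v, float("inf")):
                    dist[v], prev[v] = nd, u
                    heapq.heappush(pq, (nd, v))
    path, node = [goal], goal            # walk predecessors back to start
    while node != start:
        node = prev[node]
        path.append(node)
    return path[::-1]

# toy orchard: rows of trees (1) with free lanes (0); dock at (0, 0)
grid = [[0, 0, 0, 0],
        [1, 1, 1, 0],
        [0, 0, 0, 0],
        [0, 1, 1, 1],
        [0, 0, 0, 0]]
route = shortest_path(grid, (4, 3), (0, 0))   # low-battery return to dock
```

The same search run in the opposite direction gives the path from the dock back to the last work point after the battery swap.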

Keywords: path planning, fastest return path, agricultural autonomous terrestrial robot, docking station

Procedia PDF Downloads 118
1136 Comparative Study of the Effects of Process Parameters on the Yield of Oil from Melon Seed (Cococynthis citrullus) and Coconut Fruit (Cocos nucifera)

Authors: Ndidi F. Amulu, Patrick E. Amulu, Gordian O. Mbah, Callistus N. Ude

Abstract:

A comparative analysis of the properties of melon seed and coconut fruit and their oil yields was carried out in this work using standard AOAC analytical techniques. The results revealed that the moisture contents of the samples studied are 11.15% (melon) and 7.59% (coconut), and the crude lipid contents are 46.10% (melon) and 55.15% (coconut). The treatment combinations used (leaching time, leaching temperature, and solute:solvent ratio) showed a significant difference (p < 0.05) in yield between the samples, with melon seed flour having a higher percentage range of oil yield (41.30-52.90%) than coconut (36.25-49.83%). The physical characterization of the extracted oils was also carried out. The refractive indexes are 1.487 (melon seed oil) and 1.361 (coconut oil), and the viscosities are 0.008 (melon seed oil) and 0.002 (coconut oil). The chemical analysis of the extracted oils shows acid values of 1.00 mg NaOH/g oil (melon oil) and 10.050 mg NaOH/g oil (coconut oil), and saponification values of 187.00 mg/KOH (melon oil) and 183.26 mg/KOH (coconut oil). The iodine value of the melon oil is 75.00 mg I2/g and that of coconut oil is 81.00 mg I2/g. The standard statistical package Minitab version 16.0 was used for the regression analysis and analysis of variance (ANOVA), and also to optimize the leaching process. Both samples gave their highest oil yields at the same optimal conditions: oil yield ≥ 52% (melon seed) and ≥ 48% (coconut) at a solute-solvent ratio of 40 g/ml, a leaching time of 2 hours, and a leaching temperature of 50 °C. Both samples show potential for oil production, with melon seed giving the higher yield.

Keywords: Coconut, Melon, Optimization, Processing

Procedia PDF Downloads 420
1135 High Temperature Oxidation of Additively Manufactured Silicon Carbide/Carbon Fiber Nanocomposites

Authors: Saja M. Nabat Al-Ajrash, Charles Browning, Rose Eckerle, Li Cao, Robyn L. Bradford, Donald Klosterman

Abstract:

An additive manufacturing process and subsequent pyrolysis cycle were used to fabricate SiC matrix/carbon fiber hybrid composites. The matrix was fabricated using a mixture of preceramic polymer and acrylate monomers, while polyacrylonitrile (PAN) precursor was used to fabricate fibers via electrospinning. The precursor matrix and reinforcing fibers at 0, 2, 5, or 10 wt% were printed using digital light processing, and both were simultaneously pyrolyzed to yield the final ceramic matrix composite structure. After pyrolysis, XRD and SAED analysis confirmed the existence of SiC nanocrystals and a turbostratic carbon structure in the matrix, while the reinforcement phase was shown to have a turbostratic carbon structure similar to that of commercial carbon fibers. Thermogravimetric analysis (TGA) in air up to 1400 °C was used to evaluate the oxidation resistance of this material. TGA results showed some weight loss due to oxidation of SiC and/or carbon up to about 900 °C, followed by weight gain to about 1200 °C due to the formation of a protective SiO2 layer. Although increasing carbon fiber content negatively impacted the total mass loss in the first heating cycle, exposing the composite to a second heating run in air revealed negligible weight change. This is explained by the SiO2 layer, which acts as a protective film that prevents oxygen diffusion. Oxidation of SiC and the formation of a glassy layer have been shown to protect the sample from further oxidation, as well as to heal surface cracks and defects, as revealed by SEM analysis.

Keywords: silicon carbide, carbon fibers, additive manufacturing, composite

Procedia PDF Downloads 54
1134 Identification of High-Rise Buildings Using Object Based Classification and Shadow Extraction Techniques

Authors: Subham Kharel, Sudha Ravindranath, A. Vidya, B. Chandrasekaran, K. Ganesha Raj, T. Shesadri

Abstract:

Digitization of urban features is a tedious and time-consuming process when done manually. In addition, Indian cities have complex habitat patterns and convoluted clustering patterns, which make it even more difficult to map features. This paper attempts to classify urban objects in satellite images using object-oriented classification techniques, in which various classes such as vegetation, water bodies, buildings, and shadows adjacent to the buildings were mapped semi-automatically. The building layer obtained from object-oriented classification was used along with already available building layers. The main focus, however, lay in the extraction of high-rise buildings using spatial technology, digital image processing, and modeling, which would otherwise be a very difficult task to carry out manually. Results indicated a considerable rise in the total number of buildings in the city. High-rise buildings were successfully mapped using satellite imagery and spatial technology, along with logical reasoning and mathematical considerations. The results clearly depict the ability of Remote Sensing and GIS to solve complex problems in urban scenarios, such as studying urban sprawl and identifying more complex features like high-rise buildings and multi-dwelling units. The object-oriented technique proved effective and yielded an overall efficiency of 80 percent in the classification of high-rise buildings.
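The geometric reasoning behind identifying high-rise buildings from extracted shadows can be sketched as follows; the formula is the standard shadow-length relation, not a detail given in the abstract, and the numbers are illustrative.

```python
import math

# Hedged sketch: given the sun elevation angle from the image acquisition
# metadata, a building's shadow length on flat ground converts to its height.
def height_from_shadow(shadow_len_m, sun_elevation_deg):
    # height = shadow length * tan(sun elevation)
    return shadow_len_m * math.tan(math.radians(sun_elevation_deg))

h = height_from_shadow(30.0, 45.0)   # 30 m shadow at 45 deg sun elevation
print(round(h, 1))                   # 30.0 m: a plausible high-rise
```

In practice, a height threshold on such estimates would separate high-rise buildings from the rest of the classified building layer.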

Keywords: object oriented classification, shadow extraction, high-rise buildings, satellite imagery, spatial technology

Procedia PDF Downloads 124
1133 Biochemical Characteristics and Microstructure of Ice Cream Prepared from Fresh Cream

Authors: S. Baississe, S. Godbane, A. Lekbir

Abstract:

The objective of our work is to develop an ice cream from fermented cream, skim milk and other ingredients and to follow the evolution of the physicochemical and biochemical properties and the microstructure of the products obtained. The manufacturing steps begin with homogenizing the different ingredients and heating the emulsion to 40 °C; the preparation is then subjected to a heat treatment at 65 °C for 30 min before being stored cold at 4 °C for a few hours. This storage promotes crystallization of the fat during the globular stage of maturation of the cream. The emulsifying agent is gradually adsorbed onto the surface of the homogenized fat globules, which reduces protein stability. During whipping, the collision of destabilized fat globules in the aqueous phase favours their coalescence. The stabilizing agent increases the viscosity of the aqueous phase and limits drainage through interactions with the proteins of the aqueous phase and the proteins adsorbed on the fat globules. The organoleptic properties of our cream were improved by the use of three dyes and aromas. The products obtained underwent physicochemical analyses (pH, conductivity and acidity) and biochemical analyses (moisture, % dry matter and % fat), followed by microscopic observation of the microstructure, with the resulting images analysed by image processing software. The results show a remarkable evolution of the physicochemical properties (pH, conductivity and acidity), biochemical properties (moisture, fat and non-fat content) and microstructure of the products developed in relation to the raw material (skim milk) and the intermediate product (fermented cream).

Keywords: ice cream, sour cream, physicochemical, biochemical, microstructure

Procedia PDF Downloads 185
1132 Heuristic Spatial-Spectral Hyperspectral Image Segmentation Using Bands Quartile Box Plot Profiles

Authors: Mohamed A. Almoghalis, Osman M. Hegazy, Ibrahim F. Imam, Ali H. Elbastawessy

Abstract:

This paper presents a new hyperspectral image segmentation scheme that respects both spatial and spectral contexts. The scheme uses the 8-pixel spatial pattern to build a weight structure that holds the number of outlier bands for each pixel among its neighborhood windows in different directions. The number of outlier bands for a pixel is obtained using band quartile box-plot profiles over the spatial 8-pixel pattern windows. The quartile box-plot weight structure represents the spatial-spectral context in the image. Instead of starting the segmentation process from single pixels, the proposed methodology starts from pixel groups that have been shown to share the same spectral features with respect to their spatial context. As a result, the segmentation scheme starts with jigsaw pieces that build a mosaic image. The following step builds a model for each jigsaw piece in the mosaic image. Jigsaw pieces are then merged pairwise using KNN applied to their band quartile box-plot profiles. The scheme iterates until the required number of segments is reached. Experiments used two data sets obtained from the Earth Observing-1 (EO-1) sensor for Egypt and France. Qualitative analysis of the initial results showed encouraging agreement with ground truth. Quantitative analysis of the results will be included in the final paper.
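The per-pixel "outlier band count" at the heart of the weight structure can be sketched as follows. This illustrative variant computes box-plot fences over a single 3x3 neighbourhood per band rather than the paper's multiple directional windows, and the toy data cube is made up.

```python
import numpy as np

# Hedged sketch: for each pixel, count the bands in which its value falls
# outside the quartile box-plot fences (Q1 - 1.5*IQR, Q3 + 1.5*IQR) computed
# over its 3x3 spatial neighbourhood in that band.
def outlier_band_count(cube):
    rows, cols, bands = cube.shape
    counts = np.zeros((rows, cols), dtype=int)
    for i in range(1, rows - 1):
        for j in range(1, cols - 1):
            window = cube[i-1:i+2, j-1:j+2, :].reshape(9, bands)
            q1 = np.percentile(window, 25, axis=0)
            q3 = np.percentile(window, 75, axis=0)
            iqr = q3 - q1
            lo, hi = q1 - 1.5 * iqr, q3 + 1.5 * iqr
            centre = cube[i, j, :]
            counts[i, j] = int(np.sum((centre < lo) | (centre > hi)))
    return counts

# toy cube: near-uniform background with one spectrally anomalous pixel
cube = np.ones((5, 5, 10))
cube += np.random.default_rng(1).normal(0, 0.01, cube.shape)
cube[2, 2, :] += 5.0                 # anomaly in every band
counts = outlier_band_count(cube)
print(counts[2, 2])                  # anomalous pixel flagged in most bands
```

Pixels with zero outlier bands relative to their neighbours would seed the initial "jigsaw piece" groups in the scheme described above.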

Keywords: hyperspectral image segmentation, image processing, remote sensing, box plot

Procedia PDF Downloads 581
1131 Customer Satisfaction with Artificial Intelligence-Based Service in Catering Industry: Empirical Study on Smart Kiosks

Authors: Mai Anh Tuan, Wenlong Liu, Meng Li

Abstract:

Despite warnings and concerns about the health effects of fast food, the fast-food industry remains a source of profit for the global food industry. In the face of such huge economic benefits, investors do not hesitate to continuously add recipes, processing methods, menu diversity, etc., and to apply information technology to enhance the diners' experience; the ultimate goal is still to attract diners to the brand and give them the fastest, most convenient and most enjoyable service. In China, as big data and artificial intelligence, achievements of Industry 4.0, reach new heights day by day, fast-food diners can now pay their bills instantly using only their face, through the biometric identification available on a self-ordering kiosk, without any additional form of confirmation. In this study, the authors evaluate customers' acceptance of this new form of payment through a survey of customers who have used, or witnessed the use of, smart kiosks and biometric payments in the city of Nanjing, China. A total of 200 valid responses were collected in order to test customers' intentions and feelings when choosing and experiencing payment through AI services. 55% of respondents said that the need to provide personal information bothers them, but more than 70% thought that smart kiosks bring many benefits and much convenience. According to the data analysis findings, perceived innovativeness has a positive influence on satisfaction, which in turn affects behavioral intentions, including reuse and word-of-mouth intentions.

Keywords: artificial intelligence, catering industry, smart kiosks, technology acceptance

Procedia PDF Downloads 77
1130 Natural Antioxidant Changes in Fresh and Dried Spices and Vegetables

Authors: Liga Priecina, Daina Karklina

Abstract:

Antioxidants have become among the most analyzed substances in recent decades. Antioxidants act as deactivators of free radicals. Spices and vegetables are among the major antioxidant sources; the most common antioxidants in vegetables and spices are vitamins C and E, phenolic compounds and carotenoids. It is therefore important to gain some insight into antioxidant changes in spices and vegetables during processing. In this article, nine fresh and dried spices and vegetables grown in Latvia in 2013 were analyzed: celery (Apium graveolens), parsley (Petroselinum crispum), dill (Anethum graveolens), leek (Allium ampeloprasum L.), garlic (Allium sativum L.), onion (Allium cepa), celery root (Apium graveolens var. rapaceum), pumpkin (Cucurbita maxima) and carrot (Daucus carota). Total carotenoids, total phenolic compounds and their antiradical scavenging activity were determined for all samples. Dry matter content was calculated from moisture content. After the drying process, carotenoid content decreased significantly in all analyzed samples except parsley, in which it increased. The phenolic composition differed depending on whether the sample was fresh or dried. Total phenolic, flavonoid and phenolic acid contents increased in dried spices. Flavan-3-ol content was not detected in fresh spice samples. For dried vegetables, phenolic acid content decreased significantly, but flavan-3-ol content increased. Higher antiradical scavenging activity was observed in samples with higher flavonoid and phenolic acid content.

Keywords: antiradical scavenging activity, carotenoids, phenolic compounds, spices, vegetables

Procedia PDF Downloads 246
1129 Transformation of Positron Emission Tomography Raw Data into Images for Classification Using Convolutional Neural Network

Authors: Paweł Konieczka, Lech Raczyński, Wojciech Wiślicki, Oleksandr Fedoruk, Konrad Klimaszewski, Przemysław Kopka, Wojciech Krzemień, Roman Shopa, Jakub Baran, Aurélien Coussat, Neha Chug, Catalina Curceanu, Eryk Czerwiński, Meysam Dadgar, Kamil Dulski, Aleksander Gajos, Beatrix C. Hiesmayr, Krzysztof Kacprzak, Łukasz Kapłon, Grzegorz Korcyl, Tomasz Kozik, Deepak Kumar, Szymon Niedźwiecki, Dominik Panek, Szymon Parzych, Elena Pérez Del Río, Sushil Sharma, Shivani Shivani, Magdalena Skurzok, Ewa Łucja Stępień, Faranak Tayefi, Paweł Moskal

Abstract:

This paper develops the transformation of non-image data into 2-dimensional matrices as a preparation stage for classification based on convolutional neural networks (CNNs). In positron emission tomography (PET) studies, a CNN may be applied directly to the reconstructed distribution of radioactive tracers injected into the patient's body, as a pattern recognition tool. Nonetheless, much PET data still exists in non-image format, which raises the question of whether it can be used for training CNNs. The main focus of this contribution is the problem of processing vectors with a small number of features in comparison to the number of pixels in the output images. The proposed methodology was applied to the classification of PET coincidence events.
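One simple way to lift a short feature vector into a 2D matrix a CNN can consume is sketched below: normalise the vector, then take its outer product with itself so that pairwise feature interactions become "pixels". This is an illustrative transformation, not necessarily the one used in the paper, and the event values are made up.

```python
import numpy as np

# Hedged sketch: map an n-feature vector to an n x n image-like matrix
# via the outer product of its min-max normalised values.
def vector_to_matrix(v):
    v = np.asarray(v, dtype=float)
    span = v.max() - v.min()
    v = (v - v.min()) / span if span > 0 else np.zeros_like(v)
    return np.outer(v, v)                 # (n,) -> (n, n)

# hypothetical features of one PET coincidence event (illustrative numbers)
event = [0.2, 1.5, 3.1, 0.9, 2.4]
img = vector_to_matrix(event)
print(img.shape)                          # (5, 5), usable as Conv2D input
```

The resulting matrix is symmetric with values in [0, 1], so a stack of such matrices can be fed to a standard 2D convolutional layer.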

Keywords: convolutional neural network, kernel principal component analysis, medical imaging, positron emission tomography

Procedia PDF Downloads 113
1128 CT Medical Images Denoising Based on New Wavelet Thresholding Compared with Curvelet and Contourlet

Authors: Amir Moslemi, Amir Movafeghi, Shahab Moradi

Abstract:

Noise is one of the most important challenges in medical imaging. Image denoising refers to the improvement of a digital medical image that has been corrupted by additive white Gaussian noise (AWGN). A digital medical image or video can be affected by different types of noise: impulse noise, Poisson noise and AWGN. Computed tomography (CT) images suffer from low quality due to noise. The quality of CT images depends directly on the dose absorbed by the patient: increasing the absorbed radiation, and consequently the absorbed dose to the patient (ADP), enhances CT image quality. Noise reduction techniques that enhance image quality without exposing patients to excess radiation are therefore among the challenging problems in CT image processing. In this work, noise reduction in CT images was performed using two directional 2-dimensional (2D) transformations, Curvelet and Contourlet, and discrete wavelet transform (DWT) thresholding methods, BayesShrink and AdaptShrink, which were compared with each other. We also propose a new threshold in the wavelet domain aimed at both noise reduction and edge retention; the proposed method retains the significant modified coefficients, resulting in good visual quality. Evaluations were carried out using two criteria: peak signal-to-noise ratio (PSNR) and structural similarity (SSIM).
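The baseline BayesShrink thresholding that the proposed method is compared against can be sketched as follows, on a 1D signal and a hand-rolled single-level Haar transform for brevity (the paper works on 2D CT images and several transforms). The formulas are the standard BayesShrink recipe; the signal is synthetic.

```python
import numpy as np

# single-level orthonormal Haar DWT (1D, even-length signal)
def haar_fwd(x):
    a = (x[0::2] + x[1::2]) / np.sqrt(2)   # approximation coefficients
    d = (x[0::2] - x[1::2]) / np.sqrt(2)   # detail coefficients
    return a, d

def haar_inv(a, d):
    x = np.empty(a.size * 2)
    x[0::2] = (a + d) / np.sqrt(2)
    x[1::2] = (a - d) / np.sqrt(2)
    return x

# Hedged sketch of BayesShrink: noise sigma from the median absolute detail
# coefficient, threshold T = sigma^2 / sigma_x, then soft thresholding.
def bayes_shrink(d):
    sigma = np.median(np.abs(d)) / 0.6745
    sigma_x = np.sqrt(max(np.mean(d**2) - sigma**2, 1e-12))
    t = sigma**2 / sigma_x
    return np.sign(d) * np.maximum(np.abs(d) - t, 0)

rng = np.random.default_rng(2)
clean = np.sin(np.linspace(0, 4 * np.pi, 256))
noisy = clean + rng.normal(0, 0.3, 256)     # AWGN, as in the paper

a, d = haar_fwd(noisy)
denoised = haar_inv(a, bayes_shrink(d))

mse = lambda x: np.mean((x - clean) ** 2)
print(mse(denoised) < mse(noisy))           # thresholding reduces the MSE
```

In the 2D case, the same shrinkage rule is applied subband by subband across decomposition levels.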

Keywords: computed tomography (CT), noise reduction, curvelet, contourlet, peak signal-to-noise ratio (PSNR), structural similarity (SSIM), absorbed dose to patient (ADP)

Procedia PDF Downloads 421
1127 Combining Laser Scanning and High Dynamic Range Photography for the Presentation of Bloodstain Pattern Evidence

Authors: Patrick Ho

Abstract:

Bloodstain Pattern Analysis (BPA) forensic evidence can be complex, requiring effective courtroom presentation to ensure clear and comprehensive understanding of the analyst’s findings. BPA witness statements can often involve reference to spatial information (such as location of rooms, objects, walls) which, when coupled with classified blood patterns, may illustrate the reconstructed movements of suspects and injured parties. However, it may be difficult to communicate this information through photography alone, despite this remaining the UK’s established method for presenting BPA evidence. Through an academic-police partnership between the University of Warwick and West Midlands Police (WMP), an integrated 3D scanning and HDR photography workflow for BPA was developed. Homicide scenes were laser scanned and, after processing, the 3D models were utilised in the BPA peer-review process. The same 3D models were made available for court but were not always utilised. This workflow has improved the ease of presentation for analysts and provided 3D scene models that assist with the investigation. However, the effects of incorporating 3D scene models in judicial processes may need to be studied before they are adopted more widely. 3D models from a simulated crime scene and West Midlands Police cases approved for conference disclosure are presented. We describe how the workflow was developed and integrated into established practices at WMP, including peer-review processes and witness statement delivery in court, and explain the impact the work has had on the Criminal Justice System in the West Midlands.

Keywords: bloodstain pattern analysis, forensic science, criminal justice, 3D scanning

Procedia PDF Downloads 69
1126 Improvement of Biomass Properties through the Torrefaction Process

Authors: Malgorzata Walkowiak, Magdalena Witczak, Wojciech Cichy

Abstract:

Biomass is an important renewable energy source in Poland. As a biofuel, it has many advantages: it is renewable on a noticeable time scale and has relatively high energy potential. However, disadvantages such as high moisture content and hygroscopic nature make harvesting, transport, storage and preparation for combustion troublesome and uneconomic. Thermal modification of biomass can improve its hydrophobic properties and increase its calorific value and natural resistance. This form of thermal processing is known as torrefaction. The aim of the study was to investigate the effect of pre-heat treatment of wood and plant lignocellulosic raw materials on the properties of solid biofuels. The preliminary studies included pine, beech and willow wood and other lignocellulosic raw materials: mustard, hemp, grass stems, tobacco stalks, sunflower husks, Miscanthus straw, rape straw, cereal straw, Virginia mallow straw and rapeseed meal. Torrefaction was carried out at variable temperatures and process times, depending on the material used. The weight loss was recorded, and the ash content and calorific value were determined. It was found that thermal treatment of the tested lignocellulosic raw materials can provide solid biofuel with improved properties. In the woody materials, the increase in the lower heating value ranged from 0.3 MJ/kg (pine and beech) to 1.1 MJ/kg (willow); in non-woody materials, from 0.5 MJ/kg (tobacco stalks, Miscanthus) to 3.5 MJ/kg (rapeseed meal). The results obtained indicate the need for further research, particularly regarding the conditions of the torrefaction process.
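Heating-value gains such as those reported above are commonly weighed against the torrefaction mass loss via the energy yield, i.e. the fraction of the raw material's energy retained in the torrefied fuel. The sketch below uses the paper's 1.1 MJ/kg gain for willow, but the base heating value and the mass yield are assumed numbers, not the paper's data.

```python
# Hedged sketch: energy yield = mass yield * (LHV_torrefied / LHV_raw)
def energy_yield(mass_yield, lhv_raw, lhv_torrefied):
    # fraction of the raw feedstock's energy retained after torrefaction
    return mass_yield * lhv_torrefied / lhv_raw

# e.g. willow with an assumed raw LHV of 18.5 MJ/kg, the study's +1.1 MJ/kg
# gain (so 19.6 MJ/kg torrefied), and an assumed 85% mass yield
print(round(energy_yield(0.85, 18.5, 19.6), 3))  # 0.901
```

A value near 0.9 would mean the densified fuel keeps about 90% of the feedstock energy while shedding mass and moisture.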

Keywords: biomass, lignocellulosic materials, solid biofuels, torrefaction

Procedia PDF Downloads 217
1125 A Mixing Matrix Estimation Algorithm for Speech Signals under the Under-Determined Blind Source Separation Model

Authors: Jing Wu, Wei Lv, Yibing Li, Yuanfan You

Abstract:

The separation of speech signals has become a research hotspot in the field of signal processing in recent years, with many applications in teleconferencing, hearing aids, machine speech recognition and so on. The sounds received are usually noisy. Identifying the sounds of interest and obtaining clear sounds in such an environment, that is, the problem of blind source separation, is worth exploring. This paper focuses on under-determined blind source separation (UBSS). Sparse component analysis is generally used for this problem. The method is divided into two parts: first, a clustering algorithm is used to estimate the mixing matrix from the observed signals; then the signals are separated based on the estimated mixing matrix. This paper studies the mixing matrix estimation problem and proposes an improved algorithm to estimate the mixing matrix for speech signals in the UBSS model. The traditional potential-function algorithm is not accurate for mixing matrix estimation, especially at low signal-to-noise ratio (SNR). In response, this paper develops an improved potential function method to estimate the mixing matrix. The algorithm not only avoids the influence of insufficient prior information in traditional clustering algorithms but also improves the estimation accuracy of the mixing matrix. The mixing of four speech signals into two channels is taken as an example. Simulation results show that the proposed approach not only improves the estimation accuracy but also applies to any mixing matrix.
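The clustering stage of UBSS mixing-matrix estimation can be sketched on toy data: with ideally sparse sources, each observed 2D sample lies along one column of the mixing matrix, so clustering sample directions recovers the columns. This sketch uses a crude angle-histogram peak search as a stand-in for the paper's potential-function (or DBSCAN) clustering; the mixing angles are made up.

```python
import numpy as np

rng = np.random.default_rng(3)
true_angles = np.array([0.3, 0.9, 1.5, 2.1])               # column directions
A = np.vstack([np.cos(true_angles), np.sin(true_angles)])  # 2 x 4 mixing matrix

# ideally sparse sources: each time sample has exactly one active source
k = rng.integers(0, 4, 2000)          # which source is active
s = rng.normal(0, 1, 2000)            # its amplitude
X = A[:, k] * s                       # 2 x 2000 observed mixtures

# normalise each observation to a direction on the upper half circle
ang = np.arctan2(X[1], X[0]) % np.pi

# crude clustering: histogram the angles, keep the 4 strongest peaks
# that are mutually separated by more than 0.1 rad
hist, edges = np.histogram(ang, bins=180, range=(0, np.pi))
centres = (edges[:-1] + edges[1:]) / 2
peaks = []
for b in np.argsort(hist)[::-1]:
    if all(abs(centres[b] - p) > 0.1 for p in peaks):
        peaks.append(centres[b])
    if len(peaks) == 4:
        break
print(sorted(peaks))                  # close to true_angles
```

The recovered directions give the estimated mixing-matrix columns (up to scale and sign), after which the second stage separates the sources.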

Keywords: DBSCAN, potential function, speech signal, the UBSS model

Procedia PDF Downloads 115
1124 Production of Cellulose Nanowhiskers from Red Algae Waste and Its Application in Polymer Composite Development

Authors: Z. Kassab, A. Aboulkas, A. Barakat, M. El Achaby

Abstract:

Red algae are enormously abundant around the world, and their exploitation for the production of agar has become an important industry in recent years. However, this industrial processing of red algae generates a large quantity of solid fibrous waste, which constitutes a source of serious environmental problems. Exploiting this solid waste would help to i) produce new value-added materials and ii) improve waste disposal. In fact, this solid waste can be fully utilized for the production of cellulose microfibers and nanocrystals because it contains a large amount of cellulose. For this purpose, the red algae waste was chemically treated via alkali, bleaching and acid hydrolysis treatments under controlled conditions in order to obtain pure cellulose microfibers and cellulose nanocrystals. The raw material and the as-extracted cellulosic materials were characterized using several analytical techniques, including elemental analysis, X-ray diffraction, thermogravimetric analysis, infrared spectroscopy and transmission electron microscopy. As an application, the as-extracted cellulose nanocrystals (CNCs) were used as nanofillers for the production of polymer-based composite films with improved thermal and tensile properties. In these composite materials, the adhesion properties and the large number of functional groups present on the CNC surface and in the macromolecular chains of the polymer matrix are exploited to improve the interfacial interactions between the two phases, improving the final properties. Consequently, these composite materials can be expected to show high performance in packaging applications.

Keywords: cellulose nanowhiskers, food packaging, polymer composites, red algae waste

Procedia PDF Downloads 206
1123 Semantic Indexing Improvement for Textual Documents: Contribution of Classification by Fuzzy Association Rules

Authors: Mohsen Maraoui

Abstract:

With the aim of improving natural language processing applications such as information retrieval, machine translation and lexical disambiguation, we focus on a statistical approach to semantic indexing of multilingual text documents based on conceptual network formalism. We propose to use this formalism as an indexing language to represent the descriptive concepts and their weighting; these concepts represent the content of the document. Our contribution is based on two steps. In the first step, we extract index terms using the multilingual lexical resource EuroWordNet (EWN). In the second step, we pass from the representation of index terms to the representation of index concepts through conceptual network formalism. This network is generated using the EWN resource and passes through a classification step based on an association rules model, in an attempt to discover the non-taxonomic (contextual) relations between the concepts of a document. These latent relations are buried in the text and carried by the semantic context of the co-occurrence of concepts in the document. The proposed indexing approach can be applied to text documents in various languages because it is based on a linguistic method adapted to each language through a multilingual thesaurus. We then apply the same statistical process regardless of the language in order to extract the significant concepts and their associated weights. We show that the proposed indexing approach provides encouraging results.
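The association-rule step that surfaces contextual relations from concept co-occurrence can be sketched as mining support and confidence for candidate rules across a document collection. The concept sets below are invented for illustration, and crisp (non-fuzzy) rule statistics stand in for the paper's fuzzy association rules.

```python
# Hedged sketch: support/confidence of a candidate relation "a -> b",
# computed from concept co-occurrence across documents.
docs = [
    {"bank", "loan", "interest"},
    {"bank", "river"},
    {"loan", "interest", "credit"},
    {"bank", "loan"},
]

def rule_stats(a, b, docs):
    n = len(docs)
    n_a = sum(a in d for d in docs)                 # docs containing a
    n_ab = sum(a in d and b in d for d in docs)     # docs containing both
    support = n_ab / n
    confidence = n_ab / n_a if n_a else 0.0
    return support, confidence

sup, conf = rule_stats("loan", "interest", docs)
print(sup, conf)   # 0.5 and 2/3: "loan -> interest" is a candidate relation
```

Rules passing minimum support and confidence thresholds would be added as non-taxonomic edges to the conceptual network.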

Keywords: concept extraction, conceptual network formalism, fuzzy association rules, multilingual thesaurus, semantic indexing

Procedia PDF Downloads 123