Search results for: microwave-assisted air drying processing
2721 A Novel Machine Learning Approach to Aid Agrammatism in Non-fluent Aphasia
Authors: Rohan Bhasin
Abstract:
Agrammatism in non-fluent aphasia can be defined as a language disorder wherein a patient can only use content words (nouns, verbs and adjectives) for communication; their speech is devoid of functional word types like conjunctions and articles, generating speech with extremely rudimentary grammar. Past approaches involve speech therapy of some order, with conversation analysis used to analyse pre-therapy speech patterns and qualitative changes in conversational behaviour after therapy. We describe a novel method to generate functional words (prepositions, articles) around content words (nouns, verbs and adjectives) using a combination of natural language processing and deep learning algorithms. The approach the paper investigates is a sequence-to-sequence (seq2seq) or LSTM model that takes in a sequence of inputs and outputs a sequence. This approach needs a significant amount of training data, with each training example consisting of a pair (content words, complete sentence). We generate such data by starting with complete sentences from a text source and removing the functional words to leave just the content words. However, this approach requires a large amount of training data to produce coherent output. The assumption of this approach is that the content words received in the input are preserved, i.e., their order will not alter after the functional grammar is slotted in. This is a potential limitation in cases of severe agrammatism, where such order might not be inherently correct. This approach can therefore assist communication in mild agrammatism in non-fluent aphasia cases. By generating these function words around the content words, we can provide meaningful sentence options to the patient for articulate conversations. Thus our project translates the task of generating sentences from content-specific words into an assistive technology for non-fluent aphasia patients.
Keywords: aphasia, expressive aphasia, assistive algorithms, neurology, machine learning, natural language processing, language disorder, behaviour disorder, sequence to sequence, LSTM
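As a minimal sketch of the training-data generation described above, the snippet below strips function words from complete sentences to produce (content words, complete sentence) pairs; the function-word list is a small illustrative subset, not the authors' lexicon.

```python
# Generate (content-word input, full-sentence target) pairs for a
# seq2seq/LSTM model by removing function words from complete sentences.

FUNCTION_WORDS = {
    "a", "an", "the",                      # articles
    "and", "or", "but", "so", "because",   # conjunctions
    "in", "on", "at", "to", "of", "with",  # prepositions
    "is", "are", "was", "were",            # auxiliaries
}

def make_training_pair(sentence: str) -> tuple[str, str]:
    """Return (content-word input, full-sentence target) for one sentence."""
    tokens = sentence.lower().split()
    content = [t.strip(".,!?") for t in tokens
               if t.strip(".,!?") not in FUNCTION_WORDS]
    return " ".join(content), sentence

corpus = [
    "The dog is in the garden.",
    "She walks to the store with her friend.",
]
for src, tgt in (make_training_pair(s) for s in corpus):
    print(f"input: {src!r}  ->  target: {tgt!r}")
# input: 'dog garden'  ->  target: 'The dog is in the garden.'
```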
Procedia PDF Downloads 162
2720 Detection of Phoneme [S] Mispronunciation for Sigmatism Diagnosis in Adults
Authors: Michal Krecichwost, Zauzanna Miodonska, Pawel Badura
Abstract:
The diagnosis of sigmatism is mostly based on the observation of articulatory organs. It is, however, not always possible to precisely observe the vocal apparatus, in particular in the oral cavity of the patient. Speech processing can help objectify the therapy and simplify the verification of its progress. In the described study, a methodology for classifying the incorrectly pronounced phoneme [s] is proposed. The recordings come from adults and were registered with a speech recorder at a sampling rate of 44.1 kHz and a resolution of 16 bit. A database of pathological and normative speech was collected for the study, including reference assessments provided by speech therapy experts. Ten adult subjects were asked to simulate a certain type of sigmatism under the supervision of a speech therapy expert. In the recordings, the analyzed phone [s] was surrounded by vowels, viz. ASA, ESE, ISI, SPA, USU, YSY. Thirteen MFCCs (mel-frequency cepstral coefficients) and RMS (root mean square) values are calculated within each frame of the analyzed phoneme. Additionally, 3 fricative formants along with corresponding amplitudes are determined for the entire segment. In order to aggregate the information within the segment, the average value of each MFCC coefficient is calculated. All features of other types are aggregated by means of their 75th percentile. The proposed method of feature aggregation reduces the size of the feature vector used in the classification. A binary SVM (support vector machine) classifier is employed at the phoneme recognition stage, separating pathological phones from normative ones. The proposed feature vector yields classification sensitivity and specificity above the 90% level for individual logatomes. The employment of fricative formant-based information improves the sole-MFCC classification results by an average of 5 percentage points. The study shows that the employment of specific parameters for the selected phones improves the efficiency of pathology detection compared to traditional methods of speech signal parameterization.
Keywords: computer-aided pronunciation evaluation, sibilants, sigmatism diagnosis, speech processing
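The feature aggregation described above can be sketched as follows, using librosa and scikit-learn as stand-ins for the study's own tooling; the fricative-formant features are omitted, the dummy segments replace the annotated recordings, and all parameters are illustrative.

```python
import numpy as np
import librosa
from sklearn.svm import SVC

def phoneme_features(y: np.ndarray, sr: int = 44100) -> np.ndarray:
    """Aggregate frame-wise features over one [s] segment."""
    mfcc = librosa.feature.mfcc(y=y, sr=sr, n_mfcc=13)   # (13, n_frames)
    rms = librosa.feature.rms(y=y)[0]                    # (n_frames,)
    return np.concatenate([mfcc.mean(axis=1),            # mean per MFCC
                           [np.percentile(rms, 75)]])    # 75th percentile

# Dummy segments and labels stand in for the annotated recordings.
rng = np.random.default_rng(0)
segments = [rng.standard_normal(8820) for _ in range(20)]   # ~0.2 s at 44.1 kHz
labels = rng.integers(0, 2, size=20)                        # 1 = pathological

X = np.stack([phoneme_features(seg) for seg in segments])
clf = SVC(kernel="rbf").fit(X, labels)   # binary SVM: pathological vs normative
```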
Procedia PDF Downloads 282
2719 From Industry 4.0 to Agriculture 4.0: A Framework to Manage Product Data in Agri-Food Supply Chain for Voluntary Traceability
Authors: Angelo Corallo, Maria Elena Latino, Marta Menegoli
Abstract:
The agri-food value chain involves various stakeholders with different roles. All of them abide by national and international rules and leverage marketing strategies to advance their products. Food products and the related processing phases carry with them a large volume of data that is often not used to inform the final customer. Some of these data, if fittingly identified and used, can benefit the single company and/or the whole supply chain, creating a match between marketing techniques and voluntary traceability strategies. Moreover, of late, buying models have changed: customers are attentive to wellbeing and food quality. Food citizenship and food democracy were born, leveraging transparency, sustainability and food information needs. The Internet of Things (IoT) and analytics, some of the innovative technologies of Industry 4.0, have a significant impact on the market and will act as a main thrust towards a genuine '4.0 change' for agriculture. However, realizing a traceability system is not simple because of the complexity of the agri-food supply chain, the large number of actors involved, different business models, environmental variations impacting products and/or processes, and extraordinary climate changes. In order to support companies involved in a traceability path, a Framework to Manage Product Data in Agri-Food Supply Chain for Voluntary Traceability was conceived, starting from business model analysis and the related business processes. Studying each process task and leveraging modeling techniques makes it possible to identify the information held by different actors along the agri-food supply chain. IoT technologies for data collection and analytics techniques for data processing supply information useful to increase intra-company efficiency and competitiveness in the market. All the information recovered can be shown through IT solutions and mobile applications, making it accessible to the company, the entire supply chain and the consumer, with a view to guaranteeing transparency and quality.
Keywords: agriculture 4.0, agri-food supply chain, industry 4.0, voluntary traceability
Procedia PDF Downloads 146
2718 Study of Rehydration Process of Dried Squash (Cucurbita pepo) at Different Temperatures and Dry Matter-Water Ratios
Authors: Sima Cheraghi Dehdezi, Nasser Hamdami
Abstract:
Air-drying is the most widely employed method for preserving fruits and vegetables. Most dried products must be rehydrated by immersion in water prior to use, so studying rehydration kinetics in order to optimize the rehydration phenomenon is of great importance. Rehydration typically comprises three simultaneous processes: the imbibition of water into the dried material, the swelling of the rehydrated product and the leaching of soluble solids into the rehydration medium. In this research, squash (Cucurbita pepo) fruits were cut into slices 0.4 cm thick and 4 cm in diameter. The squash slices were then blanched in a steam chamber for 4 min. After cooling to room temperature, they were dehydrated in a hot-air dryer, under an air flow of 1.5 m/s and an air temperature of 60°C, down to a moisture content of 0.1065 kg H2O per kg d.m. Dehydrated samples were kept in polyethylene bags and stored at 4°C. Squash slices of specified weight were rehydrated by immersion in distilled water at different temperatures (25, 50, and 75°C) and various dry matter-water ratios (1:25, 1:50, and 1:100), with agitation at 100 rpm. At specified time intervals, up to 300 min, the squash samples were removed from the water, and the weight, moisture content and rehydration indices of the samples were determined. The texture characteristics were examined over a 180 min period. The results showed that rehydration time and temperature had significant effects on moisture content, water absorption capacity (WAC), dry matter holding capacity (DHC), rehydration ability (RA), maximum force and stress in dried squash slices. The dry matter-water ratio had a significant effect (p˂0.01) on all squash slice properties except DHC. Moisture content, WAC and RA of squash slices increased, whereas DHC and texture firmness (maximum force and stress) decreased, with rehydration time. The maximum moisture content, WAC and RA and the minimum DHC, force and stress were observed in squash slices rehydrated in 75°C water. The lowest moisture content, WAC and RA and the highest DHC, force and stress were observed in squash slices immersed in water at a 1:100 dry matter-water ratio. In general, for all rehydration conditions, the highest water absorption rate occurred during the first minutes of the process, after which the rate decreased. The highest rehydration rate and amount of water absorption occurred at 75°C.
Keywords: dry matter-water ratio, squash, maximum force, rehydration ability
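For reference, the three rehydration indices named above are commonly defined as follows (after Lewicki, 1998); whether the authors used these exact formulations is an assumption, and the numbers in the example are illustrative, not the study's data.

```python
# m = sample mass (g), s = dry-matter fraction;
# subscripts: 0 = before drying, d = dried, r = rehydrated.

def rehydration_indices(m0, s0, md, sd, mr, sr):
    wac = (mr * (1 - sr) - md * (1 - sd)) / (m0 * (1 - s0) - md * (1 - sd))
    dhc = (mr * sr) / (md * sd)    # fraction of dry matter retained
    ra = wac * dhc                 # rehydration ability
    return wac, dhc, ra

# Illustrative values only (not the study's measurements):
wac, dhc, ra = rehydration_indices(m0=10.0, s0=0.06,
                                   md=0.63, sd=0.90,
                                   mr=6.5, sr=0.07)
print(f"WAC={wac:.2f}  DHC={dhc:.2f}  RA={ra:.2f}")
```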
Procedia PDF Downloads 312
2717 Technical and Practical Aspects of Sizing an Autonomous PV System
Authors: Abdelhak Bouchakour, Mustafa Brahami, Layachi Zaghba
Abstract:
The use of photovoltaic energy offers not only an inexhaustible supply of energy but also clean, non-polluting energy, which is a definite advantage. The geographical location of Algeria favours the development of this energy source, given the intensity of the radiation received and the duration of sunshine. For this reason, the objective of our work is to develop a software tool for calculating and optimizing the sizing of photovoltaic installations. Our optimization approach is based on mathematical models that describe, among other things, the operation of each part of the installation, the energy production, the storage and the consumption of energy.
Keywords: solar panel, solar radiation, inverter, optimization
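A minimal sketch of the kind of sizing calculation such a tool performs, using the textbook energy-balance method; the efficiency, autonomy and battery figures are illustrative assumptions, not values from the paper.

```python
def size_pv_system(daily_load_wh: float,
                   peak_sun_hours: float,            # site irradiation, kWh/m2/day
                   system_efficiency: float = 0.75,  # wiring, inverter, dust losses
                   autonomy_days: float = 2.0,
                   depth_of_discharge: float = 0.6,
                   battery_voltage: float = 48.0):
    """Return (PV array peak power in Wp, battery bank capacity in Ah)."""
    array_wp = daily_load_wh / (peak_sun_hours * system_efficiency)
    battery_ah = (daily_load_wh * autonomy_days
                  / (depth_of_discharge * battery_voltage))
    return array_wp, battery_ah

# e.g. a 3 kWh/day load at an Algerian site with ~5.5 peak sun hours
array_wp, battery_ah = size_pv_system(3000, 5.5)
print(f"PV array: {array_wp:.0f} Wp, battery bank: {battery_ah:.0f} Ah @ 48 V")
```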
Procedia PDF Downloads 606
2716 Creating Energy Sustainability in an Enterprise
Authors: John Lamb, Robert Epstein, Vasundhara L. Bhupathi, Sanjeev Kumar Marimekala
Abstract:
As we enter the new era of Artificial Intelligence (AI) and cloud computing, we rely heavily on the machine learning and natural language processing capabilities of AI, and on energy-efficient hardware and software devices, in almost every industry sector. In these sectors, much emphasis is placed on developing new and innovative methods for producing and conserving energy and on curbing the depletion of natural resources. The core pillars of sustainability are economic, environmental, and social, informally referred to as the 3 P's (People, Planet and Profits). The 3 P's play a vital role in creating a core sustainability model in the enterprise. Natural resources are continually being depleted, so there is a growing demand for renewable energy. With this growing demand, there is also growing concern in many industries about how to reduce carbon emissions and conserve natural resources while adopting sustainability in corporate business models and policies. In this paper, we discuss the driving forces, such as climate change, natural disasters, pandemics, disruptive technologies, corporate policies, scaled business models, and emerging social media and AI platforms, that influence the three main pillars of sustainability (3 P's). Through this paper, we offer an overall perspective on enterprise strategies, with a primary focus on the cultural shifts involved in adopting energy-efficient operational models. Overall, many industries across the globe are incorporating core sustainability principles such as reducing energy costs, reducing greenhouse gas (GHG) emissions, reducing waste and increasing recycling, adopting advanced monitoring and metering infrastructure, and reducing server footprint and compute resources (shared IT services, cloud computing, and application modernization), with the vision of a sustainable environment.
Keywords: climate change, pandemic, disruptive technology, government policies, business model, machine learning and natural language processing, AI, social media platform, cloud computing, advanced monitoring, metering infrastructure
Procedia PDF Downloads 110
2715 Automatic Identification of Pectoral Muscle
Authors: Ana L. M. Pavan, Guilherme Giacomini, Allan F. F. Alves, Marcela De Oliveira, Fernando A. B. Neto, Maria E. D. Rosa, Andre P. Trindade, Diana R. De Pina
Abstract:
Mammography is an imaging modality used worldwide to diagnose breast cancer, even in asymptomatic women. Due to their wide availability, mammograms can be used to measure breast density and to predict cancer development. Women with increased mammographic density have a four- to sixfold increase in their risk of developing breast cancer. Therefore, studies have been conducted to accurately quantify mammographic breast density. In clinical routine, radiologists perform image evaluations through the BIRADS (Breast Imaging Reporting and Data System) assessment. However, this method has inter- and intra-individual variability. An automatic, objective method to measure breast density could relieve the radiologist's workload by providing a first opinion. However, the pectoral muscle is a high-density tissue with characteristics similar to those of fibroglandular tissue, which makes it hard to automatically quantify mammographic breast density. A pre-processing step is therefore needed to segment the pectoral muscle, which may otherwise be erroneously quantified as fibroglandular tissue. The aim of this work was to develop an automatic algorithm to segment and extract the pectoral muscle in digital mammograms. The database consisted of thirty medio-lateral oblique digital mammograms from São Paulo Medical School. This study was developed with ethical approval from the authors' institutions and national review panels under protocol number 3720-2010. An algorithm was developed on the Matlab® platform for the pre-processing of images; it uses image processing tools to automatically segment and extract the pectoral muscle from mammograms. First, a thresholding technique is applied to remove non-biological information from the image. Then, the Hough transform is applied to find the limit of the pectoral muscle, followed by the active contour method, whose seed is placed at the pectoral muscle limit found by the Hough transform. An experienced radiologist also performed the pectoral muscle segmentation manually. Both methods, manual and automatic, were compared using the Jaccard index and Bland-Altman statistics. The comparison between the manual and the developed automatic method presented a Jaccard similarity coefficient greater than 90% for all analyzed images, showing the efficiency and accuracy of the proposed segmentation method. The Bland-Altman statistics compared both methods in terms of the area (mm²) of the segmented pectoral muscle and showed data within the 95% confidence interval, reinforcing the accuracy of the segmentation compared to the manual method. Thus, the method proved to be accurate and robust, segmenting rapidly and free from intra- and inter-observer variability. It is concluded that the proposed method may be used reliably to segment the pectoral muscle in digital mammography in clinical routine. The segmentation of the pectoral muscle is very important for further quantification of the fibroglandular tissue volume present in the breast.
Keywords: active contour, fibroglandular tissue, Hough transform, pectoral muscle
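A hedged outline of the pipeline described above, sketched with scikit-image; the original work was implemented in Matlab, so every call and parameter here is an illustrative stand-in rather than the authors' code.

```python
import numpy as np
from skimage.feature import canny
from skimage.filters import threshold_otsu
from skimage.segmentation import morphological_chan_vese
from skimage.transform import hough_line, hough_line_peaks

def segment_pectoral(img: np.ndarray) -> np.ndarray:
    """img: float grayscale MLO mammogram; returns a boolean muscle mask."""
    # 1. Thresholding removes non-biological background (labels, air).
    breast = img > threshold_otsu(img)
    # 2. Hough transform on edges finds the muscle's near-straight limit.
    h, angles, dists = hough_line(canny(img * breast))
    _, peak_angles, peak_dists = hough_line_peaks(h, angles, dists, num_peaks=1)
    # In the full method this line seeds the active contour; the sketch
    # simply evolves a Chan-Vese contour from the breast mask instead.
    mask = morphological_chan_vese(img, num_iter=50, init_level_set=breast)
    return mask.astype(bool)

def jaccard(a: np.ndarray, b: np.ndarray) -> float:
    """Overlap of automatic vs manual masks (>90% reported above)."""
    return np.logical_and(a, b).sum() / np.logical_or(a, b).sum()
```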
Procedia PDF Downloads 350
2714 Occupational Heat Stress Condition According to Wet Bulb Globe Temperature Index in Textile Processing Unit: A Case Study of Surat, Gujarat, India
Authors: Dharmendra Jariwala, Robin Christian
Abstract:
Thermal exposure is a common problem in every manufacturing industry where heat is used in the manufacturing process. In developing countries like India, a lack of awareness regarding proper work environmental conditions is observed among workers. Improper planning of the factory building, the arrangement of machinery, the ventilation system, etc., plays a vital role in the rise of temperature within the manufacturing areas. Due to uncontrolled thermal stress, workers may be subjected to various heat illnesses, from mild disorders to heat stroke. Heat stress is responsible for health risks and reduced production. The Wet Bulb Globe Temperature (WBGT) index and relative humidity are used to evaluate heat stress conditions. The WBGT index is a weighted average of the natural wet bulb temperature, globe temperature and dry bulb temperature, which are measured with the standard QuestTemp 36 area heat stress monitor. In this study, textile processing units were selected in an industrial estate in Surat city. Based on the manufacturing process, six locations were identified within each plant at which processing was undertaken at 120°C to 180°C. These locations were the jet dyeing machine area, the stenter machine area, the printing machine area, the looping machine area and the washing area, which generate process heat; the office area was also selected, as a sixth location, for comparison purposes. The present study was conducted in the winter and summer seasons, for day and night shifts. The results show that the average WBGT index was above the Threshold Limit Value (TLV) during the summer season for both day and night shifts in all three units, except in the office area. During the summer season, the highest WBGT index was 32.8°C during the day shift and 31.5°C during the night shift, both at the printing machine area. During the winter season, the highest WBGT indices of 30°C and 29.5°C were likewise found at the printing machine area during the day and night shifts, respectively.
Keywords: relative humidity, textile industry, thermal stress, WBGT
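The weighted average mentioned above follows the standard ISO 7243 definition of WBGT; a small helper makes the weighting explicit (the example reading is illustrative, not the study's raw data).

```python
def wbgt(t_nwb: float, t_g: float, t_db: float, solar_load: bool = False) -> float:
    """WBGT in degC from natural wet bulb, globe and dry bulb temperatures."""
    if solar_load:                       # outdoors, direct sun
        return 0.7 * t_nwb + 0.2 * t_g + 0.1 * t_db
    return 0.7 * t_nwb + 0.3 * t_g      # indoors, e.g. textile shop floors

# Illustrative reading near a stenter machine (not the study's data):
print(f"WBGT = {wbgt(t_nwb=29.0, t_g=41.0, t_db=38.0):.1f} degC")
# 0.7*29 + 0.3*41 = 32.6 degC -> above the ~30 degC TLV for moderate work
```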
Procedia PDF Downloads 171
2713 Emotional Processing Difficulties in Recovered Anorexia Nervosa Patients: State or Trait
Authors: Telma Fontao de Castro, Kylee Miller, Maria Xavier Araújo, Isabel Brandao, Sandra Torres
Abstract:
Objective: There is a dearth of research investigating the long-term emotional functioning of individuals recovered from anorexia nervosa (AN). This 15-year longitudinal study aimed to examine whether difficulties in the cognitive processing of emotions persist after long-term AN recovery and how they are linked to anxiety and depression. Method: Twenty-four females, who were tested longitudinally during their acute and recovered AN phases, and 24 healthy control (HC) women were screened for anxiety, depression, alexithymia, and emotion regulation (ER) difficulties (the latter assessed only in the recovered phase). Results: Anxiety, depression, and alexithymia levels decreased significantly with AN recovery. However, scores on anxiety and on difficulty identifying feelings (an alexithymia factor) remained high when compared to the HC group. Scores on emotion regulation difficulties were also lower in the HC group. The abovementioned differences between the AN recovered group and the HC group in difficulties identifying and accepting feelings and in lack of emotional clarity were no longer present when the effect of anxiety and depression was controlled. Conclusions: The findings suggest that emotional dysfunction tends to decrease in the recovered AN phase. However, using an HC group as a reference, we conclude that several emotional difficulties are still elevated after long-term AN recovery, in particular limited access to emotion regulation strategies and difficulty controlling impulses and engaging in goal-directed behavior, suggesting a trait vulnerability. In turn, competencies related to emotional clarity and acceptance of emotional responses seem to be state-dependent phenomena linked to anxiety and depression. In sum, managing emotions remains a challenge for individuals recovered from AN. Under this circumstance, maladaptive eating behavior can serve an affect-regulatory function, increasing the risk of relapse. Emotional education and the stabilization of depressive and anxious symptomatology after recovery emerge as important avenues for protecting against long-term AN relapse.
Keywords: alexithymia, anorexia nervosa, emotion recognition, emotion regulation
Procedia PDF Downloads 123
2712 Developing Manufacturing Process for the Graphene Sensors
Authors: Abdullah Faqihi, John Hedley
Abstract:
Biosensors play a significant role in the healthcare sector and in scientific and technological progress. Developing electrodes that are easy to manufacture and deliver better electrochemical performance is advantageous for diagnostics and biosensing. They can be implemented extensively in various analytical tasks such as drug discovery, food safety, medical diagnostics, process control, security and defence, in addition to environmental monitoring. The development of biosensors aims to create high-performance electrochemical electrodes for diagnostics and biosensing. A biosensor is a device that inspects the biological and chemical reactions generated by a biological sample, carrying out biological detection via a linked transducer that transmits the biological response as an electrical signal. Stability, selectivity, and sensitivity are the dynamic and static characteristics that dictate the quality and performance of biosensors. In this research, an experimental study of a laser scribing technique for processing graphene oxide inside a vacuum chamber is presented. The effect of laser scribing on the reduction of graphene oxide (GO) was investigated under two conditions: atmosphere and vacuum. A GO solvent was coated onto a LightScribe DVD, and the laser scribing technique was applied to reduce the GO layers and generate rGO. The micro-details of the morphological structures of rGO and GO were visualised and examined using scanning electron microscopy (SEM) and Raman spectroscopy. The first electrode was a traditional graphene-based electrode, made under normal atmospheric conditions, whereas the second was a graphene electrode fabricated under vacuum using a vacuum chamber. The purpose was to control the vacuum conditions, such as the air pressure and the temperature, during the fabrication process. The parameters assessed include the layer thickness and the processing environment. The results presented show high accuracy and repeatability, achieving low-cost production.
Keywords: laser scribing, LightScribe DVD, graphene oxide, scanning electron microscopy
Procedia PDF Downloads 118
2711 Molecular Approach for the Detection of Lactic Acid Bacteria in the Kenyan Spontaneously Fermented Milk, Mursik
Authors: John Masani Nduko, Joseph Wafula Matofari
Abstract:
Many spontaneously fermented milk products are produced in Kenya, where they are integral to the human diet and play a central role in enhancing food security and income generation via small-scale enterprises. Fermentation enhances product properties such as taste, aroma, shelf-life, safety, texture, and nutritional value. Some of these products have demonstrated therapeutic and probiotic effects, although recent reports have linked some to death, biotoxin infections, and esophageal cancer. These products are mostly processed from poor-quality raw materials under unhygienic conditions, resulting in inconsistent product quality and limited shelf-lives. Though very popular, research on their processing technologies is scarce, and none of the products has been produced under controlled conditions using starter cultures. To modernize the processing technologies for these products, our study aims to describe the microbiology and biochemistry of a representative Kenyan spontaneously fermented milk product, Mursik, using modern biotechnology (DNA sequencing), together with its chemical composition. Moreover, co-creation processes reflecting stakeholders' experiences of traditional fermented milk production technologies and utilization, ideals and senses of value, which will allow the generation of products based on common ground for rapid progress, will be discussed. The value of clean starting raw material will be emphasized, the need to define fermentation parameters highlighted, and the employment of standard equipment to attain controlled fermentation discussed. This presentation will review the available information regarding traditional fermented milk (Mursik) and highlight our current research on the application of molecular approaches (metagenomics) to the valorization of the Mursik production process through starter culture/probiotic strain isolation and identification, as well as quality and safety aspects of the product. The importance of the research and future research areas on the same subject will also be highlighted.
Keywords: lactic acid bacteria, high throughput biotechnology, spontaneous fermentation, Mursik
Procedia PDF Downloads 291
2710 Geological Structure Identification in Semilir Formation: An Correlated Geological and Geophysical (Very Low Frequency) Data for Zonation Disaster with Current Density Parameters and Geological Surface Information
Authors: E. M. Rifqi Wilda Pradana, Bagus Bayu Prabowo, Meida Riski Pujiyati, Efraim Maykhel Hagana Ginting, Virgiawan Arya Hangga Reksa
Abstract:
The VLF (Very Low Frequency) method is an electromagnetic method that uses low frequencies between 10 and 30 kHz, which results in fairly deep penetration. In this study, the VLF method was used for the zonation of disaster-prone areas by identifying geological structures in the form of faults. Data acquisition was carried out in the Trimulyo region, Jetis District, Bantul Regency, Special Region of Yogyakarta, Indonesia, along 8 measurement paths. The study uses wave transmitters from Japan and Australia to obtain tilt and ellipticity (Elipt) values, from which RAE (Rapat Arus Ekuivalen, or equivalent current density) sections can be constructed to identify areas that are easily crossed by electric current. These sections indicate the existence of geological structures in the form of faults in the study area, characterized by high RAE values. Processing of the VLF data yields tilt vs. ellipticity graphs and moving average (MA) tilt vs. MA ellipticity graphs for each path, which show a fluctuating pattern and no intersection at all. Data processing uses Matlab software. Areas with low RAE values of 0%-6% indicate a medium with low conductivity and high resistivity and can be interpreted as sandstone, claystone, and tuff lithologies, which are part of the Semilir Formation. High RAE values of 10%-16%, indicating a medium with high conductivity and low resistivity, can be interpreted as a fault zone filled with fluid. The existence of the fault zone is corroborated by the discovery of a normal fault on the surface with strike N55°W and dip 63°E at coordinates X = 433256 and Y = 9127722, so that residents' activities in the zone, such as housing and mining, can be avoided to reduce the risk of natural disasters.
Keywords: current density, faults, very low frequency, zonation
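Equivalent current-density (RAE) sections are commonly derived from VLF tilt profiles with the Karous-Hjelt linear filter; the sketch below assumes that (or a similar) filter was used here, which the abstract does not confirm, and the coefficients are the commonly quoted published values.

```python
import numpy as np

# Karous-Hjelt (1983) six-point filter coefficients (commonly quoted form).
KH_COEFFS = np.array([-0.205, 0.323, -0.446, 0.446, -0.323, 0.205])

def karous_hjelt(tilt: np.ndarray) -> np.ndarray:
    """Equivalent current density (arbitrary units) along one profile.

    Output point k combines six tilt stations H[k-2..k+3]; high values
    flag conductive zones such as fluid-filled fault segments.
    """
    h = np.asarray(tilt, dtype=float)
    return np.array([KH_COEFFS @ h[k - 2:k + 4]
                     for k in range(2, len(h) - 3)])

def moving_average(x: np.ndarray, w: int = 3) -> np.ndarray:
    """Smoothing used for the MA-tilt graphs mentioned above."""
    return np.convolve(x, np.ones(w) / w, mode="valid")
```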
Procedia PDF Downloads 172
2709 A Single-Channel BSS-Based Method for Structural Health Monitoring of Civil Infrastructure under Environmental Variations
Authors: Yanjie Zhu, André Jesus, Irwanda Laory
Abstract:
Structural Health Monitoring (SHM), involving data acquisition, data interpretation and decision-making systems, aims to continuously monitor the structural performance of civil infrastructure under various in-service circumstances. The main value and purpose of SHM is identifying damage through the data interpretation system. Research on SHM has expanded in the last decades, and a large volume of data is recorded every day owing to the dramatic development of sensor techniques and progress in signal processing. However, efficient and reliable data interpretation for damage detection under environmental variations is still a big challenge: structural damage might be masked because variations in the measured data can be the result of environmental variations. This research reports a novel method based on single-channel Blind Signal Separation (BSS), which extracts environmental effects from measured data directly, without any prior knowledge of the structure's loading and environmental conditions. Despite successful applications in audio processing and biomedical research, BSS has never been used to detect damage under varying environmental conditions. The proposed method optimizes and combines Ensemble Empirical Mode Decomposition (EEMD), Principal Component Analysis (PCA) and Independent Component Analysis (ICA) to separate the structural responses due to different loading conditions from a single-channel input signal; the ICA is applied to the dimension-reduced output of the EEMD. A numerical simulation of a truss bridge, inspired by the New Joban Line Arakawa Railway Bridge, is used to validate the method. All results demonstrate that the single-channel BSS-based method can recover temperature effects from the mixed structural response recorded by a single sensor with convincing accuracy. This will be the foundation of further research on direct damage detection under varying environments.
Keywords: damage detection, ensemble empirical mode decomposition (EEMD), environmental variations, independent component analysis (ICA), principal component analysis (PCA), structural health monitoring (SHM)
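A hedged sketch of the single-channel chain described above: EEMD expands one record into IMFs, PCA reduces their dimension, and FastICA separates the sources. Library calls follow PyEMD (PyPI package EMD-signal) and scikit-learn; the synthetic signal and all parameters are illustrative.

```python
import numpy as np
from PyEMD import EEMD
from sklearn.decomposition import PCA, FastICA

t = np.linspace(0, 10, 2000)
signal = (np.sin(2 * np.pi * 0.05 * t)        # slow "temperature" drift
          + 0.3 * np.sin(2 * np.pi * 4 * t)   # structural loading response
          + 0.05 * np.random.randn(t.size))   # measurement noise

imfs = EEMD().eemd(signal, t)                 # (n_imfs, n_samples)
reduced = PCA(n_components=2).fit_transform(imfs.T)       # dimension reduction
sources = FastICA(n_components=2, random_state=0).fit_transform(reduced)
# Each column of 'sources' is one recovered component; the smooth one is
# attributed to the environmental (temperature) effect.
```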
Procedia PDF Downloads 303
2708 The Evaluation of Antioxidant and Antimicrobial Activities of Essential Oil and Aqueous, Methanol, Ethanol, Ethyl Acetate and Acetone Extract of Hypericum scabrum
Authors: A. Heshmati, M. Y Alikhani, M. T. Godarzi, M. R. Sadeghimanesh
Abstract:
Herbal essential oils and extracts are a good source of natural antioxidant and antimicrobial compounds, and Hypericum is one of the potential sources of these compounds. In this study, the antioxidant and antimicrobial activities of the essential oil and of the aqueous, methanol, ethanol, ethyl acetate and acetone extracts of Hypericum scabrum were assessed. Flowers of Hypericum scabrum were collected from the mountains surrounding Hamadan province and, after drying in the shade, the essential oil of the plant was extracted with a Clevenger apparatus, while the water, methanol, ethanol, ethyl acetate and acetone extracts were obtained by the maceration method. Essential oil compounds were identified using GC-MS. The Folin-Ciocalteu and aluminum chloride (AlCl3) colorimetric methods were used to measure the amounts of phenolic acids and flavonoids, respectively. Antioxidant activity was evaluated using DPPH and FRAP assays. The minimum inhibitory concentration (MIC) and the minimum bactericidal/fungicidal concentration (MBC/MFC) of the essential oil and extracts were evaluated against Staphylococcus aureus, Bacillus cereus, Pseudomonas aeruginosa, Salmonella typhimurium, Aspergillus flavus and Candida albicans. The essential oil yield was 0.35%; the lowest and highest extract yields were obtained for the ethyl acetate and water extracts, respectively. The major component of the essential oil was α-pinene (46.35%). The methanol extract had the highest phenolic acid (95.65 ± 4.72 µg gallic acid equivalent/g dry plant) and flavonoid (25.39 ± 2.73 µg quercetin equivalent/g dry plant) contents. The percentage of DPPH radical inhibition correlated positively with the concentration of essential oil or extract, and the methanol and ethanol extracts showed the highest DPPH radical inhibition. The essential oil and extracts of Hypericum had antimicrobial activity against the microorganisms studied in this research. The MIC and MBC values for the essential oil were in the ranges of 25-25.6 and 25-50 μg/mL, respectively; for the extracts, these values were 1.5625-100 and 3.125-100 μg/mL, respectively. The methanol extract had the highest antimicrobial activity. The essential oil and extracts of Hypericum scabrum, especially the methanol extract, have suitable antimicrobial and antioxidant activity, and they can be used to control oxidation and inhibit the growth of pathogenic and spoilage microorganisms. In addition, they can be used as a substitute for synthetic antioxidant and antimicrobial compounds.
Keywords: antimicrobial, antioxidant, extract, hypericum
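The DPPH inhibition percentage reported above is conventionally computed from absorbances (typically measured at 517 nm); a one-line helper for reference, with illustrative values rather than the study's measurements.

```python
def dpph_inhibition(a_control: float, a_sample: float) -> float:
    """Percentage of DPPH radical scavenged relative to the control."""
    return (a_control - a_sample) / a_control * 100.0

# Illustrative absorbances, not the study's data:
print(f"{dpph_inhibition(a_control=0.920, a_sample=0.310):.1f} % inhibition")
# -> 66.3 % inhibition
```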
Procedia PDF Downloads 327
2707 Automatic Furrow Detection for Precision Agriculture
Authors: Manpreet Kaur, Cheol-Hong Min
Abstract:
The increasing advancement of robotics equipped with machine vision sensors applied to precision agriculture offers a promising solution to various problems on agricultural farms. An important issue for such machine vision systems concerns crop row and weed detection. This paper proposes an automatic furrow detection system based on real-time processing for identifying crop rows in maize fields in the presence of weeds. The vision system is designed to be installed on farming vehicles, that is, subjected to gyroscopic motion, vibration and other undesired movements. The images are captured in perspective and are affected by these undesired effects. The goal is to identify crop rows for vehicle navigation, including weed removal, where weeds are identified as plants outside the crop rows. Image quality is affected by different lighting conditions and by gaps along the crop rows due to lack of germination and faulty planting. The proposed image processing method consists of four different processes. First, image segmentation is performed based on an HSV (Hue, Saturation, Value) decision tree; the proposed algorithm uses the HSV color space to discriminate crops, weeds and soil, with the region of interest defined by filtering each of the HSV channels between maximum and minimum threshold values. Second, noise in the images is eliminated by means of a hybrid median filter. Third, mathematical morphological operations, i.e., erosion to remove smaller objects followed by dilation to gradually enlarge the boundaries of regions of foreground pixels, are applied, enhancing the image contrast. Finally, to accurately detect the position of the crop rows, the region of interest is defined by creating a binary mask, and edge detection and the Hough transform are applied to detect lines represented in polar coordinates, with furrow directions appearing as accumulations on the angle axis in the Hough space. The experimental results show that the method is effective.
Keywords: furrow detection, morphological, HSV, Hough transform
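The four-stage pipeline just described can be sketched with OpenCV as follows; all threshold and kernel values are illustrative assumptions for a green-crop/soil scene, not the paper's tuned parameters.

```python
import cv2
import numpy as np

def detect_crop_rows(bgr: np.ndarray):
    # 1. HSV segmentation: keep green vegetation, drop soil.
    hsv = cv2.cvtColor(bgr, cv2.COLOR_BGR2HSV)
    mask = cv2.inRange(hsv, (35, 60, 40), (85, 255, 255))
    # 2. Median filtering to suppress speckle noise.
    mask = cv2.medianBlur(mask, 5)
    # 3. Morphology: erosion removes small weeds, dilation restores rows.
    kernel = np.ones((3, 3), np.uint8)
    mask = cv2.dilate(cv2.erode(mask, kernel, iterations=2),
                      kernel, iterations=2)
    # 4. Edge detection + Hough transform; rows appear as accumulations
    #    on the angle axis of (rho, theta) space.
    edges = cv2.Canny(mask, 50, 150)
    lines = cv2.HoughLines(edges, rho=1, theta=np.pi / 180, threshold=120)
    return mask, lines   # lines: (rho, theta) candidates, or None
```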
Procedia PDF Downloads 229
2706 Genetic Data of Deceased People: Solving the Gordian Knot
Authors: Inigo de Miguel Beriain
Abstract:
Genetic data of deceased persons are of great interest for both biomedical research and clinical use, for several reasons. On the one hand, many of our diseases have a genetic component; on the other hand, we share genes with a good part of our biological family. Therefore, it would be possible to improve our response to these pathologies considerably if we could use these data. Unfortunately, at present, the status of data on the deceased is far from being satisfactorily resolved by EU data protection regulation. Indeed, the General Data Protection Regulation has explicitly excluded these data from the category of personal data. This decision has given rise to a fragmented legal framework on the issue, and each EU member state offers very different solutions. For instance, Denmark considers the data to be personal data of the deceased person for a set period of time, while others, such as Spain, do not consider these data as such but have introduced regulations specifically focused on this type of data and access to it by relatives. This is an extremely dysfunctional scenario from multiple angles, not least of which is scientific cooperation at the EU level. This contribution attempts to outline a solution to this dilemma through an alternative proposal. Its main hypothesis is that, in reality, health data are, in a sense, a rara avis among data in general because they refer not to one person but to several. Hence, it is possible to consider all of them data subjects (although not all of them can exercise the corresponding rights in the same way). When the person from whom the data were obtained dies, the data remain the personal data of his or her biological relatives, and the general regime provided for in the GDPR may apply to them. As these are personal data, we could return to thinking in terms of a general prohibition of data processing, with the exceptions provided for in Article 9.2 and the legal bases included in Article 6. This may be complicated in practice, given that, since we are dealing with data that refer to several data subjects, it may be complex to rely on some of these bases, such as consent. Furthermore, there are theoretical arguments that may oppose this hypothesis. In this contribution, it is shown, however, that none of these objections is of sufficient substance to delegitimize the argument presented. Therefore, the conclusion of this contribution is that we can indeed build a general framework for the processing of personal data of deceased persons in the context of the GDPR. This would constitute a considerable improvement over the current regulatory framework, although some clarifications will be necessary for its practical application.
Keywords: collective data conceptual issues, data from deceased people, genetic data protection issues, GDPR and deceased people
Procedia PDF Downloads 154
2705 GPU Based Real-Time Floating Object Detection System
Authors: Jie Yang, Jian-Min Meng
Abstract:
A GPU-based floating object detection scheme designed for floating mine detection tasks is presented in this paper. The system uses contrast and motion information to eliminate as many false positives as possible while avoiding false negatives. The GPU computation platform is deployed to allow objects to be detected in real time. The experimental results show that, with a certain configuration, the GPU-based scheme can speed up the computation by up to one thousand times compared to the CPU-based scheme.
Keywords: object detection, GPU, motion estimation, parallel processing
Procedia PDF Downloads 472
2704 A Literature Review of Precision Agriculture: Applications of Diagnostic Diseases in Corn, Potato, and Rice Based on Artificial Intelligence
Authors: Carolina Zambrana, Grover Zurita
Abstract:
The food losses that occur in deficient agricultural production are one of the major problems worldwide, putting the population's food security and the efficiency of farming investments at risk. This food security is expected to be achieved through each country's own efficient production, which affects the well-being of its population and, thus, also food sovereignty. Production losses in quantity and quality occur due to the lack of efficient detection of diseases at an early stage. It is very difficult to improve agricultural efficiency using traditional methods, since the detection of the main diseases is imprecise and takes a long time, especially when the production areas are extensive. Therefore, the main objective of this research study is to perform a systematic literature review, covering the last five years, of Precision Agriculture (PA), in order to understand the state of the art of the set of new technologies, procedures, and optimization processes involving Artificial Intelligence (AI). The study focuses on the diagnosis of diseases in corn, potatoes, and rice. The literature review was performed on the Elsevier, Scopus, and IEEE databases. In addition, this research focuses on advanced digital image processing and the development of software and hardware for PA. Convolutional neural networks receive special attention due to their outstanding diagnostic results. Moreover, the studied data will be combined with artificial intelligence algorithms for the automatic diagnosis of crop quality. Finally, precision agriculture, with technology applied to the agricultural sector, allows the land to be exploited efficiently. Such systems require sensors, drones, data acquisition cards, and global positioning systems. This research seeks to merge different areas of science (control engineering, electronics, digital image processing, and artificial intelligence) for the development, in the near future, of a low-cost image measurement system that allows the optimization of crops with AI.
Keywords: precision agriculture, convolutional neural network, deep learning, artificial intelligence
Procedia PDF Downloads 77
2703 Application of Geotube® Method for Sludge Handling in Adaro Coal Mine
Authors: Ezman Fitriansyah, Lestari Diah Restu, Wawan
Abstract:
The Adaro coal mine in South Kalimantan, Indonesia, maintains a catchment area of approximately 15,000 ha for its mine operation. As an open-pit surface coal mine with a high erosion rate, the mine water in Adaro contains high TSS that needs to be treated before being released to rivers. For the treatment process, Adaro operates 21 settling ponds equipped with a combination of physical and chemical systems to separate solids and water, ensuring the discharged water complies with regional environmental quality standards. However, the sludge created by the sedimentation process gradually reduces the settling ponds' capacity, so regular maintenance activities are required to recover and maintain it. The trucking system and direct dredging have been the most common methods for handling sludge in Adaro, but their main drawback is the excessive area required for drying pond construction. To solve this problem, Adaro implements an alternative method called Geotube®. The principle of the Geotube® method is that the sludge contained in the settling ponds is pumped into Geotube® containers, which are designed to release water and retain mud flocks. During the pumping process, flocculant chemicals are injected into the sludge to form bigger mud flocks. Due to the difference in particle size, the mud flocks settle in the container while the water flows out through the container's pores. Compared to the trucking system and direct dredging, this method provides three advantages: less space required to operate, an increase in overburden waste dump volume, and an increase in water treatment speed and quality. Based on the evaluation results, the Geotube® method needs only 1:8 of the space required by the other methods. From the geotechnical assessment conducted by Adaro, the potential loss of waste dump volume capacity prior to the implementation of the Geotube® method was 26.7%. The TSS treatment process in well-maintained ponds is 16% more effective.
Keywords: geotube, mine water, settling pond, sludge handling, wastewater treatment
Procedia PDF Downloads 198
2702 Enhanced Kinetic Solubility Profile of Epiisopiloturine Solid Solution in Hipromellose Phthalate
Authors: Amanda C. Q. M. Vieira, Cybelly M. Melo, Camila B. M. Figueirêdo, Giovanna C. R. M. Schver, Salvana P. M. Costa, Magaly A. M. de Lyra, Ping I. Lee, José L. Soares-Sobrinho, Pedro J. Rolim-Neto, Mônica F. R. Soares
Abstract:
Epiisopiloturine (EPI) is a drug candidate extracted from Pilocarpus microphyllus and isolated from the waste of pilocarpine production. EPI has demonstrated promising schistosomicidal, leishmanicidal, anti-inflammatory and antinociceptive activities, according to in vitro studies carried out since 2009. However, this molecule shows poor aqueous solubility, which represents a problem for the release of the drug candidate and its absorption by the organism. The purpose of the present study is to investigate the extent of enhancement of the kinetic solubility of a solid solution (SS) of EPI in hipromellose phthalate HP-55 (HPMCP), an enteric polymer carrier. The SS was obtained by the solvent evaporation method, using acetone/methanol (60:40) as the solvent system. Both EPI and polymer (drug loading 10%) were dissolved in this solvent until a clear solution was obtained, then dried in an oven at 60ºC for 12 hours, followed by drying in a vacuum oven for 4 h. The results show a considerable modification of the crystalline structure of the drug candidate. For instance, X-ray diffraction (XRD) shows crystalline behavior for EPI, which becomes amorphous in the SS. Polarized light microscopy, a more sensitive technique than XRD, also shows a complete absence of crystals in the SS sample. Differential scanning calorimetry (DSC) curves show no signal of the EPI melting point in the SS curve, indicating, once more, no presence of crystals in this system. Interaction between the drug candidate and the polymer was found by infrared spectroscopy, which shows a 43.3 cm-1 carbonyl band shift, indicating a moderate-to-strong interaction between them, probably one of the reasons for the SS formation. Under sink conditions (pH 6.8), the EPI SS had its dissolution performance increased 2.8-fold when compared with the isolated drug candidate: the EPI SS sample provided the release of more than 95% of the drug candidate in 15 min, whereas only 45% of EPI alone could be dissolved in 15 min and 70% in 90 min. Thus, HPMCP demonstrates good potential to enhance the kinetic solubility profile of EPI. Future studies evaluating the stability of the SS are required to confirm the benefits of this system.
Keywords: epiisopiloturine, hipromellose phthalate HP-55, pharmaceutical technology, solubility
Procedia PDF Downloads 605
2701 Achieving Flow at Work: An Experience Sampling Study to Comprehend How Cognitive Task Characteristics and Work Environments Predict Flow Experiences
Authors: Jonas De Kerf, Rein De Cooman, Sara De Gieter
Abstract:
For many decades, scholars have aimed to understand how work can become more meaningful by maximizing potential and enhancing feelings of satisfaction. One of the largest contributions to such positive psychology was made with the introduction of the concept of 'flow,' which refers to a condition in which people feel intense engagement and effortless action. Since then, valuable research on work-related flow has indicated that this state of mind is related to positive outcomes for both organizations (e.g., social, supportive climates) and workers (e.g., job satisfaction). Yet scholars still do not fully understand how such deep involvement at work is obtained, given that flow is considered a short-term, complex, and dynamic experience. Most research neglects the notion that people who experience flow ought to be optimally challenged so that intense concentration is required. Because attention is at the core of this enjoyable state of mind, this study aims to understand how elements that affect workers' cognitive functioning impact flow at work. Research on cognitive performance suggests that working on mentally demanding tasks (e.g., information processing tasks) requires workers to concentrate deeply, leading to flow experiences. Based on social facilitation theory, working on such tasks in an isolated environment eases concentration. Prior research has indicated that working at home (instead of at the office) or in a closed office (rather than in an open-plan office) impacts employees' overall functioning in terms of concentration and productivity. Consequently, we advance this knowledge and propose an interaction, combining cognitive task characteristics and work environments among part-time teleworkers. Hence, we not only aim to shed light on the relation between cognitive tasks and flow but also provide empirical evidence that workers performing such tasks achieve the highest states of flow while working either at home or in closed offices. In July 2022, an experience-sampling study will be conducted that uses a semi-random signal schedule to understand how task and environment predictors together impact part-time teleworkers' flow. More precisely, about 150 knowledge workers will fill in multiple surveys a day for two consecutive workweeks to report their flow experiences, cognitive tasks, and work environments. Preliminary results from a pilot study indicate that, at the between-person level, tasks high in information processing go along with high self-reported fluent productivity (i.e., making progress). As expected, evidence was found for higher fluency in productivity among workers performing information processing tasks at home or in a closed office, compared to those performing the same tasks at the office or in open-plan offices. This study expands the current knowledge on work-related flow by looking at task and environmental predictors that enable workers to attain such a peak state. In doing so, our findings suggest that practitioners should strive for an ideal alignment between tasks and work locations so that people work with both deep involvement and gratification.
Keywords: cognitive work, office lay-out, work location, work-related flow
Procedia PDF Downloads 100
2700 Toward Subtle Change Detection and Quantification in Magnetic Resonance Neuroimaging
Authors: Mohammad Esmaeilpour
Abstract:
One of the important open problems in the field of medical image processing is the detection and quantification of small changes. In this poster, we investigate how algebraic decomposition techniques can be used to semi-automatically detect and quantify subtle changes in magnetic resonance (MR) neuroimaging volumes. We mostly focus on the low-rank components of the matrices obtained by decomposing MR image pairs acquired over a period of time. In addition, a skilled neuroradiologist helps the algorithm distinguish between noise and small changes.
Keywords: magnetic resonance neuroimaging, subtle change detection and quantification, algebraic decomposition, basis functions
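A generic illustration of the low-rank idea sketched above: vectorized volumes acquired over time form the columns of a matrix, a truncated SVD captures the stable anatomy, and the residual concentrates subtle changes. This is a standard decomposition, not the authors' specific algorithm, and the data below are synthetic.

```python
import numpy as np

def lowrank_residual(volumes: np.ndarray, rank: int = 1):
    """volumes: (n_voxels, n_timepoints) matrix of vectorized scans."""
    u, s, vt = np.linalg.svd(volumes, full_matrices=False)
    low_rank = (u[:, :rank] * s[:rank]) @ vt[:rank]   # stable structure
    return low_rank, volumes - low_rank               # residual = changes

rng = np.random.default_rng(1)
base = rng.random(4096)                        # shared "anatomy"
scans = np.stack([base + 0.01 * rng.standard_normal(4096) for _ in range(5)],
                 axis=1)
scans[100:110, -1] += 0.3                      # small simulated change
_, residual = lowrank_residual(scans)
print("largest residuals at voxels:", np.argsort(np.abs(residual[:, -1]))[-5:])
```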
Procedia PDF Downloads 472
2699 Depolymerised Natural Polysaccharides Enhance the Production of Medicinal and Aromatic Plants and Their Active Constituents
Authors: M. Masroor Akhtar Khan, Moin Uddin, Lalit Varshney
Abstract:
Recently, there has been rapidly expanding interest in finding applications of natural polymers with a view to adding value to agriculture. It is now being realized that radiation processing of natural polysaccharides can be beneficially utilized either to improve the existing methodologies used for processing natural polymers or to add value to agriculture by converting them into a more useful form. Gamma-ray irradiation is employed to degrade some natural polysaccharides, such as alginates, chitosan and carrageenan, lowering their molecular weight and yielding small-sized oligomers. When these oligomers are applied to plants as foliar sprays, they elicit various kinds of biological and physiological activities, including promotion of plant growth, seed germination, shoot elongation, root growth and flower production, and suppression of heavy metal stress. Furthermore, application of these oligomers can shorten the harvesting period of various crops and help reduce the use of insecticides and chemical fertilizers. In recent years, oligomers of sodium alginate obtained by irradiating the polymer with gamma rays at a 520 kGy dose have been employed. It was noticed that the oligomers derived from the natural polysaccharides could enhance growth, photosynthetic efficiency, enzyme activities and, most importantly, the production of secondary metabolites in plants such as Artemisia annua, Beta vulgaris, Catharanthus roseus, Chrysopogon zizanioides, Cymbopogon flexuosus, Eucalyptus citriodora, Foeniculum vulgare, Geranium sp., Mentha arvensis, Mentha citrata, Mentha piperita, Mentha virdis, Papaver somniferum and Trigonella foenum-graecum. As a result of the application of these oligomers, the yields and/or contents of the active constituents of the aforesaid plants were significantly enhanced. The productivity as well as the quality of medicinal and aromatic plants may be improved by this novel technique in an economical way, as only a very small quantity of these irradiated (depolymerised) polysaccharides is needed. Further, this is a very safe technique, as the plants are not exposed directly to radiation; the radiation is used only to depolymerize the polysaccharides into oligomers.
Keywords: essential oil, medicinal and aromatic plants, plant production, radiation processed polysaccharides, active constituents
Procedia PDF Downloads 442
2698 Fabric Softener Deposition on Cellulose Nanocrystals and Cotton Fibers
Authors: Evdokia K. Oikonomou, Nikolay Christov, Galder Cristobal, Graziana Messina, Giovani Marletta, Laurent Heux, Jean-Francois Berret
Abstract:
Fabric softeners are aqueous formulations that contain ~10 wt.% double-tailed cationic surfactants. Here, a formulation in which 50% of the surfactant was replaced with low quantities of natural guar polymers was developed. Thanks to the reduced surfactant quantity, this product has less environmental impact, while the presence of the guars was found to maintain the product's performance. The objective of this work is to elucidate the effect of the guar polymers on softener deposition and the adsorption mechanism on the cotton surface. The surfactants in these formulations are assembled into vesicles with a broad size distribution (0.1-1 µm) that are stable in the presence of guars and upon dilution. The effect of guars on vesicle adsorption on cotton was first estimated by using cellulose nanocrystals (CNC) as a stand-in for cotton. Dispersing CNC in water permits following the interaction between the vesicles, the guars and the CNC in the bulk. It was found that guars enhance deposition on CNC and that the vesicles are deposited intact on the fibers, driven by electrostatics. The mechanism of vesicle/guar adsorption on cellulose fibers was identified by quartz crystal microbalance with dissipation monitoring. It was found that the guars increase the deposited surfactant quantity, in agreement with the results in the bulk. The guars' presence also influenced the structure of the adsorbed surfactant on the fiber surfaces (vesicle or bilayer). Deposition studies on cotton fabrics were also conducted; attenuated total reflection spectroscopy and scanning electron microscopy were used to study the effect of the polymers on this deposition. Finally, fluorescence microscopy was used to follow the adsorption of surfactant vesicles, labeled with a fluorescent dye, on cotton fabrics in water. It was found that, with or without polymers, the surfactant vesicles adsorb on the fiber while maintaining their vesicular structure in water (a supported vesicular bilayer structure); the guars influence this process. Upon drying, however, the vesicles are transformed into bilayers and eventually wrap the fibers (a supported lipid bilayer structure). This mechanism is proposed for the adsorption of vesicular conditioner on cotton fiber and can be affected by the presence of polymers.
Keywords: cellulose nanocrystals, cotton fibers, fabric softeners, guar polymers, surfactant vesicles
Procedia PDF Downloads 178
2697 Study of the Design and Simulation Work for an Artificial Heart
Authors: Mohammed Eltayeb Salih Elamin
Abstract:
This study discusses the concept of the artificial heart using engineering concepts from fluid mechanics and the characteristics of non-Newtonian fluids, with the purpose of serving heart patients and improving aspects of their lives. According to World Health Organization (WHO) statistics, diseases of the heart and blood vessels are the leading cause of death in the world, accounting for about 30% of deaths worldwide, so heart failure can simply be considered the number one cause of death. Since heart transplantation is very difficult and not always available, the idea of the artificial heart becomes essential, and it is important to participate in developing this idea by searching for and finding the weak points in earlier designs, in the hope of improving them for the benefit of humanity. In this study, a pump was designed to pump blood through the human body, taking into account all the factors that would allow it to replace the human heart, so that it operates with the same characteristics and efficiency as the human heart. The pump was designed on the principle of the diaphragm pump. Three models of blood were obtained from real blood characteristics, and all of these models were simulated in order to study the effect of the pumping work on the fluid. The properties of this pump were then studied using Ansys 15 software to simulate blood flow inside the pump and the amount of stress it undergoes. The 3D geometry modeling was done using SolidWorks, and the geometries were then imported into the Ansys design modeler, which is used during the pre-processing procedure. The solver used throughout the study is Ansys FLUENT, a tool for analyzing fluid flow problems; the general term for this branch of science is Computational Fluid Dynamics (CFD). The design modeler is used during the pre-processing procedure, a crucial step before solving the fluid flow problem. Key operations include geometry creation, which specifies the domain of the fluid flow problem; mesh generation, i.e., discretization of the domain so that the governing equations can be solved at each cell; and specification of the boundary zones where boundary conditions are applied. Finally, the pre-processed work is saved in the Ansys Workbench for future continuation of the work.
Keywords: artificial heart, computational fluid dynamics, heart chamber, design, pump
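The non-Newtonian character of blood mentioned above is often modelled in CFD solvers with the Carreau law; the sketch below uses the commonly cited Cho-Kensey parameter set for blood, which is an assumption rather than a value taken from this study.

```python
import numpy as np

def carreau_viscosity(shear_rate: np.ndarray,
                      mu_inf: float = 0.00345,  # Pa.s, infinite-shear viscosity
                      mu_0: float = 0.056,      # Pa.s, zero-shear viscosity
                      lam: float = 3.313,       # s, relaxation time
                      n: float = 0.3568) -> np.ndarray:
    """Carreau viscosity of blood as a function of shear rate (1/s)."""
    return mu_inf + (mu_0 - mu_inf) * (1 + (lam * shear_rate) ** 2) ** ((n - 1) / 2)

gamma = np.logspace(-2, 3, 6)                   # shear rates, 1/s
for g, mu in zip(gamma, carreau_viscosity(gamma)):
    print(f"shear {g:8.2f} 1/s -> viscosity {mu:.5f} Pa.s")
# Viscosity falls from ~0.056 Pa.s at low shear toward ~0.0035 Pa.s at high
# shear, the shear-thinning behaviour a Newtonian model would miss.
```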
Procedia PDF Downloads 458
2696 Problems of Boolean Reasoning Based Biclustering Parallelization
Authors: Marcin Michalak
Abstract:
Biclustering is a method of two-dimensional data analysis. For several years it has been possible to express this problem in terms of Boolean reasoning, for processing continuous, discrete, and binary data. The mathematical background of this approach (its proven ability to induce exact and inclusion-maximal biclusters fulfilling assumed criteria) is a strong advantage of the method. Unfortunately, the core of the method has quite high computational complexity. In the paper, the basics of the Boolean reasoning approach to biclustering are presented, and in this context the problems of parallelizing the computation are raised.
Keywords: Boolean reasoning, biclustering, parallelization, prime implicant
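As a point of reference for the complexity the abstract mentions, even a brute-force search for inclusion-maximal all-ones biclusters in a binary matrix grows exponentially with the number of columns. The sketch below is that naive baseline, written for illustration only; it is not the paper's prime-implicant algorithm.

```python
# Naive sketch: enumerate inclusion-maximal all-ones biclusters of a binary
# matrix by brute force over column subsets. Exponential in the number of
# columns -- this illustrates why the exact method invites parallelization.
# (Illustrative baseline only; not the paper's Boolean-reasoning algorithm.)
from itertools import combinations

def maximal_biclusters(matrix):
    n_cols = len(matrix[0])
    found = []
    for k in range(n_cols, 0, -1):                 # larger column sets first
        for cols in combinations(range(n_cols), k):
            rows = tuple(i for i, row in enumerate(matrix)
                         if all(row[c] == 1 for c in cols))
            if not rows:
                continue
            # keep only biclusters not contained in an already-found one
            if not any(set(rows) <= set(r) and set(cols) <= set(c)
                       for r, c in found):
                found.append((rows, cols))
    return found

M = [[1, 1, 0],
     [1, 1, 1],
     [0, 1, 1]]
for rows, cols in maximal_biclusters(M):
    print("rows", rows, "cols", cols)
```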
Procedia PDF Downloads 122
2695 Studies on the Effect of Dehydration Techniques, Treatments, Packaging Material and Methods on the Quality of Buffalo Meat during Ambient Temperature Storage
Authors: Tariq Ahmad Safapuri, Saghir Ahmad, Farhana Allai
Abstract:
The present study was conducted to evaluate the effects of dehydration techniques (polyhouse and tray drying), different treatments (SHMP, SHMP + salt, salt + turmeric), different packaging materials (HDPE, combination film), and different packaging methods (air, vacuum, CO2 flush) on the quality of dehydrated buffalo meat during ambient temperature storage. The quality parameters included physico-chemical characteristics, i.e., pH, rehydration ratio, and moisture content, and microbiological characteristics, viz. total plate count (TPC). It was found that the treatments (SHMP, SHMP + salt, salt + turmeric) increased the pH. The moisture content of the dehydrated meat samples ranged between 5.54% and 7.20%. The rehydration ratio was highest for the salt + turmeric treated sample and lowest for the control sample. The bacterial count (log TPC/g) of the salt + turmeric treated, tray-dried sample was the lowest, i.e., 1.80. During ambient temperature storage, there was no considerable change in the pH of the dehydrated samples up to 150 days; however, the moisture content increased to different degrees in the different packaging systems. The highest moisture rise was found for the control sample packed in HDPE under air, while the lowest increase was reported for the SHMP + salt treated sample vacuum-packed in combination film. The rehydration ratio was considerably affected for the polyhouse-dried samples packed in HDPE under air after 150 days of ambient storage, whereas there was very little change in the rehydration ratio of samples packed in combination film under CO2 flush. The TPC remained below the safe limit even after 150 days of storage, and the microbial count was lowest for the salt + turmeric treated samples after 150 days.
Keywords: ambient temperature, dehydration technique, rehydration ratio, SHMP (sodium hexametaphosphate), HDPE (high-density polyethylene)
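The two physico-chemical indices used above are simple weight ratios; the minimal sketch below shows how they are commonly computed. The formulas are the standard definitions (wet-basis moisture content and rehydrated-to-dehydrated weight ratio), and the sample weights are hypothetical, not data from this study.

```python
# Minimal sketch: the standard weight-based quality indices named above.
# Sample weights in grams are hypothetical, not data from the study.

def moisture_content_pct(wet_weight: float, dry_weight: float) -> float:
    """Wet-basis moisture content (%) = water lost / initial weight * 100."""
    return (wet_weight - dry_weight) / wet_weight * 100.0

def rehydration_ratio(rehydrated_weight: float, dehydrated_weight: float) -> float:
    """Weight of the sample after rehydration over its dehydrated weight."""
    return rehydrated_weight / dehydrated_weight

print(f"moisture content: {moisture_content_pct(10.0, 9.35):.2f} %")
print(f"rehydration ratio: {rehydration_ratio(7.8, 3.0):.2f}")
```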
Procedia PDF Downloads 415
2694 Ischemic Stroke Detection in Computed Tomography Examinations
Authors: Allan F. F. Alves, Fernando A. Bacchim Neto, Guilherme Giacomini, Marcela de Oliveira, Ana L. M. Pavan, Maria E. D. Rosa, Diana R. Pina
Abstract:
Stroke is a worldwide concern; in Brazil alone it accounts for 10% of all registered deaths. There are two stroke types, ischemic (87%) and hemorrhagic (13%). Early diagnosis is essential to avoid irreversible cerebral damage. Non-enhanced computed tomography (NECT) is one of the main diagnostic techniques used, due to its wide availability and rapid diagnosis. Detection depends on the size and severity of the lesions and on the time elapsed between the first symptoms and the examination. The Alberta Stroke Program Early CT Score (ASPECTS) is a subjective method that increases the detection rate. The aim of this work was to implement an image segmentation system to enhance ischemic stroke lesions and to quantify the area of ischemic and hemorrhagic stroke lesions in CT scans. We evaluated 10 patients with NECT examinations diagnosed with ischemic stroke. Analyses were performed in two axial slices, one at the level of the thalamus and basal ganglia and one adjacent to the top edge of the ganglionic structures, with a window width between 80 and 100 Hounsfield units. We used different image processing techniques, such as morphological filters, the discrete wavelet transform, and Fuzzy C-means clustering. Subjective analyses were performed by a neuroradiologist according to the ASPECTS scale to quantify ischemic areas in the middle cerebral artery territory, and these results were compared with the objective analyses performed by the computational algorithm. Preliminary results indicate that the morphological filters indeed enhance the ischemic areas for subjective evaluation. The comparison between the ischemic area contoured by the neuroradiologist and the area defined by the computational algorithm showed no deviation greater than 12% in any of the 10 examinations, although the areas contoured by the neuroradiologist tended to be smaller than those obtained by the algorithm. These results show the importance of computer-aided diagnosis software for assisting neuroradiology decisions, especially in critical situations such as the choice of treatment for ischemic stroke.
Keywords: ischemic stroke, image processing, CT scans, Fuzzy C-means
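As an illustration of the first pre-processing steps described above, the sketch below applies an intensity window to a CT slice and a morphological opening to suppress small bright structures. It assumes a slice already converted to Hounsfield units in a NumPy array; the window bounds, filter size, and synthetic input are illustrative choices, not the study's parameters or data.

```python
# Minimal sketch: HU windowing plus a morphological opening on a CT slice,
# two of the pre-processing steps named in the abstract. Assumes `slice_hu`
# is a 2D NumPy array in Hounsfield units; the window and the
# structuring-element size are illustrative, not the study's parameters.
import numpy as np
from scipy import ndimage

def window_ct(slice_hu: np.ndarray, center: float = 40.0, width: float = 80.0) -> np.ndarray:
    """Clip HU values to [center - width/2, center + width/2] and rescale to [0, 1]."""
    lo, hi = center - width / 2.0, center + width / 2.0
    return (np.clip(slice_hu, lo, hi) - lo) / (hi - lo)

# Synthetic stand-in for a NECT slice (random HU values for demonstration):
rng = np.random.default_rng(0)
slice_hu = rng.normal(loc=30.0, scale=25.0, size=(512, 512))

windowed = window_ct(slice_hu, center=40.0, width=80.0)
opened = ndimage.grey_opening(windowed, size=(3, 3))  # morphological filter
print(opened.shape, float(opened.min()), float(opened.max()))
```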
Procedia PDF Downloads 366
2693 Processing Design of Miniature Casting Incorporating Stereolithography Technologies
Authors: Pei-Hsing Huang, Wei-Ju Huang
Abstract:
Investment casting is commonly used in the production of metallic components with complex shapes, due to its high dimensional precision, good surface finish, and low cost. However, the process is cumbersome, and the period between trial casting and final production can be very long, thereby limiting business opportunities and competitiveness. In this study, we replaced conventional wax injection with stereolithography (SLA) 3D printing to speed up the trial process, and we used silicone molds to avoid the high cost imposed by photosensitive resin, further reducing expenses.
Keywords: investment casting, stereolithography, wax molding, 3D printing
Procedia PDF Downloads 402
2692 Raising the Property Provisions of the Topographic Located near the Locality of Gircov, Romania
Authors: Carmen Georgeta Dumitrache
Abstract:
Terrestrial measurement science studies the totality of field operations and computations carried out to represent the land surface on a plan or map in a specific cartographic projection and at a topographic scale. With the development of society, these measurements have evolved, driven both by the utilitarian goals of economic activity and by the scientific purpose of determining the form and dimensions of the Earth. Field measurement, data processing, and the proper representation of planimetry and landforms on drawings and maps rely on topographic and geodetic instruments, computation, and graphical reporting, which require theoretical and practical knowledge from different areas of science and technology. Using topographic and geodetic instruments designed to measure angles and distances precisely requires knowledge of geometric optics, precision mechanics, strength of materials, and more, while processing the results of field measurements calls for calculation methods based on geometry, trigonometry, algebra, mathematical analysis, and computer science. To illustrate these topographic measurements, a survey was carried out for a property located near the locality of Gircov, Romania. We determined the total surface of the plan sheet (T30) and of the parcel/plot, and also traced the coordinates of a parcel in the field. The purposes of the planimetric survey were: the exact determination of the bounding surface; the analytical calculation of the surface; the comparison of the determined surface with the one registered in the property documents; the drawing up of a location and delineation plan with adjacency and contour distances, highlighting the parcels comprising the property; the drawing up of a location and delineation plan with adjacency and contour distances for a parcel from Dave; and the tracing in the field of the plot outline points from the previous step. The ultimate goal of this work was to determine and represent the surface, and also to detach a plot from the total surface while respecting the first surface condition imposed by the beneficiary's property deed.
Keywords: topography, surface, coordinate, modeling
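The analytical surface calculation mentioned above is conventionally done with the Gauss (shoelace) formula applied to the parcel's corner coordinates. The sketch below is a minimal illustration of that formula; the coordinates are hypothetical, not those of the Gircov property.

```python
# Minimal sketch: analytical parcel area from corner coordinates via the
# Gauss (shoelace) formula, as used in planimetric surveys. The corner
# coordinates are hypothetical, not the Gircov survey data.

def shoelace_area(points: list[tuple[float, float]]) -> float:
    """Area of a simple polygon from its vertices in traversal order."""
    n = len(points)
    s = 0.0
    for i in range(n):
        x1, y1 = points[i]
        x2, y2 = points[(i + 1) % n]
        s += x1 * y2 - x2 * y1
    return abs(s) / 2.0

# Hypothetical parcel corners in metres (local coordinate system):
parcel = [(0.0, 0.0), (42.0, 3.0), (45.0, 31.0), (2.0, 28.0)]
print(f"parcel area: {shoelace_area(parcel):.1f} m^2")
```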
Procedia PDF Downloads 255