Search results for: Neural Processing Element (NPE)
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 7894


1414 Biochemical Characteristics and Microstructure of Ice Cream Prepared from Fresh Cream

Authors: S. Baississe, S. Godbane, A. Lekbir

Abstract:

The objective of this work was to develop an ice cream from fermented cream, skim milk, and other ingredients, and to follow the evolution of the physicochemical and biochemical properties and the microstructure of the products obtained. The cream is aerated; manufacturing starts with homogenization of the different ingredients, followed by heating to 40°C to form an emulsion. The preparation is then subjected to a heat treatment at 65°C for 30 min before being stored at 4°C for a few hours. This cold storage promotes crystallization of the fat globules during the maturation stage of the cream. The emulsifying agent is gradually adsorbed onto the surface of the homogenized fat globules, which reduces protein stability. During whipping, collisions between destabilized fat globules in the aqueous phase favour their coalescence. The stabilizing agent increases the viscosity of the aqueous phase and limits drainage through interactions with the proteins of the aqueous phase and the proteins adsorbed on the fat globules. The organoleptic properties of the cream were improved by the use of three dyes and aromas. The products obtained underwent physicochemical analyses (pH, conductivity, and acidity), biochemical analyses (moisture, % dry matter, and % fat), and finally microscopic observation of the microstructure, with the resulting images analyzed by image processing software. The results show a remarkable evolution of the physicochemical properties (pH, conductivity, and acidity), biochemical properties (moisture, fat, and non-fat solids), and microstructure of the products developed relative to the raw material (skim milk) and the intermediate product (fermented cream).

Keywords: ice cream, sour cream, physicochemical, biochemical, microstructure

Procedia PDF Downloads 192
1413 Designing Offshore Pipelines Facing the Geohazard of Active Seismic Faults

Authors: Maria Trimintziou, Michael Sakellariou, Prodromos Psarropoulos

Abstract:

Nowadays, the exploitation of hydrocarbon reserves in deep seas and oceans, in combination with the need to transport hydrocarbons among countries, has made the design, construction, and operation of offshore pipelines very significant, and many more offshore pipelines are expected to be constructed in the near future. Since offshore pipelines usually cross extended areas, they may face a variety of geohazards that impose substantial permanent ground deformations (PGDs) on the pipeline and potentially threaten its integrity. When a geohazard area is encountered, there are three options. The first is to avoid the problematic area through rerouting, which is usually regarded as an unfavorable solution due to its high cost. The second is to apply (if possible) mitigation/protection measures in order to eliminate the geohazard itself. The last, appealing, option is to allow the pipeline to cross the geohazard area, provided that the pipeline has been verified against the expected PGDs. In areas of moderate or high seismicity, the design of an offshore pipeline is more demanding due to earthquake-related geohazards, such as landslides, soil liquefaction phenomena, and active faults. It is worth mentioning that, although there is great worldwide experience in offshore geotechnics and pipeline design, experience in the seismic design of offshore pipelines is rather limited, since most pipelines have been constructed in non-seismic regions (e.g. the North Sea, Western Australia, the Gulf of Mexico). The current study focuses on the seismic design of offshore pipelines against active faults.
After an extensive literature review of the provisions of the seismic norms worldwide and of the available analytical methods, the study simulates numerically (through finite-element modeling and strain-based criteria) the distress of offshore pipelines subjected to PGDs induced by active seismic faults at the seabed. Factors, such as the geometrical properties of the fault, the mechanical properties of the ruptured soil formations, and the pipeline characteristics, are examined. After some interesting conclusions regarding the seismic vulnerability of offshore pipelines, potential cost-effective mitigation measures are proposed taking into account constructability issues.

Keywords: offshore pipelines, seismic design, active faults, permanent ground deformations (PGDs)

Procedia PDF Downloads 567
1412 Heuristic Spatial-Spectral Hyperspectral Image Segmentation Using Bands Quartile Box Plot Profiles

Authors: Mohamed A. Almoghalis, Osman M. Hegazy, Ibrahim F. Imam, Ali H. Elbastawessy

Abstract:

This paper presents a new hyperspectral image segmentation scheme that takes both spatial and spectral context into account. The scheme uses the 8-pixel spatial pattern to build a weight structure that holds the number of outlier bands for each pixel among its neighborhood windows in different directions. The number of outlier bands for a pixel is obtained from the bands' quartile box-plot profiles over the spatial 8-pixel pattern windows. This quartile box-plot weight structure represents the spatial-spectral context in the image. Instead of starting the segmentation process from single pixels, the proposed methodology starts from groups of pixels shown to share the same spectral features with respect to their spatial context. As a result, the segmentation scheme starts with jigsaw pieces that build up a mosaic image. The next step builds a model for each jigsaw piece in the mosaic image; each jigsaw piece is then merged with another using KNN applied to their bands' quartile box-plot profiles, and the scheme iterates until the required number of segments is reached. Experiments used two data sets obtained from the Earth Observing-1 (EO-1) sensor for Egypt and France. Qualitative analysis of the initial results showed encouraging agreement with ground truth; quantitative analysis of the results will be included in the final paper.
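The per-pixel outlier-band count at the core of the scheme can be sketched as follows. This is an illustrative reconstruction, not the authors' exact implementation: the 1.5×IQR box-plot fences, the border handling, and the toy data cube are all assumptions.

```python
import numpy as np

def outlier_band_count(cube, r, c):
    """Count bands in which the centre pixel is a box-plot outlier
    relative to its 8-pixel spatial neighbourhood (illustrative sketch)."""
    rows, cols, bands = cube.shape
    # gather the 3x3 spatial window around (r, c), clipped at the borders
    window = cube[max(r - 1, 0):r + 2, max(c - 1, 0):c + 2, :].reshape(-1, bands)
    q1 = np.percentile(window, 25, axis=0)
    q3 = np.percentile(window, 75, axis=0)
    iqr = q3 - q1
    lo, hi = q1 - 1.5 * iqr, q3 + 1.5 * iqr   # standard box-plot fences
    centre = cube[r, c, :]
    return int(np.sum((centre < lo) | (centre > hi)))

# toy 5x5 hyperspectral cube with 10 bands; the centre pixel is made
# anomalous in half of its bands
rng = np.random.default_rng(0)
cube = rng.normal(0.5, 0.01, size=(5, 5, 10))
cube[2, 2, :5] = 5.0
print(outlier_band_count(cube, 2, 2))
```

Pixels whose counts agree with their neighbours' would then seed the initial "jigsaw" groups.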

Keywords: hyperspectral image segmentation, image processing, remote sensing, box plot

Procedia PDF Downloads 588
1411 Automatic Differential Diagnosis of Melanocytic Skin Tumours Using Ultrasound and Spectrophotometric Data

Authors: Kristina Sakalauskiene, Renaldas Raisutis, Gintare Linkeviciute, Skaidra Valiukeviciene

Abstract:

Cutaneous melanoma is a melanocytic skin tumour with a very poor prognosis: it is highly resistant to treatment and tends to metastasize. Melanoma thickness is one of the most important biomarkers for disease stage, prognosis, and surgery planning. In this study, we hypothesized that automatic analysis of spectrophotometric images and high-frequency 2D ultrasound data can improve the differential diagnosis of cutaneous melanoma and provide additional information about tumour penetration depth. This paper presents a novel integrated automatic system for the non-invasive differential diagnosis of melanocytic skin tumours and the evaluation of penetration depth. The system comprises region-of-interest segmentation in the spectrophotometric images and the high-frequency ultrasound data, quantitative parameter evaluation, informative feature extraction, and classification with a linear regression classifier. Segmentation of the melanocytic skin tumour region in the ultrasound image is based on calculation of the parametric integrated backscattering coefficient; segmentation of the optical image is based on Otsu thresholding. In total, 29 quantitative tissue characterization parameters were evaluated from the ultrasound data (11 acoustical, 4 shape, and 15 textural parameters), together with 55 quantitative features of the dermatoscopic and spectrophotometric images (using the total melanin, dermal melanin, blood, and collagen SIAgraphs acquired with the SIAscope spectrophotometric imaging device). In total, 102 melanocytic skin lesions (including 43 cutaneous melanomas) were examined using the SIAscope and an ultrasound system with a 22 MHz centre-frequency single-element transducer. The diagnosis and Breslow thickness (pT) of each tumour were established during routine histological examination after excision and used as a reference.
The results of this study show that automatic analysis of spectrophotometric and high-frequency ultrasound data can improve the non-invasive classification accuracy of early-stage cutaneous melanoma and provide supplementary information about tumour penetration depth.
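Otsu thresholding, named above for segmenting the optical image, selects the grey level that maximises the between-class variance of the histogram. A minimal NumPy sketch on a synthetic 8-bit image (the test image and the lesion/background intensities are illustrative assumptions, not the study's data):

```python
import numpy as np

def otsu_threshold(img):
    """Return the grey level maximising between-class variance (Otsu's
    method) for an 8-bit image."""
    hist = np.bincount(img.ravel(), minlength=256).astype(float)
    prob = hist / hist.sum()
    best_t, best_var = 0, 0.0
    for t in range(1, 256):
        w0, w1 = prob[:t].sum(), prob[t:].sum()     # class weights
        if w0 == 0 or w1 == 0:
            continue
        mu0 = (np.arange(t) * prob[:t]).sum() / w0      # class means
        mu1 = (np.arange(t, 256) * prob[t:]).sum() / w1
        var_between = w0 * w1 * (mu0 - mu1) ** 2
        if var_between > best_var:
            best_var, best_t = var_between, t
    return best_t

# synthetic bimodal image: a dark "lesion" on a bright background
rng = np.random.default_rng(1)
img = rng.normal(200, 10, size=(64, 64))
img[20:40, 20:40] = rng.normal(60, 10, size=(20, 20))
img = np.clip(img, 0, 255).astype(np.uint8)
t = otsu_threshold(img)
mask = img < t   # lesion mask
```

The method is parameter-free, which makes it a natural choice for automated pipelines like the one described.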

Keywords: cutaneous melanoma, differential diagnosis, high-frequency ultrasound, melanocytic skin tumours, spectrophotometric imaging

Procedia PDF Downloads 257
1410 Microscale Observations of a Gas Cell Wall Rupture in Bread Dough during Baking and Confrontation to 2D/3D Finite Element Simulations of Stress Concentration

Authors: Kossigan Bernard Dedey, David Grenier, Tiphaine Lucas

Abstract:

Bread dough is often described as a dispersion of gas cells in a continuous gluten/starch matrix. The final bread crumb structure is strongly related to the rupture of gas cell walls (GCWs) during baking. At the end of proofing and during baking, the thinnest GCWs between expanding gas cells are reduced to a gluten film of about the size of a starch granule. When this size is reached, gluten and starch granules must be considered as interacting phases in order to account for heterogeneities and appropriately describe GCW rupture. Among the experimental investigations of GCW rupture, no work has observed GCW rupture under baking conditions at the GCW scale. In addition, attempts to understand GCW rupture numerically are usually not performed at the GCW scale and often treat GCWs as continuous. The most relevant paper accounting for heterogeneities dealt with gluten/starch interactions and their impact on the mechanical behavior of dough films; however, stress concentration in the GCW was not discussed. In this study, both experimental and numerical approaches were used to better understand GCW rupture in bread dough during baking. Experimentally, a macroscope placed in front of a two-chamber device was used to observe the rupture of a real GCW 200 micrometers in thickness. Special attention was paid to mimicking baking conditions as closely as possible (temperature, gas pressure, and moisture). Various pressure differences between the two sides of the GCW were applied, and different modes of fracture initiation and propagation in GCWs were observed. Numerically, the impact of gluten/starch interactions (cohesion or non-cohesion) and of the ratio of rheological moduli on the mechanical behavior of a GCW under unidirectional extension was assessed in 2D and 3D. A non-linear viscoelastic and hyperelastic approach was used to capture the finite strains involved in the GCW during baking, and stress concentrations within the GCW were identified.
The simulated stress concentrations are discussed in the light of the GCW failures observed in the device. The gluten/starch granule interactions and the rheological modulus ratio were found to have a great effect on the level of stress reached in the GCW.

Keywords: dough, experimental, numerical, rupture

Procedia PDF Downloads 109
1409 Customer Satisfaction with Artificial Intelligence-Based Service in Catering Industry: Empirical Study on Smart Kiosks

Authors: Mai Anh Tuan, Wenlong Liu, Meng Li

Abstract:

Despite warnings and concerns about the health effects of fast food, the fast-food industry remains a major source of profit for the global food industry. Faced with such large economic benefits, investors do not hesitate to continuously refine recipes, processing methods, and menu diversity, and to apply information technology to enhance the diners' experience; the ultimate goal is still to attract diners to the brand and to give them the fastest, most convenient, and most enjoyable service. In China, as big data and artificial intelligence, achievements of Industry 4.0, reach new heights day by day, fast-food diners can now pay their bills instantly through the biometric identification available on self-ordering kiosks, using their own face without any additional form of confirmation. In this study, the authors evaluate customer acceptance of this new form of payment through a survey of customers who have used, or witnessed the use of, smart kiosks and biometric payments in the city of Nanjing, China. A total of 200 valid responses were collected in order to test customers' intentions and feelings when choosing and experiencing payment through AI services. While 55% reported being bothered by the need to provide personal information, more than 70% considered that smart kiosks bring many benefits and much convenience. According to the data analysis findings, perceived innovativeness has a positive influence on satisfaction, which in turn affects behavioral intentions, including reuse and word-of-mouth intentions.

Keywords: artificial intelligence, catering industry, smart kiosks, technology acceptance

Procedia PDF Downloads 84
1408 Natural Antioxidant Changes in Fresh and Dried Spices and Vegetables

Authors: Liga Priecina, Daina Karklina

Abstract:

Antioxidants have become among the most analyzed substances of recent decades. Antioxidants act as inactivators of free radicals, and spices and vegetables are among their major sources. The most common antioxidants in vegetables and spices are vitamins C and E, phenolic compounds, and carotenoids. It is therefore important to understand how antioxidants change in spices and vegetables during processing. This article analyzes nine fresh and dried spices and vegetables grown in Latvia in 2013: celery (Apium graveolens), parsley (Petroselinum crispum), dill (Anethum graveolens), leek (Allium ampeloprasum L.), garlic (Allium sativum L.), onion (Allium cepa), celery root (Apium graveolens var. rapaceum), pumpkin (Cucurbita maxima), and carrot (Daucus carota). Total carotenoids, total phenolic compounds, and their antiradical scavenging activity were determined for all samples; dry matter content was calculated from moisture content. After the drying process, carotenoid content decreased significantly in all analyzed samples except parsley, where it increased. The phenolic composition differed depending on whether the sample was fresh or dried. Total phenolic, flavonoid, and phenolic acid contents increased in dried spices, while flavan-3-ols were not detected in fresh spice samples. For dried vegetables, phenolic acid content decreased significantly, but flavan-3-ol content increased. Higher antiradical scavenging activity was observed in samples with higher flavonoid and phenolic acid contents.

Keywords: antiradical scavenging activity, carotenoids, phenolic compounds, spices, vegetables

Procedia PDF Downloads 250
1407 A Survey Proposal towards Holistic Management of Schizophrenia

Authors: Pronab Ganguly, Ahmed A. Moustafa

Abstract:

Holistic management of schizophrenia involves mainstream pharmacological intervention, complementary medicine intervention, therapeutic intervention, and other psychosocial factors such as accommodation, education, job training, employment, relationships, friendship, exercise, overall well-being, smoking, substance abuse, suicide prevention, stigmatisation, recreation, entertainment, violent behaviour, arrangement of public trusteeship and guardianship, day-to-day living skills, integration with the community, and management of medication-related overweight and other health complications, amongst others. Our review shows that there is no integrated survey combining all these factors. An international web-based survey will therefore be conducted to evaluate the significance of all these factors and present them in a unified manner. It is believed this investigation will contribute positively towards the holistic management of schizophrenia. There will be two surveys. In the pharmacological intervention survey, five popular drugs for schizophrenia will be chosen and their efficacy as well as their harmful side effects will be rated on a scale of 0 to 10; this survey will be completed by psychiatrists. In the second survey, each element of therapeutic intervention and each psychosocial factor will be rated according to its significance on a scale of 0 to 10; this survey will be completed by caregivers, psychologists, case managers, and case workers. For the first survey, professional bodies of psychiatrists in English-speaking countries will be asked to invite their members to participate; for the second, professional bodies of clinical psychologists and caregivers in English-speaking countries will be contacted in the same way. Additionally, for both surveys, relevant professionals will be contacted through personal contact networks.
For both surveys, the mean, mode, median, standard deviation, and net promoter score will be calculated for each factor and presented in a statistically meaningful manner. Each factor will then be ranked according to its statistical significance, and country-specific variation will be highlighted to identify variation patterns. The results of these surveys will identify the relative significance of each type of pharmacological intervention, each type of therapeutic intervention, and each psychosocial factor. Determining this relative importance will contribute to improving the quality of life of individuals with schizophrenia.
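The per-factor summary statistics listed above could be computed along the following lines. The promoter/detractor cut-offs (9–10 promoters, 0–6 detractors) follow the common Net Promoter Score convention and are an assumption, as are the sample ratings:

```python
import statistics

def summarise(scores):
    """Summary statistics for one survey factor rated 0-10, including an
    NPS-style metric: NPS = %promoters (9-10) - %detractors (0-6)."""
    n = len(scores)
    promoters = sum(s >= 9 for s in scores)
    detractors = sum(s <= 6 for s in scores)
    return {
        "mean": statistics.mean(scores),
        "median": statistics.median(scores),
        "mode": statistics.mode(scores),
        "stdev": statistics.stdev(scores),
        "nps": 100 * (promoters - detractors) / n,
    }

# hypothetical ratings from ten respondents for one factor
ratings = [9, 10, 8, 7, 6, 9, 10, 4, 8, 9]
print(summarise(ratings))
```

Ranking the factors then reduces to sorting these per-factor summaries.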

Keywords: schizophrenia, holistic management, antipsychotics, quality of life

Procedia PDF Downloads 128
1406 Copy Number Variants in Children with Non-Syndromic Congenital Heart Diseases from Mexico

Authors: Maria Lopez-Ibarra, Ana Velazquez-Wong, Lucelli Yañez-Gutierrez, Maria Araujo-Solis, Fabio Salamanca-Gomez, Alfonso Mendez-Tenorio, Haydeé Rosas-Vargas

Abstract:

Congenital heart diseases (CHD) are the most common congenital abnormalities. These conditions can occur either as an element of distinct chromosomal malformation syndromes or in non-syndromic forms, and their etiology is not fully understood. Genetic variants such as copy number variants (CNVs) have been associated with CHD. The aim of our study was to analyze these genomic variants in peripheral blood from Mexican children diagnosed with non-syndromic CHD. We included 16 children with atrial and ventricular septal defects and 5 healthy subjects without heart malformations as controls. To exclude the most common syndromic heart disease alteration, we performed a fluorescence in situ hybridization test for the 22q11.2 deletion, which is responsible for the congenital heart abnormalities associated with DiGeorge syndrome. Then, microarray-based comparative genomic hybridization was used to identify genome-wide copy number variants, which were called by comparing our results against data from the main genetic variation databases. We identified copy number gains in three chromosomal regions in the pediatric patients, 4q13.2 (31.25%), 9q34.3 (25%), and 20q13.33 (50%), where several genes associated with cellular, biosynthetic, and metabolic processes are located: UGT2B15, UGT2B17, SNAPC4, SDCCAG3, PMPCA, INPP5E, C9orf163, NOTCH1, C20orf166, and SLCO4A1. In addition, after hierarchical cluster analysis based on the fluorescence intensity ratios from the comparative genomic hybridization, two congenital heart disease groups were generated, corresponding to children with atrial or ventricular septal defects. Further analysis with a larger sample size is needed to corroborate these copy number variants as possible biomarkers to differentiate between heart abnormalities.
Interestingly, the 20q13.33 gain was present in 50% of the children with these CHD, which could suggest that alterations in both coding and non-coding elements within this chromosomal region play an important role in distinct heart conditions.
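A common way to call gains and losses from aCGH fluorescence intensity ratios is to threshold the log2(test/reference) ratio per probe. The ±log2(3/2) cut-offs below are a widely used heuristic (one extra or one missing copy out of two), not necessarily the thresholds used in this study, and the probe values are invented for illustration:

```python
import numpy as np

# gain above log2(3/2) ≈ 0.58, loss below -log2(3/2); thresholds vary
# between studies and platforms
GAIN, LOSS = np.log2(3 / 2), -np.log2(3 / 2)

def call_cnv(log2_ratios):
    """Call each probe as 'gain', 'loss', or 'normal' from its
    log2(test/reference) fluorescence intensity ratio."""
    calls = np.full(log2_ratios.shape, "normal", dtype=object)
    calls[log2_ratios > GAIN] = "gain"
    calls[log2_ratios < LOSS] = "loss"
    return calls

# hypothetical ratios for five probes in a region such as 20q13.33
probes = np.array([0.72, 0.65, 0.05, -0.70, 0.61])
print(call_cnv(probes))
```

The same per-probe ratios, fed to a hierarchical clustering routine, would yield groupings like the atrial/ventricular septal defect clusters described above.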

Keywords: aCGH, bioinformatics, congenital heart diseases, copy number variants, fluorescence in situ hybridization

Procedia PDF Downloads 272
1405 CT Medical Images Denoising Based on New Wavelet Thresholding Compared with Curvelet and Contourlet

Authors: Amir Moslemi, Amir Movafeghi, Shahab Moradi

Abstract:

Noise is one of the most important challenges in medical imaging. Image denoising refers to the restoration of a digital medical image that has been corrupted by additive white Gaussian noise (AWGN). Digital medical images and video can be affected by different types of noise, including impulse noise, Poisson noise, and AWGN. Computed tomography (CT) images suffer from low quality due to noise: CT image quality depends directly on the dose absorbed by the patient, in that increasing the absorbed radiation, and consequently the absorbed dose to the patient (ADP), enhances CT image quality. Noise reduction techniques that enhance image quality without exposing patients to excess radiation are therefore a key challenge in CT image processing. In this work, noise reduction in CT images was performed using two directional two-dimensional (2D) transformations, the curvelet and contourlet transforms, and discrete wavelet transform (DWT) thresholding with the BayesShrink and AdaptShrink methods, which were compared with each other. We also propose a new threshold in the wavelet domain that both reduces noise and retains edges; the proposed method keeps the significant modified coefficients, resulting in good visual quality. The results were evaluated using two criteria: peak signal-to-noise ratio (PSNR) and structural similarity (SSIM).
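The BayesShrink baseline mentioned above sets a per-subband soft threshold T = σn²/σx, where σn is the noise level and σx the estimated signal standard deviation of the subband. A hedged NumPy sketch on a synthetic subband (the median(|HH|)/0.6745 noise estimator is the standard robust estimate; the 1D data are illustrative, not CT coefficients):

```python
import numpy as np

def estimate_noise(hh):
    """Robust noise estimate from a detail subband: median(|HH|)/0.6745."""
    return np.median(np.abs(hh)) / 0.6745

def bayes_shrink_threshold(detail, sigma_noise):
    """BayesShrink threshold T = sigma_n^2 / sigma_x for one subband,
    with sigma_x^2 = max(E[y^2] - sigma_n^2, ~0)."""
    sigma_y2 = np.mean(detail ** 2)
    sigma_x = np.sqrt(max(sigma_y2 - sigma_noise ** 2, 1e-12))
    return sigma_noise ** 2 / sigma_x

def soft_threshold(coeffs, t):
    """Soft thresholding: shrink towards zero by t, zeroing |c| < t."""
    return np.sign(coeffs) * np.maximum(np.abs(coeffs) - t, 0.0)

# synthetic subband: sparse signal spikes plus Gaussian noise
rng = np.random.default_rng(2)
clean = np.zeros(1000)
clean[::50] = 5.0
noisy = clean + rng.normal(0, 0.5, size=1000)

sigma_n = estimate_noise(noisy)
t = bayes_shrink_threshold(noisy, sigma_n)
denoised = soft_threshold(noisy, t)
```

In a full pipeline this rule would be applied per detail subband of the DWT of the CT image, then inverted back to the image domain.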

Keywords: computed tomography (CT), noise reduction, curvelet, contourlet, peak signal-to-noise ratio (PSNR), structural similarity (SSIM), absorbed dose to patient (ADP)

Procedia PDF Downloads 427
1404 Levels of Heavy Metals and Arsenic in Sediment and in Clarias gariepinus of Lake Ngami

Authors: Nashaat Mazrui, Oarabile Mogobe, Barbara Ngwenya, Ketlhatlogile Mosepele, Mangaliso Gondwe

Abstract:

Over the last several decades, the world has seen a rapid increase in activities such as deforestation, agriculture, and energy use. As a result, trace elements are being deposited into our water bodies, where they can accumulate to toxic levels in aquatic organisms and be transferred to humans through fish consumption. Thus, although fish is a good source of essential minerals and omega-3 fatty acids, it can also be a source of toxic elements, and monitoring trace elements in fish is important for the proper management of aquatic systems and the protection of human health. The aim of this study was to determine concentrations of trace elements in sediment and in muscle tissue of Clarias gariepinus at Lake Ngami, in the Okavango Delta in northern Botswana, during low floods. The fish were bought from local fishermen, and samples of muscle tissue were acid-digested and analyzed for iron, zinc, copper, manganese, molybdenum, nickel, chromium, cadmium, lead, and arsenic using inductively coupled plasma optical emission spectroscopy (ICP-OES). Sediment samples were also collected and analyzed for the same elements and for organic matter content. The results show that in all samples iron was present in the greatest amount, while cadmium was below the detection limit. Generally, the concentrations of elements were higher in sediment than in fish, except for zinc and arsenic: while the concentration of zinc was similar in the two media, arsenic was almost three times higher in fish than in sediment. To evaluate the risk to human health from fish consumption, the target hazard quotient (THQ) and cancer risk for an average adult in Botswana, sub-Saharan Africa, and the riparian communities of the Okavango Delta were calculated for each element. All elements were found to be well below regulatory limits and pose no threat to human health, except arsenic. The results suggest that other benthic-feeding fish species could also have high arsenic levels.
This has serious implications for human health, especially for riparian households, for whom fish is a key component of food and nutrition security.
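The target hazard quotient is conventionally computed with the US EPA form THQ = (EF × ED × FIR × C) / (RfD × BW × AT). A sketch with hypothetical inputs; the concentration, intake rate, body weight, and reference dose below are for illustration only, not the study's values:

```python
def target_hazard_quotient(c_mg_per_kg, fir_g_per_day, rfd_mg_per_kg_day,
                           bw_kg=60.0, ef_days_per_year=365, ed_years=70):
    """US EPA target hazard quotient for one element:
    THQ = (EF * ED * FIR * C) / (RfD * BW * AT), with AT = ED * 365 days.
    FIR in g/day, C in mg/kg wet weight; the 1e-3 converts g to kg."""
    at_days = ed_years * 365
    return (ef_days_per_year * ed_years * fir_g_per_day * c_mg_per_kg * 1e-3) \
        / (rfd_mg_per_kg_day * bw_kg * at_days)

# hypothetical inputs: 0.5 mg/kg arsenic in fish muscle, 100 g fish/day,
# inorganic-arsenic oral RfD of 3e-4 mg/kg/day (illustrative values)
thq = target_hazard_quotient(0.5, 100, 3e-4)
print(round(thq, 2))
```

A THQ above 1 is conventionally read as a potential non-carcinogenic risk from that exposure route.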

Keywords: arsenic, African sharptooth catfish, Okavango Delta, trace elements

Procedia PDF Downloads 177
1403 Combining Laser Scanning and High Dynamic Range Photography for the Presentation of Bloodstain Pattern Evidence

Authors: Patrick Ho

Abstract:

Bloodstain Pattern Analysis (BPA) forensic evidence can be complex, requiring effective courtroom presentation to ensure clear and comprehensive understanding of the analyst’s findings. BPA witness statements can often involve reference to spatial information (such as location of rooms, objects, walls) which, when coupled with classified blood patterns, may illustrate the reconstructed movements of suspects and injured parties. However, it may be difficult to communicate this information through photography alone, despite this remaining the UK’s established method for presenting BPA evidence. Through an academic-police partnership between the University of Warwick and West Midlands Police (WMP), an integrated 3D scanning and HDR photography workflow for BPA was developed. Homicide scenes were laser scanned and, after processing, the 3D models were utilised in the BPA peer-review process. The same 3D models were made available for court but were not always utilised. This workflow has improved the ease of presentation for analysts and provided 3D scene models that assist with the investigation. However, the effects of incorporating 3D scene models in judicial processes may need to be studied before they are adopted more widely. 3D models from a simulated crime scene and West Midlands Police cases approved for conference disclosure are presented. We describe how the workflow was developed and integrated into established practices at WMP, including peer-review processes and witness statement delivery in court, and explain the impact the work has had on the Criminal Justice System in the West Midlands.

Keywords: bloodstain pattern analysis, forensic science, criminal justice, 3D scanning

Procedia PDF Downloads 76
1402 Improvement of Biomass Properties through the Torrefaction Process

Authors: Malgorzata Walkowiak, Magdalena Witczak, Wojciech Cichy

Abstract:

Biomass is an important renewable energy source in Poland. As a biofuel, it has many advantages, such as renewability within a noticeable timescale and a relatively high energy potential. However, disadvantages of biomass such as its high moisture content and hygroscopic nature make its collection, transport, storage, and preparation for combustion troublesome and uneconomic. Thermal modification of biomass can improve its hydrophobic properties and increase its calorific value and natural resistance. This form of thermal processing is known as torrefaction. The aim of the study was to investigate the effect of pre-heat treatment of wood and lignocellulosic plant raw materials on the properties of solid biofuels. The preliminary studies included pine, beech, and willow wood, as well as other lignocellulosic raw materials: mustard, hemp, grass stems, tobacco stalks, sunflower husks, Miscanthus straw, rape straw, cereal straw, Virginia mallow straw, and rapeseed meal. Torrefaction was carried out at variable temperatures and process times, depending on the material used. Weight loss was recorded, and ash content and calorific value were determined. It was found that thermal treatment of the tested lignocellulosic raw materials can provide solid biofuel with improved properties. In the woody materials, the increase in lower heating value ranged from 0.3 MJ/kg (pine and beech) to 1.1 MJ/kg (willow); in the non-woody materials, from 0.5 MJ/kg (tobacco stalks, Miscanthus) to 3.5 MJ/kg (rapeseed meal). The results obtained indicate the need for further research, particularly regarding the conditions of the torrefaction process.

Keywords: biomass, lignocellulosic materials, solid biofuels, torrefaction

Procedia PDF Downloads 222
1401 Machine Learning in Patent Law: How Genetic Breeding Algorithms Challenge Modern Patent Law Regimes

Authors: Stefan Papastefanou

Abstract:

Artificial intelligence (AI) is an interdisciplinary field of computer science with the aim of creating intelligent machine behavior. Early approaches to AI were configured to operate in very constrained environments where the behavior of the AI system was determined in advance by formal rules. Knowledge was represented as a set of rules that allowed the AI system to determine the results for specific problems: a structure of if-else rules that could be traversed to find a solution to a particular problem or question. However, such rule-based systems have typically not been able to generalize beyond the knowledge provided. All over the world, and especially in IT-heavy jurisdictions such as the United States, the European Union, Singapore, and China, machine learning has developed into an immense asset, and its applications are becoming more and more significant. It must therefore be examined how the products of machine learning models can and should be protected by IP law, and for the purposes of this paper by patent law specifically, since it is the IP law regime closest to technical inventions and computing methods in technical applications. Genetic breeding models are currently less popular than recurrent neural network methods and deep learning, but this approach can be described more intuitively by reference to the evolution of natural organisms, and with increasing computational power, the genetic breeding method, as a subset of evolutionary algorithm models, is expected to regain popularity. The research method focuses on the patentability (under the world's most significant patent law regimes, such as those of China, Singapore, the European Union, and the United States) of AI inventions and machine learning. Questions of the technical nature of the problem to be solved, of the inventive step as such, and of the state of the art and the associated obviousness of the solution arise in current patenting processes.
Most importantly, and the key focus of this paper, is the problem of patenting inventions that are themselves developed through machine learning. Under the current legal situation in most patent law regimes, the inventor of a patent application must be a natural person or a group of persons. In order to be considered an 'inventor', a person must actually have developed part of the inventive concept; the mere application of machine learning or an AI algorithm to a particular problem should not, by itself, be construed as the algorithm contributing part of the inventive concept. However, when machine learning or the AI algorithm has contributed to a part of the inventive concept, there is currently a lack of clarity regarding the ownership of artificially created inventions. Since not only all European patent law regimes but also the Chinese and Singaporean approaches use identical terms, this paper ultimately offers a comparative analysis of the most relevant patent law regimes.

Keywords: algorithms, inventor, genetic breeding models, machine learning, patentability

Procedia PDF Downloads 99
1400 Flow Sheet Development and Simulation of a Bio-refinery Annexed to a Typical South African Sugar Mill

Authors: M. Ali Mandegari, S. Farzad, J. F. Görgens

Abstract:

Sugar is one of the main agricultural industries in South Africa, and the livelihoods of approximately one million South Africans depend indirectly on the sugar industry, which is struggling economically and should reinvent itself in order to ensure long-term sustainability. A second-generation bio-refinery is defined as a process that uses fibrous waste for the production of bio-fuel, chemicals, animal feed, and electricity. Bio-ethanol is by far the most widely used bio-fuel for transportation worldwide, and many of the challenges facing bio-ethanol production have been solved. A bio-refinery annexed to an existing sugar mill for the production of bio-ethanol and electricity is proposed to the sugar industry and is addressed in this study. Since flow-sheet development is the key element of the bio-ethanol process, in this work a bio-refinery (bio-ethanol and electricity production) annexed to a typical South African sugar mill, considering 65 ton/h of dry sugarcane bagasse and tops/trash as feedstock, was simulated. Aspen Plus™ V8.6 was applied as the simulator, and a realistic simulation development approach was followed to reflect the practical behavior of the plant. The latest results of other researchers on pretreatment, hydrolysis, fermentation, enzyme production, bio-ethanol production and other supplementary units such as evaporation, water treatment, boiler, and steam/electricity generation were adopted to establish a comprehensive bio-refinery simulation. Steam explosion with SO2 was selected for pretreatment due to minimal inhibitor production, and a simultaneous saccharification and fermentation (SSF) configuration was adopted for enzymatic hydrolysis and fermentation of the cellulose and hydrolysate. Bio-ethanol purification was simulated with two distillation columns with side streams, and fuel-grade bio-ethanol (99.5%) was achieved using molecular sieves, in order to minimize the capital and operating costs. 
Boiler and steam/power generation units were also modeled using industrial design data. Results indicate that 256.6 kg of bio-ethanol per ton of feedstock and 31 MW of surplus power were attained from the bio-refinery, while the process consumes 3.5, 3.38, and 0.164 GJ per ton of feedstock of hot utility, cold utility and electricity, respectively. The developed simulation provides a starting point for a variety of analyses and developments in further studies.

Keywords: bio-refinery, bagasse, tops, trash, bio-ethanol, electricity

Procedia PDF Downloads 515
1399 Research on the Overall Protection of Historical Cities Based on the 'City Image' in Ancient Maps: Take the Ancient City of Shipu, Zhejiang, China as an Example

Authors: Xiaoya Yi, Yi He, Zhao Lu, Yang Zhang

Abstract:

In the process of rapid urbanization, many historical cities have undergone excessive demolition and construction under the protection and renewal mechanism. The original pattern of the city has been changed, the urban context has been cut off, and historical features have gradually been lost. The historical city has gradually become decentralized and fragmented. The understanding of the ancient city includes two levels. The first refers to the ancient city as physical space, which defines an ancient city by its historic walls. The second refers to the public perception of its image, which is derived from people's spatial identification of the ancient city. In ancient China, people drew maps to show their way of understanding the city. Starting from ancient maps and exploring the spatial characteristics of traditional Chinese cities from the perspective of urban imagery is a key clue to understanding the spatial characteristics of historical cities at an overall level. The spatial characteristics of the urban image presented by ancient maps are summarized into two levels by typology. The first is the spatial pattern composed of center, axis and boundary. The second is the spatial elements comprising the city, streets, and sign system. Taking the ancient city of Shipu as a typical case, the "city image" in the ancient map is analyzed as a prototype and projected onto the current urban space. The research found that after a long period of evolution, the historical spatial pattern of the ancient city has changed from “dominant” to “recessive control”, and the historical spatial elements have become decentralized and fragmented. The wall that serves as the boundary of the ancient city has been reduced to “fragmentary remains”, the streets and lanes that serve as the axes of the ancient city have been reduced to “structural remains”, and the symbols of the ancient city center have been reduced to “site remains”. 
Based on this, the paper proposes methods for controlling the protection of land boundaries, protecting the streets and lanes, and selectively restoring the city wall system and the sign system based on accurate assessment. In addition, this paper emphasizes the continuity of the ancient city's traditional spatial pattern and attempts to explore a holistic conservation method for the ancient city in the modern context.

Keywords: ancient city protection, ancient maps, Shipu ancient city, urban image

Procedia PDF Downloads 111
1398 A Mixing Matrix Estimation Algorithm for Speech Signals under the Under-Determined Blind Source Separation Model

Authors: Jing Wu, Wei Lv, Yibing Li, Yuanfan You

Abstract:

The separation of speech signals has become a research hotspot in the field of signal processing in recent years. It has many applications and influences in teleconferencing, hearing aids, machine speech recognition and so on. The sounds received are usually noisy. Identifying the sounds of interest and obtaining clear sounds in such an environment becomes a problem worth exploring, namely the problem of blind source separation. This paper focuses on under-determined blind source separation (UBSS). Sparse component analysis is generally used for the problem of under-determined blind source separation. The method is mainly divided into two parts. First, a clustering algorithm is used to estimate the mixing matrix from the observed signals. Then the signals are separated based on the known mixing matrix. In this paper, the problem of mixing matrix estimation is studied, and an improved algorithm to estimate the mixing matrix for speech signals in the UBSS model is proposed. The traditional potential-function algorithm is not accurate for mixing matrix estimation, especially at low signal-to-noise ratio (SNR). In response to this problem, this paper considers an improved potential-function method to estimate the mixing matrix. The algorithm not only avoids the influence of insufficient prior information in traditional clustering algorithms, but also improves the estimation accuracy of the mixing matrix. This paper takes the mixing of four speech signals into two channels as an example. Simulation results show that the approach in this paper not only improves the accuracy of estimation, but also applies to any mixing matrix.
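The clustering step described above can be illustrated with a minimal sketch: under the sparsity assumption, observation samples scatter along the directions of the mixing-matrix columns, so clustering those directions recovers the matrix. All signals and parameters below are synthetic, and a simple histogram-peak clusterer stands in for the paper's improved potential-function method.

```python
import numpy as np

rng = np.random.default_rng(0)

# Ground-truth 2x4 mixing matrix: each column is a unit vector at a distinct angle.
true_deg = np.array([15.5, 55.5, 95.5, 140.5])
A = np.vstack([np.cos(np.deg2rad(true_deg)), np.sin(np.deg2rad(true_deg))])

# Sparse sources: at each instant at most one of the 4 sources is active,
# which is the sparsity assumption that makes UBSS clustering work.
n = 20000
S = np.zeros((4, n))
active = rng.integers(0, 4, size=n)
S[active, np.arange(n)] = rng.laplace(size=n)

X = A @ S + 0.001 * rng.standard_normal((2, n))  # the two observed mixtures

# Keep only high-energy samples and fold their directions into [0, pi),
# since a column and its negative define the same mixing direction.
norms = np.linalg.norm(X, axis=0)
theta = np.arctan2(X[1], X[0])[norms > np.quantile(norms, 0.5)] % np.pi

# Cluster the angles with a coarse histogram and keep the 4 strongest peaks
# (a stand-in for DBSCAN or potential-function clustering).
hist, edges = np.histogram(theta, bins=180, range=(0.0, np.pi))
peaks = np.sort(np.argsort(hist)[-4:])
est_deg = np.rad2deg((edges[peaks] + edges[peaks + 1]) / 2)
A_est = np.vstack([np.cos(np.deg2rad(est_deg)), np.sin(np.deg2rad(est_deg))])
```

The recovered columns are determined only up to sign and permutation, which is the usual ambiguity in blind source separation.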

Keywords: DBSCAN, potential function, speech signal, the UBSS model

Procedia PDF Downloads 120
1397 Production of Cellulose Nanowhiskers from Red Algae Waste and Its Application in Polymer Composite Development

Authors: Z. Kassab, A. Aboulkas, A. Barakat, M. El Achaby

Abstract:

Red algae are available enormously around the world, and their exploitation for the production of agar has become an important industry in recent years. However, this industrial processing of red algae generates a large quantity of solid fibrous waste, which constitutes a serious environmental problem. For this reason, the exploitation of this solid waste would help to i) produce new value-added materials and ii) improve waste disposal. In fact, this solid waste can be fully utilized for the production of cellulose microfibers and nanocrystals because it contains a large amount of cellulose. For this purpose, the red algae waste was chemically treated via alkali, bleaching and acid hydrolysis treatments under controlled conditions in order to obtain pure cellulose microfibers and cellulose nanocrystals. The raw product and the as-extracted cellulosic materials were successively characterized using several analysis techniques, including elemental analysis, X-ray diffraction, thermogravimetric analysis, infrared spectroscopy and transmission electron microscopy. As an application, the as-extracted cellulose nanocrystals were used as nanofillers for the production of polymer-based composite films with improved thermal and tensile properties. In these composite materials, the adhesion properties and the large number of functional groups present on the CNC surface and the macromolecular chains of the polymer matrix are exploited to improve the interfacial interactions between the two phases, improving the final properties. Consequently, these high-performance composite materials can be expected to have potential in packaging applications.

Keywords: cellulose nanowhiskers, food packaging, polymer composites, red algae waste

Procedia PDF Downloads 210
1396 Semantic Indexing Improvement for Textual Documents: Contribution of Classification by Fuzzy Association Rules

Authors: Mohsen Maraoui

Abstract:

With the aim of improving natural language processing applications such as information retrieval, machine translation, and lexical disambiguation, we focus on a statistical approach to semantic indexing for multilingual text documents based on conceptual network formalism. We propose to use this formalism as an indexing language to represent the descriptive concepts and their weightings. These concepts represent the content of the document. Our contribution is based on two steps. In the first step, we propose the extraction of index terms using the multilingual lexical resource EuroWordNet (EWN). In the second step, we pass from the representation of index terms to the representation of index concepts through conceptual network formalism. This network is generated using the EWN resource and passes through a classification step based on the association rules model (in an attempt to discover the non-taxonomic, or contextual, relations between the concepts of a document). These relations are latent relations buried in the text and carried by the semantic context of the co-occurrence of concepts in the document. Our proposed indexing approach can be applied to text documents in various languages because it is based on a linguistic method adapted to the language through a multilingual thesaurus. Next, we apply the same statistical process regardless of the language in order to extract the significant concepts and their associated weights. We show that the proposed indexing approach provides encouraging results.
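The association-rule classification step can be sketched with a toy example over concept co-occurrences. Crisp support/confidence is shown for brevity (the fuzzy variant would replace set membership with membership degrees), and the concept sets and thresholds below are invented for illustration.

```python
from itertools import combinations

# Toy "documents" already mapped to concept sets (hypothetical concepts).
docs = [
    {"bank", "loan", "interest"},
    {"bank", "river"},
    {"bank", "loan"},
    {"loan", "interest"},
    {"bank", "loan", "interest"},
]

def support(itemset, docs):
    # Fraction of documents containing every concept of the itemset.
    return sum(itemset <= d for d in docs) / len(docs)

def confidence(a, b, docs):
    # Confidence of the directional rule a -> b.
    return support(a | b, docs) / support(a, docs)

rules = []
concepts = set().union(*docs)
for a, b in combinations(sorted(concepts), 2):
    s = support({a, b}, docs)
    if s >= 0.4:  # minimum support threshold
        c = confidence({a}, {b}, docs)
        if c >= 0.7:  # minimum confidence threshold
            rules.append((a, b, round(s, 2), round(c, 2)))
```

Pairs of concepts that pass both thresholds are retained as candidate non-taxonomic relations between the concepts of a document.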

Keywords: concept extraction, conceptual network formalism, fuzzy association rules, multilingual thesaurus, semantic indexing

Procedia PDF Downloads 129
1395 Stochastic Optimization of a Vendor-Managed Inventory Problem in a Two-Echelon Supply Chain

Authors: Bita Payami-Shabestari, Dariush Eslami

Abstract:

The purpose of this paper is to develop a multi-product economic production quantity model under a vendor-managed inventory policy with restrictions including limited warehouse space, budget, number of orders, average shortage time and maximum permissible shortage. Since the costs cannot be predicted with certainty, data are assumed to behave under an uncertain environment. The problem is first formulated in the framework of a bi-objective multi-product economic production quantity model. The problem is then solved with three multi-objective decision-making (MODM) methods, which are compared on the optimal values of the two objective functions and the central processing unit (CPU) time, using statistical analysis and multi-attribute decision-making (MADM). The results demonstrate that the augmented ε-constraint method performs better than global criteria and goal programming in terms of the optimal values of the two objective functions and the CPU time. Sensitivity analysis is done to illustrate the effect of parameter variations on the optimal solution. The contribution of this research is the use of random cost data in developing a multi-product economic production quantity model under a vendor-managed inventory policy with several constraints.
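The ε-constraint idea behind the best-performing method can be sketched on a toy bi-objective problem: minimize one objective while bounding the other by a sweep of ε values, tracing out the Pareto front one point at a time. The objectives below are invented stand-ins for the model's cost functions; the augmented variant additionally adds a small slack term to avoid weakly Pareto-optimal points.

```python
import numpy as np

# Toy bi-objective minimization: f1(x) = x^2 and f2(x) = (x - 2)^2 on a grid.
x = np.linspace(0.0, 2.0, 201)
f1, f2 = x**2, (x - 2.0) ** 2

# epsilon-constraint: minimize f1 subject to f2 <= eps, sweeping eps
# from the tightest to the loosest bound on the second objective.
pareto = []
for eps in np.linspace(f2.min(), f2.max(), 9):
    feasible = f2 <= eps
    i = int(np.argmin(np.where(feasible, f1, np.inf)))
    pareto.append((x[i], f1[i], f2[i]))
```

Each sweep point trades a worse f2 bound for a better achievable f1, which is exactly the trade-off a decision maker inspects on the resulting front.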

Keywords: economic production quantity, random cost, supply chain management, vendor-managed inventory

Procedia PDF Downloads 113
1394 Processing and Modeling of High-Resolution Geophysical Data for Archaeological Prospection, Nuri Area, Northern Sudan

Authors: M. Ibrahim Ali, M. El Dawi, M. A. Mohamed Ali

Abstract:

In this study, magnetic gradient and geoelectrical ground survey methods were used together to explore archaeological features in the Nuri pyramids area. The magnetic survey method was used to search for archaeological features using a Geoscan Fluxgate Gradiometer (FM36). The study area was divided into a number of square grids of exactly 20 × 20 meters. These grids were assembled at the end of the study to give one major grid for each region. Readings within each grid were taken at sample intervals of 0.25 × 0.50 meters, in order to resolve archaeological features in more detail, including some small bipolar anomalies caused by buildings built from fired bricks. This resolution is important for detecting many archaeological features, such as rooms. The assembled grid gives an integrated map that is easy to present and allows all the required processing operations using Geoscan Geoplot software. Readings were taken along parallel traverses, which is the main way to obtain high-quality magnetic survey data. The study area is very rich in old buildings that vary from small to very large. Owing to the sand dunes and the loose soil, most of these buildings are not visible from the surface. Because of the dry sandy soil, there was no electrical contact between the ground surface and the electrodes. We tried to obtain electrical readings by adding salty water to the soil but, unfortunately, failed to confirm the magnetic readings with electrical readings as previously planned.

Keywords: archaeological features, independent grids, magnetic gradient, Nuri pyramid

Procedia PDF Downloads 469
1393 Effect of Non-metallic Inclusion from the Continuous Casting Process on the Multi-Stage Forging Process and the Tensile Strength of the Bolt: Case Study

Authors: Tomasz Dubiel, Tadeusz Balawender, Miroslaw Osetek

Abstract:

The paper presents the influence of non-metallic inclusions on the multi-stage forging process and the mechanical properties of a dodecagon socket bolt used in the automotive industry. The detected metallurgical defect was so large that it directly influenced the mechanical properties of the bolt and resulted in failure to meet the requirements of the mechanical property class. In order to assess the defect, X-ray and metallographic examinations of the defective bolt were performed, revealing an exogenous non-metallic inclusion. The size of the defect on the cross-section was 0.531 mm in width and 1.523 mm in length, and the defect was continuous along the entire axis of the bolt. For the analysis, an FEM simulation of the multi-stage forging process was designed, taking into account a non-metallic inclusion parallel to the sample axis, reflecting the studied case. The process of defect propagation due to material upset in the head area was analyzed. The final forging stage, shaping the dodecagonal socket and filling the flange area, was studied in particular. The defect was observed to significantly reduce the effective cross-section as a result of its expansion perpendicular to the axis of the bolt. The mechanical properties of products with and without the defect were analyzed. In the first step, the hardness test confirmed that the required value for mechanical property class 8.8 was obtained for both bolt types. In the second step, the bolts were subjected to a static tensile test. The bolts without the defect gave a positive result, while all 10 bolts with the defect gave a negative result, achieving a tensile strength below the requirements. The tensile strength tests were confirmed by metallographic tests and FEM simulation with perpendicular inclusion spread in the area of the head. The bolts failed directly under the bolt head, which is inconsistent with the requirements of ISO 898-1. 
It has been shown that non-metallic inclusions oriented along the axis of the bolt can directly cause loss of functionality, and these defects should be detected even before assembly into the machine element.

Keywords: continuous casting, multi-stage forging, non-metallic inclusion, upset bolt head

Procedia PDF Downloads 145
1392 Soil Salinity from Wastewater Irrigation in Urban Greenery

Authors: H. Nouri, S. Chavoshi Borujeni, S. Anderson, S. Beecham, P. Sutton

Abstract:

The potential risk of salt leaching through wastewater irrigation is of concern to most local governments and city councils. Despite the necessity of salinity monitoring and management in urban greenery, most attention has been on agricultural fields. This study was designed to investigate the capability and feasibility of monitoring and predicting soil salinity using near sensing and remote sensing approaches, using EM38 surveys and high-resolution multispectral WorldView-3 imagery. Veale Gardens within the Adelaide Parklands was selected as the experimental site. The results of the near sensing investigation were validated by testing soil salinity samples in the laboratory. Over 30 band combinations forming salinity indices were tested using image processing techniques. The outcomes of the remote sensing and near sensing approaches were compared to examine whether remotely sensed salinity indicators could map and predict the spatial variation of soil salinity through a potential statistical model. Statistical analysis was undertaken using the Stata 13 statistical package on over 52,000 points. Several regression models were fitted to the data, and mixed-effects modelling was selected as the most appropriate since it takes into account the systematic observation-specific unobserved heterogeneity. Results showed that SAVI (Soil Adjusted Vegetation Index) was the only salinity index that could be considered a predictor of soil salinity, although further investigation is needed. However, near sensing was found to be a rapid, practical and realistically accurate approach for salinity mapping of heterogeneous urban vegetation.
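The SAVI index singled out above is a standard band ratio (Huete, 1988) computed from near-infrared and red reflectance with a soil-brightness correction factor L. A minimal computation looks as follows; the reflectance values are hypothetical, not the WorldView-3 data from the study.

```python
import numpy as np

def savi(nir, red, L=0.5):
    """Soil Adjusted Vegetation Index; L is the soil-brightness
    correction factor, typically taken as 0.5."""
    nir = np.asarray(nir, dtype=float)
    red = np.asarray(red, dtype=float)
    return (1.0 + L) * (nir - red) / (nir + red + L)

# Hypothetical surface reflectances for two pixels.
veg = savi(0.45, 0.08)   # densely vegetated pixel -> high SAVI
bare = savi(0.22, 0.20)  # nearly bare soil -> SAVI near zero
```

Because the function is vectorized over NumPy arrays, the same call can be applied band-wise to a whole image to produce a SAVI map.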

Keywords: WorldView3, remote sensing, EM38, near sensing, urban green spaces, green smart cities

Procedia PDF Downloads 148
1391 Liver and Liver Lesion Segmentation From Abdominal CT Scans

Authors: Belgherbi Aicha, Hadjidj Ismahen, Bessaid Abdelhafid

Abstract:

The interpretation of medical images benefits from anatomical and physiological priors to optimize computer-aided diagnosis applications. Segmentation of the liver and liver lesions is regarded as a major primary step in computer-aided diagnosis of liver diseases, and precise liver segmentation in abdominal CT images is one of the most important steps in computer-aided diagnosis of liver pathology. In this paper, a semi-automated method for liver and liver lesion segmentation in medical image data using mathematical morphology is presented. Our algorithm consists of two parts. In the first, we determine the region of interest by applying morphological filters to extract the liver. The second step consists of detecting the liver lesions. For this task, we propose a new method developed for the semi-automatic segmentation of the liver and hepatic lesions, based on anatomical information and mathematical morphology tools used in the image processing field. First, we improve the quality of the original image and the image gradient by applying a spatial filter followed by morphological filters. The second step consists of calculating the internal and external markers of the liver and hepatic lesions. Thereafter we proceed to liver and hepatic lesion segmentation by the marker-controlled watershed transform. The validation of the developed algorithm was done using several images. The results obtained show the good performance of our proposed algorithm.
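The marker-controlled watershed at the heart of the method can be sketched as a priority flood: each labeled marker region grows into unlabeled pixels in order of increasing gradient cost, so region boundaries settle on gradient ridges. This is a minimal illustration on a synthetic image, not the authors' full pipeline (no anisotropic diffusion or morphological filtering), and the marker positions are invented.

```python
import heapq
import numpy as np

def marker_watershed(cost, markers):
    """Marker-controlled watershed by priority flooding: labeled marker
    regions grow into unlabeled pixels in order of increasing cost."""
    labels = markers.copy()
    h, w = cost.shape
    heap = []
    for y, x in zip(*np.nonzero(markers)):
        heapq.heappush(heap, (cost[y, x], y, x))
    while heap:
        _, y, x = heapq.heappop(heap)
        for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            ny, nx = y + dy, x + dx
            if 0 <= ny < h and 0 <= nx < w and labels[ny, nx] == 0:
                labels[ny, nx] = labels[y, x]  # inherit the flooding label
                heapq.heappush(heap, (cost[ny, nx], ny, nx))
    return labels

# Synthetic "gradient" image: a high-cost ridge at column 5 separates two basins.
cost = np.zeros((9, 11))
cost[:, 5] = 10.0
markers = np.zeros((9, 11), dtype=int)
markers[4, 2] = 1   # internal marker (object, e.g. liver)
markers[4, 8] = 2   # external marker (background)
labels = marker_watershed(cost, markers)
```

In the paper's setting the cost image would be the filtered gradient of the CT slice, and the internal/external markers would come from the morphological marker-extraction step.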

Keywords: anisotropic diffusion filter, CT images, hepatic lesion segmentation, liver segmentation, morphological filter, watershed algorithm

Procedia PDF Downloads 436
1390 A Critical Appraisal of the Policy on Internet Resource Centres in Service Delivery at Adekunle Ajasin University, Akungba-Akoko, Ondo State

Authors: Abimbola Olaotan Akinsete

Abstract:

Governments all over the world have intensified efforts to make the internet and resource centres readily available in public institutions for the advancement of humanity and of working processes. An information and communication resource centre not only helps in reducing tasks that are presumed to be herculean; such centres also influence the working rate and productivity of both staff and students. The utilization of an internet and information resource centre speeds up service delivery and improves the working time and efficiency of the system. Information and Communication Technology plays a significant role in providing an equalization strategy for developing the university community and improving educational service delivery. This equalization ensures that results can be accessed electronically, and that the transfer and confirmation of students' academic records and results can be done from anywhere in the world without being physically present to request these services. This study seeks to make a critical appraisal of the Adekunle Ajasin University policy on internet resource centres in service delivery at Adekunle Ajasin University, Akungba-Akoko, Ondo State. The study employs a descriptive survey design method to identify hindrances to the utilization of technology in service delivery in the university. Findings revealed that the adoption of an internet and resource centre in the Exams and Records unit of the University would help in delivering more in students' records/results processing.

Keywords: internet, resource centre, policy, service delivery

Procedia PDF Downloads 84
1389 The Effect of Additive Acid on the Phytoremediation Efficiency

Authors: G. Hosseini, A. Sadighzadeh, M. Rahimnejad, N. Hosseini, Z. Jamalzadeh

Abstract:

Metal pollutants, especially heavy metals from anthropogenic sources such as metallurgical industry waste (mining, smelting, and casting) and nuclear fuel production (mining, concentrate production, and uranium processing), end up contaminating the environment (water and soil) and pose a risk to human health around the facilities of this type of industrial activity. There are different methods that can be used to remove these contaminants from water and soil, but they are very expensive and time-consuming. In some cases, people have been forced to leave the area and decontamination has not been done; for example, after the Chernobyl accident, an area of 30 km around the plant was emptied of human life. A very efficient and cost-effective method for decontamination of soil and water is phytoremediation. In this method, plants, preferably native plants that are better adapted to the regional climate, are used. In this study, three types of plants, alfalfa, sunflower and wheat, were used for barium decontamination. Alfalfa and sunflower did not grow well enough in the Saghand mine soil sample, which may be due to the non-native origin of these plants, but wheat growth in the Saghand Uranium Mine soil sample was satisfactory. We have investigated the effect of four acids, nitric acid, oxalic acid, acetic acid and citric acid, on the efficiency of barium removal by wheat. Our results indicate increased barium absorption in the presence of citric acid in the soil. In this paper, we present our research and laboratory results.

Keywords: phytoremediation, heavy metal, wheat, soil

Procedia PDF Downloads 321
1388 35 MHz Coherent Plane Wave Compounding High Frequency Ultrasound Imaging

Authors: Chih-Chung Huang, Po-Hsun Peng

Abstract:

Ultrasound transient elastography has become a valuable tool for many clinical diagnoses, such as liver diseases and breast cancer. Pathological tissue can be distinguished by elastography because its stiffness differs from that of the surrounding normal tissue. An ultrafast frame rate of ultrasound imaging is needed for the transient elastography modality. Elastography obtained in an ultrafast system suffers from low resolution, which affects the robustness of the transient elastography. In order to overcome these problems, a coherent plane wave compounding technique has been proposed for conventional ultrasound systems, whose operating frequencies are around 3-15 MHz. The purpose of this study is to develop a novel beamforming technique for high frequency ultrasound coherent plane-wave compounding imaging, and the simulated results will provide the standards for hardware development. Plane-wave compounding imaging produces a series of low-resolution images by firing all elements of an array transducer in one shot at different inclination angles, receiving the echoes by conventional beamforming, and compounding them coherently. Simulations of plane-wave compounding images and focused transmit images were performed using Field II. All images were produced from point spread functions (PSFs) and cyst phantoms with a 64-element linear array working at 35 MHz center frequency, 55% bandwidth, and a pitch of 0.05 mm. The F-number was 1.55 in all simulations. Simulated PSFs and cyst phantom images were obtained using single-angle, 17-angle, and 43-angle plane wave transmissions (successive plane waves separated by 0.75 degrees), as well as focused transmission. The resolution and contrast of the images improved with the number of compounded plane wave angles. The lateral resolutions for the different methods were measured by the -10 dB lateral beam width. 
Comparing the plane-wave compounding image and the focused transmit image, both exhibited the same lateral resolution of 70 µm when 37 angles were compounded, and the lateral resolution reached 55 µm when 47 angles were compounded. These results show the potential of using high-frequency plane-wave compound imaging to characterize the elastic properties of microstructured tissues, such as the eye, skin and vessel walls, in the future.
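The coherent compounding described above can be sketched with an idealized delay-and-sum model: for each plane-wave angle, the transmit delay to a pixel plus the receive delay back to each element defines the beamforming law, and echoes from multiple angles are summed coherently. This is a noise-free sketch, not a Field II simulation; the array parameters are taken from the abstract (64 elements, 35 MHz, 0.05 mm pitch, 0.75° angle spacing), while the scatterer position and pulse model are invented.

```python
import numpy as np

c = 1540.0                      # assumed speed of sound in tissue, m/s
fc = 35e6                       # 35 MHz center frequency
pitch = 50e-6                   # 0.05 mm element pitch
n_elem = 64
xe = (np.arange(n_elem) - (n_elem - 1) / 2) * pitch  # element x-positions

def tx_rx_delay(theta, x, z, xe):
    """Round-trip delay for a plane wave at angle theta reaching point (x, z)
    and scattering back to an element at lateral position xe."""
    t_tx = (z * np.cos(theta) + x * np.sin(theta)) / c
    t_rx = np.sqrt((x - xe) ** 2 + z ** 2) / c
    return t_tx + t_rx

# One point scatterer; each echo is modeled as a Gaussian-windowed tone
# centered at its true round-trip delay.
xs, zs = 0.0, 3e-3
angles = np.deg2rad(np.arange(-6, 6.75, 0.75))  # 17 angles, 0.75 deg apart

def beamform(x, z):
    out = 0.0
    for th in angles:
        t_true = tx_rx_delay(th, xs, zs, xe)   # echo arrival times per element
        t_pix = tx_rx_delay(th, x, z, xe)      # delays applied for this pixel
        dt = t_pix - t_true
        out += np.sum(np.exp(-(dt * fc) ** 2) * np.cos(2 * np.pi * fc * dt))
    return out

on = beamform(xs, zs)            # at the scatterer: all terms add coherently
off = beamform(xs + 0.3e-3, zs)  # 300 um off-axis: terms sum incoherently
```

At the true scatterer location every delayed term aligns in phase, so the compounded amplitude grows with the number of angles times elements, which is the mechanism behind the resolution gain reported above.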

Keywords: plane wave imaging, high frequency ultrasound, elastography, beamforming

Procedia PDF Downloads 516
1387 Empowering Fishermen's Wives through Functional Skills in Gorontalo City, Indonesia

Authors: Abdul Rahmat

Abstract:

Community-based education for the economic empowerment of families is an attempt to accelerate the human development index (HDI), specifically the economic (purchasing power) component, in Dumbo Kingdom District of Gorontalo. The functional skills program developed in this activity is the making of shredded fish, fish balls, fish nuggets, anchovy chips, and fish corn sticks. The target audience of this activity is fishermen's wives across Dumbo Kingdom subdistrict, including Talumolo Village, Botu Village, Kampung Bugis Village, North Leato Village and South Leato Village, with each village represented by 20 participants, for a total of 100 participants. The activities ran from October to November 2014, held once a week on Saturdays from 9:00 to 13:00/14:00. From the learning process and the testing of the functional skills of making shredded fish, fish balls, fish nuggets, anchovy chips and fish corn sticks, residents gained additional knowledge and experience in: 1) concepts including nutrient content, processing food with fish raw materials, variations in taste, packaging, pricing and marketing; and 2) products made in accordance with the wishes of the residents and estimated to be saleable, including the creation of product packaging logos, the preparation and realization of the establishment of a Business Study Group (KBU), and the pioneering of a marketing network with restaurants and staple food stores/shops around the CLC.

Keywords: community development, functional skills, gender, HDI

Procedia PDF Downloads 303
1386 High Aspect Ratio Micropillar Array Based Microfluidic Viscometer

Authors: Ahmet Erten, Adil Mustafa, Ayşenur Eser, Özlem Yalçın

Abstract:

We present a new viscometer based on a microfluidic chip with elastic high aspect ratio micropillar arrays. The displacement of the pillar tips in the flow direction can be used to analyze the viscosity of a liquid. In our work, Computational Fluid Dynamics (CFD) is used to analyze the pillar displacement of various micropillar array configurations in the flow direction at different viscosities. Following CFD optimization, micro-CNC based rapid prototyping is used to fabricate molds for the microfluidic chips. The microfluidic chips are fabricated out of polydimethylsiloxane (PDMS) using soft lithography methods with molds machined out of aluminum. Tip displacements of the micropillar array (300 µm in diameter and 1400 µm in height) in the flow direction are recorded using a microscope-mounted camera, and the displacements are analyzed using image processing with an algorithm written in MATLAB. Experiments were performed with water-glycerol solutions mixed at 4 different ratios to attain viscosities of 1 cP, 5 cP, 10 cP and 15 cP at room temperature. The prepared solutions are injected into the microfluidic chips using a syringe pump at flow rates from 10-100 mL/hr, and the displacement versus flow rate is plotted for different viscosities. A displacement of around 1.5 µm was observed for the 15 cP solution at 60 mL/hr, while only a 1 µm displacement was observed for the 10 cP solution. The presented viscometer design optimization is still in progress for better sensitivity and accuracy. Our microfluidic viscometer platform has potential for tailor-made microfluidic chips to enable real-time observation and control of viscosity changes in biological or chemical reactions.
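The image-processing step (done in MATLAB in the study) amounts to locating the pillar tip in a no-flow reference frame and a flow frame and differencing the centroids. A minimal Python sketch on synthetic frames follows; the blob geometry and displacement are invented, not the study's 300 µm pillars.

```python
import numpy as np

def tip_centroid(img, threshold=0.5):
    """Centroid (x, y) of the bright pillar-tip blob in a grayscale frame."""
    ys, xs = np.nonzero(img > threshold)
    return xs.mean(), ys.mean()

def make_frame(cx, cy, size=64, r=5.0):
    # Synthetic frame: a Gaussian blob standing in for the imaged pillar tip.
    y, x = np.mgrid[0:size, 0:size]
    return np.exp(-((x - cx) ** 2 + (y - cy) ** 2) / (2 * r**2))

ref = make_frame(30.0, 32.0)    # no-flow reference frame
flow = make_frame(33.5, 32.0)   # tip deflected 3.5 px in the flow direction
dx = tip_centroid(flow)[0] - tip_centroid(ref)[0]  # displacement in pixels
```

With the camera's pixel size known, the pixel displacement converts to micrometers, and the displacement-versus-flow-rate slope can then be calibrated against liquids of known viscosity.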

Keywords: Computational Fluid Dynamics (CFD), high aspect ratio, micropillar array, viscometer

Procedia PDF Downloads 232
1385 Chronic Cognitive Impacts of Mild Traumatic Brain Injury during Aging

Authors: Camille Charlebois-Plante, Marie-Ève Bourassa, Gaelle Dumel, Meriem Sabir, Louis De Beaumont

Abstract:

To the best of our knowledge, there has been little interest in the chronic effects of mild traumatic brain injury (mTBI) on cognition during normal aging. This is rather surprising considering the impacts on daily and social functioning. In addition, sustaining an mTBI during late adulthood may compound the effects of normal biological aging in individuals who consider themselves normal and healthy. The objective of this study was to characterize the persistent neuropsychological repercussions of mTBI sustained during late adulthood, on average 12 months prior to testing. To this end, 35 mTBI patients and 42 controls between the ages of 50 and 69 completed an exhaustive neuropsychological assessment lasting three hours. All mTBI patients were asymptomatic, and all participants had a score ≥ 27 on the MoCA. The evaluation consisted of 20 standardized neuropsychological tests measuring memory, attention, executive and language functions, as well as information processing speed. Performance on tests of visual memory (Brief Visuospatial Memory Test Revised), verbal memory (Rey Auditory Verbal Learning Test and WMS-IV Logical Memory subtest), lexical access (Boston Naming Test) and response inhibition (Stroop) was significantly lower in the mTBI group. These findings suggest that an mTBI sustained during late adulthood induces lasting effects on cognitive function. Episodic memory and executive functions seem to be particularly vulnerable to enduring mTBI effects.

Keywords: cognitive function, late adulthood, mild traumatic brain injury, neuropsychology

Procedia PDF Downloads 158