Search results for: intelligent techniques
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 7416

5586 Brain-Computer Interface System for Lower Extremity Rehabilitation of Chronic Stroke Patients

Authors: Marc Sebastián-Romagosa, Woosang Cho, Rupert Ortner, Christy Li, Christoph Guger

Abstract:

Neurorehabilitation based on Brain-Computer Interfaces (BCIs) shows important rehabilitation effects for patients after stroke. Previous studies have shown improvements for patients who are in the chronic stage and/or have severe hemiparesis, groups that are particularly challenging for conventional rehabilitation techniques. For this publication, seven stroke patients in the chronic phase with hemiparesis of the lower extremity were recruited. All of them participated in 25 BCI sessions, about 3 times a week. The BCI system was based on Motor Imagery (MI) of paretic ankle dorsiflexion and healthy wrist dorsiflexion, with Functional Electrical Stimulation (FES) and avatar feedback. Assessments were conducted before, during and after the rehabilitation training to quantify changes in motor function. The primary measures were the 10-meter walking test (10MWT), Range of Motion (ROM) of ankle dorsiflexion and the Timed Up and Go (TUG) test. Results show a significant increase in gait speed in the primary measure, the 10MWT at fast velocity, of 0.18 m/s, IQR = [0.12 to 0.2], P = 0.016. Speed in the TUG also increased significantly, by 0.1 m/s, IQR = [0.09 to 0.11], P = 0.031. Active ROM increased by 4.65º, IQR = [1.67 to 7.4], after rehabilitation training, P = 0.029. These functional improvements persisted at least one month after the end of the therapy. These outcomes show the feasibility of this BCI approach for chronic stroke patients and further support the growing consensus that such tools might develop into a new rehabilitation paradigm for stroke patients. However, the results come from only seven chronic stroke patients, so the authors believe this approach should be further validated in broader randomized controlled studies involving more patients. MI- and FES-based non-invasive BCIs are showing improvement in the gait rehabilitation of patients in the chronic stage after stroke. This could have an impact on the rehabilitation techniques used for these patients, especially when they are severely impaired and their mobility is limited.

Keywords: neuroscience, brain-computer interfaces, rehabilitation, stroke

Procedia PDF Downloads 92
5585 The Influence of Brands on E-Sports Spectators

Authors: Rene Kasper, Hyago Ribeiro, Marcelo Curth

Abstract:

Electronic sports, or just e-sports, boast exponential growth in the interest of the public and large investors. E-sports teams resemble classic sports teams, like football teams, since their structure includes, besides the athletes, administrators, coaches and even doctors. The concept of team games comes with very strong social interaction, as users perceive that they interact with real peers rather than competing against intelligent software. In this sense, electronic games are established as a sociocultural phenomenon and as multidimensional media. Thus, the research aims to identify the profile of users and the importance of brands in the Brazilian electronic sports scene, as well as the relationship of consumers (called fans) with the products and services that occupy the media spaces of e-sports championship broadcasts. The research used a descriptive quantitative methodology, applied in different e-sports communities, with 160 respondents. The data collection instrument was a survey containing seven questions, which addressed the profile of the participants and their perception of the proposed theme. Regarding the profile, ages ranged from 17 to 31 years; 93.3% of respondents were male and 6.7% female. It was found that 93.3% of the participants had been in contact with the Brazilian electronic sports scene for at least 2 years; 26.7% played between 6 and 12 hours a week and 46.7% played more than 12 hours a week. In addition, income was not a deciding factor in enjoying electronic sports games, because the distribution of participants ranged from 1 to 3 minimum wages (33.3%) to more than 6 minimum wages (46.7%). Regarding brands, 85.6% emphasized that brands should support the scene through sponsorship and publicity, and 28.6% are attracted to consuming brands that advertise in e-sports championships.

Keywords: brands, consumer behavior, e-sports, virtual games

Procedia PDF Downloads 275
5584 Spectroscopy and Electron Microscopy for the Characterization of CdSxSe1-x Quantum Dots in a Glass Matrix

Authors: C. Fornacelli, P. Colomban, E. Mugnaioli, I. Memmi Turbanti

Abstract:

When semiconductor particles are reduced in scale to nanometer dimensions, their optical and electro-optical properties differ strongly from those of bulk crystals of the same composition. Since sampling is often not allowed for cultural heritage artefacts, the potential of two non-invasive techniques, Raman and Fiber Optic Reflectance Spectroscopy (FORS), has been investigated, and the results of the analysis of some original glasses of different colours (from yellow to orange and deep red) and periods (from the second decade of the 20th century to the present day) are reported in the present study. In order to evaluate the potential of non-invasive techniques for investigating the structure and distribution of nanoparticles dispersed in a glass matrix, Scanning Electron Microscopy (SEM) with energy-dispersive spectroscopy (EDS) mapping, together with Transmission Electron Microscopy (TEM) and Electron Diffraction Tomography (EDT), has also been used. Raman spectroscopy allows a fast and non-destructive measurement of the quantum dot composition and size, thanks to the evaluation of the frequencies and the broadening/asymmetry of the LO phonon bands, respectively, though the important role of the compressive strain arising from the glass matrix and the possible diffusion of zinc from the matrix into the nanocrystals should be taken into account when considering the optical-phonon frequency values. The incorporation of Zn has been inferred from an upward shift of the LO band related to the most abundant anion (S or Se), while the role of surface phonons, as well as of confinement-induced scattering by phonons with non-zero wavevectors, in the broadening of the Raman peaks has been verified. The optical band gap varies from 2.42 eV (pure CdS) to 1.70 eV (CdSe). For the compositional range 0.2 ≤ x ≤ 0.5, the presence of two absorption edges has been related to the contribution of both pure CdS and the CdSxSe1-x solid solution; this particular feature is probably due to the presence of unaltered cubic zinc blende CdS that does not take part in the formation of the solid solution, which occurs only between hexagonal CdS and CdSe. Moreover, the band edge tailing originating from the disorder due to the formation of weak bonds, characterized by the Urbach edge energy, has been studied and, together with the FWHM of the Raman signal, has been taken as a good parameter to evaluate the degree of topological disorder. SEM-EDS mapping showed a peculiar distribution of the major constituents of the glass matrix (fluxes and stabilizers), especially for those samples where a layered structure had been inferred from the spectroscopic study. Finally, TEM-EDS and EDT were used to obtain high-resolution information about the nanocrystals (NCs) and the heterogeneous glass layers. The presence of ZnO NCs (< 4 nm) dispersed in the matrix was verified for most of the samples, while, for those samples where disorder due to a more complex distribution of NC size and/or composition had been assumed, TEM clearly confirmed most of the conclusions drawn from the spectroscopic techniques.
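
The Urbach analysis mentioned above relies on the standard exponential form of the sub-gap absorption tail; as a reminder of the relation involved (a textbook formula, not taken from this study), the Urbach energy E_U follows from the slope of the logarithm of the absorption coefficient just below the gap:

```latex
% Urbach rule: the absorption coefficient rises exponentially below the gap
\alpha(h\nu) = \alpha_0 \exp\!\left(\frac{h\nu - E_1}{E_U}\right), \qquad h\nu < E_g ,
% so E_U is obtained from a linear fit of ln(alpha) versus photon energy:
E_U = \left[\frac{\mathrm{d}\,\ln\alpha(h\nu)}{\mathrm{d}(h\nu)}\right]^{-1}.
```

A larger E_U corresponds to a broader band tail and hence a higher degree of topological disorder, which is why it is used above alongside the FWHM of the Raman signal.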

Keywords: CdSxSe1-x, EDT, glass, spectroscopy, TEM-EDS

Procedia PDF Downloads 299
5583 Awareness about Work-Related Hazards Causing Musculoskeletal Disorders

Authors: Bintou Jobe

Abstract:

Musculoskeletal disorders (MSDs) are injuries or disorders such as spinal disc injuries, muscle strains, and low back injuries. They remain a major cause of occupational illness. Findings: Due to poor grips during handling, the neck, shoulders, arms, knees, ankles, fingers, waist, lower back, and other muscles and joints can be affected. Pregnant women are more prone because of physical and hormonal changes, which lead to the relaxation of supporting ligaments. MSDs continue to pose a global concern due to their impact on workers worldwide. The prevalence of these disorders is high, according to research on the workforce in Europe and in developing countries. The causes are characterized by long working hours, insufficient rest breaks, poor posture, repetitive motion, poor manual handling techniques, psychological stress, and poor nutrition. Prevention of MSDs mainly involves avoiding and assessing the risk by design; however, clinical solutions, policy governance, and minimizing manual labour are also alternatives. In addition, a balanced diet and teamwork are key elements in minimising the risk. This review aims to raise awareness, promote cost-effective prevention and understanding of MSDs through research, and identify proposed solutions that address the underlying causes of MSDs in the construction sector. The methodology involves a literature review approach, engaging with the policy landscape of MSDs and synthesising publications on MSDs alongside a wider range of academic publications. In conclusion, training on effective manual handling techniques should be considered, and Personal Protective Equipment should be a last resort. The implementation of training guidelines has yielded significant benefits.

Keywords: work-related musculoskeletal disorders, MSD, manual handling, work hazards

Procedia PDF Downloads 60
5582 Reimagine and Redesign: Augmented Reality Digital Technologies and 21st Century Education

Authors: Jasmin Cowin

Abstract:

Augmented reality digital technologies, big data, and the need for a teacher workforce able to meet the demands of a knowledge-based society are poised to lead to major changes in the field of education. This paper explores applications and educational use cases of augmented reality digital technologies for educational organizations during the Fourth Industrial Revolution. The Fourth Industrial Revolution requires vision, flexibility, and innovative educational conduits by governments and educational institutions to remain competitive in a global economy. Educational organizations will need to focus on teaching in and for a digital age to continue offering academic knowledge relevant to 21st-century markets and changing labor force needs. Implementation of contemporary disciplines will need to be embodied through learners’ active knowledge-making experiences while embracing ubiquitous accessibility. The power of distributed ledger technology promises major streamlining for educational record-keeping, degree conferrals, and authenticity guarantees. Augmented reality digital technologies hold the potential to restructure educational philosophies and their underpinning pedagogies thereby transforming modes of delivery. Structural changes in education and governmental planning are already increasing through intelligent systems and big data. Reimagining and redesigning education on a broad scale is required to plan and implement governmental and institutional changes to harness innovative technologies while moving away from the big schooling machine.

Keywords: fourth industrial revolution, artificial intelligence, big data, education, augmented reality digital technologies, distributed ledger technology

Procedia PDF Downloads 277
5581 Analysis of Fuel Efficiency in Heavy Construction Compaction Machine and Factors Affecting Fuel Efficiency

Authors: Amey Kulkarni, Paavan Shetty, Amol Patil, B. Rajiv

Abstract:

Fuel efficiency plays a very important role in the overall performance of an automobile. In this paper, a study of the fuel efficiency of a heavy construction compaction machine is carried out. Fuel consumption trials were performed in order to obtain the fuel consumed by the compactor while performing a certain set of actions. Usually, heavy construction machines are put to work in locations where refilling the fuel tank is not an easy task, and fuel is consumed at a greater rate than in a passenger automobile, so it becomes important to have a fuel-efficient machine for long working hours. Fuel efficiency is the most important point in determining the future scope of the product. A heavy construction compaction machine operates in five major roles: traveling, static working, high-frequency low-amplitude compaction, low-frequency high-amplitude compaction, and low idle. Fuel consumption readings at 1950 rpm, 2000 rpm and 2350 rpm of the engine were taken using a differential fuel flow meter and analyzed, and the optimum RPM setting which fulfills the fuel efficiency as well as the engine performance criteria was selected. Other factors, such as rear-end gears, intake and exhaust restriction of the engine, vehicle operating techniques, air drag, tribological aspects and tires, were also considered for increasing the fuel efficiency of the compactor. The fuel efficiency of the compactor can be precisely calculated by using a differential fuel flow meter. By testing the compactor at different combinations of engine RPM and also considering the factors above, an optimum solution was obtained which led to a significant improvement in the fuel efficiency of the compactor.

Keywords: differential fuel flow meter, engine RPM, fuel efficiency, heavy construction compaction machine

Procedia PDF Downloads 291
5580 Feature Analysis of Predictive Maintenance Models

Authors: Zhaoan Wang

Abstract:

Research in predictive maintenance modeling has improved in recent years to predict failures and needed maintenance with high accuracy, saving cost and improving manufacturing efficiency. However, classic prediction models provide little valuable insight into the features that contribute most to failure. By analyzing and quantifying feature importance in predictive maintenance models, cost saving can be optimized based on business goals. First, multiple classifiers are evaluated with cross-validation to predict the multi-class failures. Second, predictive performance with features provided by different feature selection algorithms is further analyzed. Third, features selected by different algorithms are ranked and combined based on their predictive power. Finally, the linear explainer SHAP (SHapley Additive exPlanations) is applied to interpret classifier behavior and provide further insight into the specific roles of features in both local predictions and global model behavior. The results of the experiments suggest that certain features play dominant roles in predictive models while others have significantly less impact on overall performance. Moreover, for multi-class prediction of machine failures, the most important features vary with the type of machine failure. The results may lead to improved productivity and cost saving by prioritizing sensor deployment, data collection, and data processing of more important features over less important ones.
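
To make the first three steps of this workflow concrete — cross-validating several classifiers and then ranking and combining features by predictive power — the following minimal Python sketch uses scikit-learn with synthetic stand-in data; it is only an illustration of the general approach, not the authors' implementation, and the SHAP interpretation step is omitted.

```python
# Sketch: evaluate several classifiers with cross-validation, then rank features
# by combining two selection criteria (ANOVA F-score and mutual information).
# The data here are synthetic stand-ins for real machine-failure sensor features.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.feature_selection import f_classif, mutual_info_classif
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVC

X, y = make_classification(n_samples=600, n_features=12, n_informative=5,
                           n_classes=3, n_clusters_per_class=1, random_state=0)

# Step 1: cross-validate multiple classifiers for the multi-class failure labels.
models = {"logreg": LogisticRegression(max_iter=1000),
          "svm": SVC(),
          "random_forest": RandomForestClassifier(random_state=0)}
for name, model in models.items():
    scores = cross_val_score(model, X, y, cv=5)
    print(f"{name}: mean accuracy = {scores.mean():.3f}")

# Steps 2-3: rank features with two different criteria and combine the ranks.
f_scores, _ = f_classif(X, y)
mi_scores = mutual_info_classif(X, y, random_state=0)
combined_rank = np.argsort(f_scores).argsort() + np.argsort(mi_scores).argsort()
print("features from least to most important:", np.argsort(combined_rank))
```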

Keywords: automated supply chain, intelligent manufacturing, predictive maintenance, machine learning, feature engineering, model interpretation

Procedia PDF Downloads 133
5579 Scientific Investigation for an Ancient Egyptian Polychrome Wooden Stele

Authors: Ahmed Abdrabou, Medhat Abdalla

Abstract:

The studied stele dates back to the Third Intermediate Period (1075-664 BC) of ancient Egypt. It is made of wood and covered with painted gesso layers. This study aims to use a combination of multispectral imaging {visible, infrared (IR), visible-induced infrared luminescence (VIL), ultraviolet-induced luminescence (UVL) and ultraviolet reflected (UVR) imaging}, along with portable X-ray fluorescence, in order to map and identify the pigments as well as to provide a deeper understanding of the painting techniques. Moreover, the authors were particularly interested in the identification of the wood species. Multispectral images were acquired in three spectral bands — ultraviolet (360-400 nm), visible (400-780 nm) and infrared (780-1100 nm) — using ultraviolet-induced luminescence (UVL), ultraviolet reflected (UVR), visible (VIS), visible-induced infrared luminescence (VIL) and infrared photography. False color images were made by digitally combining the VIS with the IR or UV images using Adobe Photoshop. Optical Microscopy (OM), portable X-ray fluorescence spectroscopy (p-XRF) and Fourier Transform Infrared Spectroscopy (FTIR) were used in this study. Mapping and imaging techniques provided useful information about the spatial distribution of the pigments; in particular, visible-induced luminescence (VIL) allowed the spatial distribution of the Egyptian blue pigment to be mapped, and every region containing Egyptian blue, down to single crystals in some instances, is clearly visible as a bright white area. However, complete characterization of the pigments requires the use of p-XRF spectroscopy. Based on the elemental analysis obtained by p-XRF, we conclude that the artists used mixtures of the basic mineral pigments to achieve a wider palette of hues. Concerning the identification of the wood species, microscopic identification indicated that the wood used was sycamore fig (Ficus sycomorus L.), which is recorded as being native to Egypt and was used to make wooden artifacts since at least the Fifth Dynasty.
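
The false-color compositing step mentioned above (combining the VIS image with the IR or UV image) can also be reproduced outside Photoshop; below is a minimal, hypothetical Python sketch of a standard infrared false-color composite (IR to red, visible red to green, visible green to blue), assuming the registered images are available on disk. It only illustrates the channel substitution, not the authors' exact editing procedure, and the file names are placeholders.

```python
# Sketch: build an infrared false-colour (IRFC) composite by channel substitution.
# Assumes `vis_rgb` (H x W x 3) and `ir` (H x W) are co-registered images;
# the file names are hypothetical placeholders.
import numpy as np
import imageio.v3 as iio

vis_rgb = iio.imread("stele_visible.png").astype(np.float32) / 255.0
ir = iio.imread("stele_infrared.png").astype(np.float32) / 255.0
if ir.ndim == 3:                      # collapse an RGB-saved IR image to one band
    ir = ir.mean(axis=2)

false_colour = np.dstack([ir,                 # red channel   <- infrared
                          vis_rgb[..., 0],    # green channel <- visible red
                          vis_rgb[..., 1]])   # blue channel  <- visible green
iio.imwrite("stele_ir_false_colour.png", (false_colour * 255).astype(np.uint8))
```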

Keywords: polychrome wooden stele, multispectral imaging, IR luminescence, Wood identification, Sycamore Fig, p-XRF

Procedia PDF Downloads 264
5578 Inter-Communication-Management in Cases with Disabled Children (ICDC)

Authors: Dena A. Hussain

Abstract:

The objective of this project is to design an Information and Communication Technologies (ICT) tool, based on a standardized platform, to assist the work-integrated learning process of caretakers of disabled children. The tool should assist the intercommunication between caretakers and improve the learning process through knowledge bridging between all involved caretakers. Some children are born with disabilities, while others have special needs after an illness or accident. Special needs children often need help in their learning process and require tools and services in a different way. In some cases the child has multiple disabilities that affect several capabilities in different ways. These needs have to be translated into different learning techniques that the staff or personnel (called caretakers in this project) caring for the child need to learn and adapt. The caretakers involved are also required to learn new learning or training techniques and utilities specialized for the child's needs. In many cases the number of people caring for the child's development is rather large: the parents, specialist pedagogues, teachers, therapists, psychologists, personal assistants, etc. Each group of specialists has different objectives, and in some cases the merging of these specializations is quite unique. This makes synchronization between different caretakers difficult, often resulting in a low level of cooperation. Better intercommunication between professions could improve not only the child's development but also the caretakers' methods and their knowledge of each other's work processes and of their own profession. This introduces a unique work-integrated learning environment for all personnel involved, merging learning and knowledge in the work environment while at the same time assisting the children's development process. Creating an iterative process generates a unique learning experience for all involved. Using a work-integrated platform will help encourage and support the process for all the teams involved. We believe that working with children who have special needs is a continuous learning/working process that is always integrated to achieve one main goal, which is to make a better future for all children.

Keywords: information and communication technologies (ICT), work integrated learning (WIL), sustainable learning, special needs children

Procedia PDF Downloads 294
5577 Evaluation of Classification Algorithms for Diagnosis of Asthma in Iranian Patients

Authors: Taha SamadSoltani, Peyman Rezaei Hachesu, Marjan GhaziSaeedi, Maryam Zolnoori

Abstract:

Introduction: Data mining is defined as a process to find patterns and relationships in the data in a database in order to build predictive models. The application of data mining has extended into vast sectors such as healthcare services. Medical data mining aims to solve real-world problems in the diagnosis and treatment of diseases. This method applies various techniques and algorithms which have different accuracy and precision. The purpose of this study was to apply knowledge discovery and data mining techniques for the diagnosis of asthma based on patient symptoms and history. Method: Data mining includes several steps and decisions to be made by the user. It starts with creating an understanding of the scope and application of previous knowledge in this area and identifying the knowledge discovery process from the point of view of the stakeholders, and it finishes with acting on the discovered knowledge: using the knowledge, integrating it with other systems, and documenting and reporting it. In this study, a stepwise methodology was followed to achieve a logical outcome. Results: The sensitivity, specificity and accuracy of the KNN, SVM, Naïve Bayes, NN, classification tree and CN2 algorithms and of related similar studies were evaluated, and ROC curves were plotted to show the performance of the system. Conclusion: The results show that we can accurately diagnose asthma, with approximately ninety percent accuracy, based on demographic and clinical data. The study also showed that methods based on pattern discovery and data mining have a higher sensitivity compared to expert and knowledge-based systems. On the other hand, medical guidelines and evidence-based medicine should form the basis of diagnostic methods; it is therefore recommended that machine learning algorithms be used in combination with knowledge-based algorithms.
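
A minimal sketch of the kind of evaluation reported above — cross-validated sensitivity, specificity, accuracy and a ROC curve for several classifiers — is given below using Python and scikit-learn. The data are synthetic placeholders for the patients' demographic and clinical attributes, the NN model is left out for brevity, and CN2 (not available in scikit-learn) is omitted.

```python
# Sketch: compare KNN, SVM, Naive Bayes and a decision tree on binary
# asthma / no-asthma labels using cross-validated predictions.
from sklearn.datasets import make_classification
from sklearn.metrics import auc, confusion_matrix, roc_curve
from sklearn.model_selection import cross_val_predict
from sklearn.naive_bayes import GaussianNB
from sklearn.neighbors import KNeighborsClassifier
from sklearn.svm import SVC
from sklearn.tree import DecisionTreeClassifier

X, y = make_classification(n_samples=500, n_features=15, random_state=1)

classifiers = {"KNN": KNeighborsClassifier(),
               "SVM": SVC(probability=True),
               "NaiveBayes": GaussianNB(),
               "Tree": DecisionTreeClassifier(random_state=1)}

for name, clf in classifiers.items():
    pred = cross_val_predict(clf, X, y, cv=10)
    tn, fp, fn, tp = confusion_matrix(y, pred).ravel()
    sens, spec = tp / (tp + fn), tn / (tn + fp)
    acc = (tp + tn) / len(y)
    # ROC curve from cross-validated probability estimates
    proba = cross_val_predict(clf, X, y, cv=10, method="predict_proba")[:, 1]
    fpr, tpr, _ = roc_curve(y, proba)
    print(f"{name}: sens={sens:.2f} spec={spec:.2f} acc={acc:.2f} AUC={auc(fpr, tpr):.2f}")
```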

Keywords: asthma, data mining, classification, machine learning

Procedia PDF Downloads 447
5576 Operative versus Non-Operative Treatment of Scaphoid Non-Union in Children: A Case Presentation and Review of the Literature

Authors: Ilja Käch, Abdul R. Jandali, Nadja Zechmann-Müller

Abstract:

Introduction: We discuss the treatment of two young male patients suffering from scaphoid non-union after a traumatic scaphoid fracture. The currently propagated techniques for treating a scaphoid non-union in children are either operative reconstruction of the scaphoid or conservative treatment with splinting in a scaphoid cast. Cases: In the first case, we operated on a 13-year-old male patient with a posttraumatic scaphoid non-union in the middle third with a humpback deformity. We resected the middle third of the scaphoid and grafted the defect with iliac crest bone, and the DISI deformity was reduced. Fixation was performed with K-wires and immobilisation in a scaphoid cast. In the second case, a 13-year-old male patient, also with a posttraumatic scaphoid non-union in the middle third with humpback and DISI deformity, was treated conservatively. Immobilisation in a scaphoid cast for four months was performed. Results: Operative: One year postoperatively, the patient achieved a painless free arc of motion: flexion/extension 70/0/60°, radial/ulnar deviation 30/0/30° and pro-/supination 90/0/90°. The computed tomogram showed complete consolidation and bony fusion of the iliac crest graft. Conservative: Six to eight months after conservative treatment, the patient demonstrated painless motion with an AROM of flexion/extension 80/0/80° and radial/ulnar deviation and pro-/supination in maximum range. Complete consolidation was seen in the computed tomogram, with persistent humpback and DISI deformity. Conclusion: In the literature, both techniques are described, either operative scaphoid reconstruction or conservative treatment with splinting. In our cases, both the operative and conservative treatments showed comparably good results. However, the humpback and DISI deformity can only be addressed with a surgical approach.

Keywords: scaphoid, non-union, trauma, operative vs. non operative

Procedia PDF Downloads 76
5575 Spectral Properties of Fiber Bragg Gratings

Authors: Y. Hamaizi, H. Triki, A. El-Akrmi

Abstract:

In this paper, the reflection spectra, group delay and dispersion of a uniform fiber Bragg grating (FBG) are obtained. FBGs with two types of apodized variations of the refractive index were modeled to show how the side-lobes can be suppressed. Apodization techniques are used to get optimized reflection spectra. The simulation is based on solving coupled mode equations together with the transfer matrix method.
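
To make the coupled-mode/transfer-matrix computation concrete, the sketch below evaluates the reflection spectrum of an FBG with the standard piecewise-uniform transfer matrix formulation; it is a simplified illustration under assumed parameter values, not the authors' code, and the apodization profile simply scales the index modulation from section to section, which is how the side-lobe suppression discussed above is modeled.

```python
# Sketch: reflectivity of a fiber Bragg grating via the transfer matrix method
# (piecewise-uniform coupled-mode theory). All parameter values are illustrative.
import numpy as np

n_eff = 1.447          # effective index
L = 10e-3              # grating length [m]
Lambda = 535.0e-9      # grating period [m]
dn = 1e-4              # index modulation amplitude
M = 100                # number of uniform sections (lets us apply apodization)

lam = np.linspace(1546e-9, 1550e-9, 2000)   # wavelengths to evaluate
lam_B = 2 * n_eff * Lambda                  # Bragg wavelength

def reflectivity(lam, profile):
    """profile[m] scales dn in section m (all ones = uniform grating)."""
    R = np.empty_like(lam)
    dz = L / M
    for i, wl in enumerate(lam):
        delta = 2 * np.pi * n_eff * (1 / wl - 1 / lam_B)          # detuning
        F = np.eye(2, dtype=complex)
        for m in range(M):
            kappa = np.pi * profile[m] * dn / wl                   # AC coupling
            sigma = delta + np.pi * profile[m] * dn / wl           # DC self-coupling
            gamma = np.sqrt(complex(kappa**2 - sigma**2))
            T = np.array([[np.cosh(gamma*dz) - 1j*(sigma/gamma)*np.sinh(gamma*dz),
                           -1j*(kappa/gamma)*np.sinh(gamma*dz)],
                          [1j*(kappa/gamma)*np.sinh(gamma*dz),
                           np.cosh(gamma*dz) + 1j*(sigma/gamma)*np.sinh(gamma*dz)]])
            F = T @ F
        R[i] = abs(F[1, 0] / F[0, 0])**2
    return R

uniform = np.ones(M)
gauss = np.exp(-4 * np.log(2) * ((np.arange(M) - M/2) / (M/2))**2)  # Gaussian apodization
print("peak R (uniform): ", reflectivity(lam, uniform).max())
print("peak R (apodized):", reflectivity(lam, gauss).max())
```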

Keywords: fiber bragg gratings, coupled-mode theory, reflectivity, apodization

Procedia PDF Downloads 704
5574 3D Biomechanical Analysis in Shot Put Techniques of International Throwers

Authors: Satpal Yadav, Ashish Phulkar, Krishna K. Sahu

Abstract:

Aim: The research aims at performing a 3-dimensional biomechanical analysis of the shot put techniques of international throwers in order to evaluate performance. Research Method: The researcher adopted the descriptive method, and Pearson's product moment correlation was used to correlate the biomechanical parameters with the performance of the shot put throw. In all analyses, the 5% critical level (p ≤ 0.05) was considered to indicate statistical significance. Research Sample: Eight (N=08) male international shot putters using the rotational/glide technique were selected as subjects for the study. The researcher used the following instruments to obtain reliable measurements: a Tesscorn slow-motion camera, specialized motion analyzer software, a 7.260 kg shot (for male shot-putters) and a steel tape. All measurements pertaining to the biomechanical variables were taken by the principal investigator so that the data collected for the present study could be considered reliable. Results: The findings of the study showed that significant negative relationships with performance were obtained for the angular velocity of the right shoulder and the acceleration distance at pre-flight (-0.70 and -0.72, respectively); significant relationships were obtained for the angular displacement of the knee, the angular velocity of the right shoulder and the acceleration distance at flight (0.81, 0.75 and 0.71, respectively) and for the angular velocity of the right shoulder and the acceleration distance at the transition phase (0.77 and 0.79, respectively); and for the angular displacement of the knee, angular velocity of the right shoulder, release velocity of the shot, angle of release, height of release, projected distance and measured distance, the values (0.76, 0.77, -0.83, -0.79, -0.77, 0.99 and 1.00) were found to be higher than the tabulated value at the 0.05 level of significance. On the other hand, there was an insignificant relationship between shot put performance and the acceleration distance [m], the angular displacement of the shot, the C.G. at release and the horizontal release distance.
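
For readers unfamiliar with the statistic used above, the short Python sketch below shows how a Pearson product-moment correlation and its significance at the 5% level would be computed with SciPy; the numbers are invented placeholders, not the study's data.

```python
# Sketch: Pearson correlation between a kinematic parameter and throw distance.
# Values are hypothetical placeholders for N = 8 throwers.
from scipy.stats import pearsonr

release_velocity = [12.9, 13.4, 13.1, 13.8, 12.7, 13.6, 13.2, 14.0]   # m/s
measured_distance = [19.2, 20.1, 19.6, 20.8, 18.9, 20.5, 19.9, 21.2]  # m

r, p = pearsonr(release_velocity, measured_distance)
print(f"r = {r:.2f}, p = {p:.3f}, significant at 5%: {p <= 0.05}")
```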

Keywords: biomechanics, analysis, shot put, international throwers

Procedia PDF Downloads 187
5573 Hash Based Block Matching for Digital Evidence Image Files from Forensic Software Tools

Authors: M. Kaya, M. Eris

Abstract:

Internet use, intelligent communication tools, and social media have all become an integral part of our daily life as a result of rapid developments in information technology. However, this widespread use increases the number of crimes committed in the digital environment. Therefore, digital forensics, which deals with the various crimes committed in digital environments, has become an important research topic. It is within the scope of digital forensics to investigate digital evidence such as computers, cell phones, hard disks, DVDs, etc. and to report whether it contains any crime-related elements. Many software and hardware tools have been developed for use in the digital evidence acquisition process. Today, the most widely used digital evidence investigation tools are based on the principle of finding all the data in the digital evidence that match specified criteria and presenting them to the investigator (e.g. text files, files starting with the letter A, etc.). Digital forensics experts then carry out data analysis to figure out whether these data are related to a potential crime. Examination of a 1 TB hard disk may take hours or even days, depending on the expertise and experience of the examiner. In addition, because the process depends on the examiner's experience, relevant material may be overlooked and the overall result may vary between cases. In this study, a hash-based matching and digital evidence evaluation method is proposed, which aims to automatically classify evidence containing criminal elements, thereby shortening the digital evidence examination process and preventing human errors.
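
A minimal sketch of the hash-list idea — splitting an evidence image into fixed-size blocks, hashing each block and matching against a list of hashes of known criminal material — is given below in Python; the file names and block size are hypothetical, and the method actually proposed in the paper may differ in detail.

```python
# Sketch: hash-based block matching of a digital evidence image against a
# list of known-bad block hashes. Paths and block size are hypothetical.
import hashlib

BLOCK_SIZE = 4096  # bytes per block

def block_hashes(path, block_size=BLOCK_SIZE):
    """Yield (offset, sha256 hex digest) for each block of the given file."""
    with open(path, "rb") as f:
        offset = 0
        while True:
            block = f.read(block_size)
            if not block:
                break
            yield offset, hashlib.sha256(block).hexdigest()
            offset += len(block)

# Hash list of blocks taken from known criminal material (normally precomputed).
known_bad = {h for _, h in block_hashes("known_bad_material.bin")}

# Scan the evidence image and report matching blocks automatically.
matches = [(off, h) for off, h in block_hashes("evidence_image.dd") if h in known_bad]
print(f"{len(matches)} matching blocks found")
for off, h in matches[:10]:
    print(f"offset {off}: {h}")
```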

Keywords: block matching, digital evidence, hash list, evaluation of digital evidence

Procedia PDF Downloads 255
5572 Enhancing Quality Management Systems through Automated Controls and Neural Networks

Authors: Shara Toibayeva, Irbulat Utepbergenov, Lyazzat Issabekova, Aidana Bodesova

Abstract:

The article discusses the importance of quality assessment as a strategic tool in business and emphasizes the significance of the effectiveness of quality management systems (QMS) for enterprises. The evaluation of these systems takes into account the specificity of quality indicators, the multilevel nature of the system, and the need for optimal selection of the number of indicators and evaluation of the system state, which is critical for making rational management decisions. Methods and models of automated enterprise quality management are proposed, including an intelligent automated quality management system integrated with the Management Information and Control System. These systems make it possible to automate the implementation and support of QMS, increasing the validity, efficiency, and effectiveness of management decisions by automating the functions performed by decision makers and personnel. The paper also emphasizes the use of recurrent neural networks to improve automated quality management. Recurrent neural networks (RNNs) are used to analyze and process sequences of data, which is particularly useful in the context of document quality assessment and non-conformance detection in quality management systems. These networks are able to account for temporal dependencies and complex relationships between different data elements, which improves the accuracy and efficiency of automated decisions. The project was supported by a grant from the Ministry of Education and Science of the Republic of Kazakhstan under the Zhas Galym project No. AR 13268939, dedicated to research and development of digital technologies to ensure consistency of QMS regulatory documents.
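
To convey the recurrent-network idea in concrete terms, here is a deliberately simplified Keras sketch of a sequence classifier that flags a document (encoded as a sequence of token IDs) as conforming or non-conforming; the vocabulary size, sequence length, architecture and random placeholder data are all hypothetical, and the system described in the paper is not limited to this design.

```python
# Sketch: a small recurrent model that classifies a tokenized QMS document
# as conforming (0) or non-conforming (1). All sizes are illustrative.
import numpy as np
import tensorflow as tf

VOCAB_SIZE, MAX_LEN = 5000, 300   # hypothetical vocabulary and document length

model = tf.keras.Sequential([
    tf.keras.layers.Embedding(VOCAB_SIZE, 64),
    tf.keras.layers.LSTM(64),                       # captures long-range dependencies
    tf.keras.layers.Dense(1, activation="sigmoid")  # non-conformance probability
])
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])

# Placeholder data: random token sequences and labels stand in for real documents.
X = np.random.randint(0, VOCAB_SIZE, size=(200, MAX_LEN))
y = np.random.randint(0, 2, size=(200,))
model.fit(X, y, epochs=2, batch_size=32, validation_split=0.2)
```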

Keywords: automated control system, quality management, document structure, formal language

Procedia PDF Downloads 39
5571 Development of a Flexible LoRa-Based Wireless Sensory System for Long-Time Health Monitoring of Civil Structures

Authors: Hui Zhang, Sherif Beskhyroun

Abstract:

In this study, a highly flexible LoRa-based wireless sensing system was used to assess the strain state of building structures. The system was developed to address the local damage limitation of structural health monitoring (SHM) systems. It is part of an intelligent SHM system designed to monitor, collect and transmit strain changes in key structural components. The main purpose of the wireless sensor system is to reduce development and installation costs and to reduce the power consumption of the system, so as to achieve long-time monitoring. The highly stretchable, flexible strain gauge is mounted on the surface of the structure and is waterproof, heat resistant, and low-temperature resistant, greatly reducing the installation and maintenance costs of the sensor. The system was also developed with the aim of using LoRa wireless communication technology to achieve both low power consumption and long-distance transmission, thereby solving the problem of large-scale deployment of sensors to cover more areas in large structures. In the long-term monitoring of the building structure, the system showed very high performance, very low actual power consumption, and stable wireless transmission. The results show that the developed system has high resolution and sensitivity and a strong potential for long-term monitoring.

Keywords: LoRa, SHM system, strain measurement, civil structures, flexible sensing system

Procedia PDF Downloads 103
5570 Real-Space Mapping of Surface Trap States in CIGSe Nanocrystals Using 4D Electron Microscopy

Authors: Riya Bose, Ashok Bera, Manas R. Parida, Anirudhha Adhikari, Basamat S. Shaheen, Erkki Alarousu, Jingya Sun, Tom Wu, Osman M. Bakr, Omar F. Mohammed

Abstract:

This work reports the visualization of charge carrier dynamics on the surface of copper indium gallium selenide (CIGSe) nanocrystals in real space and time using four-dimensional scanning ultrafast electron microscopy (4D S-UEM) and correlates it with the optoelectronic properties of the nanocrystals. The surface of the nanocrystals plays a key role in controlling their applicability for light-emitting and light-harvesting purposes. Typically, for quaternary systems like CIGSe, which have many desirable attributes for optoelectronic applications, the relative abundance of surface trap states acting as non-radiative recombination centres for charge carriers remains a major bottleneck preventing further advancement and commercial exploitation of devices based on these nanocrystals. Though ultrafast spectroscopic techniques allow the presence of picosecond carrier trapping channels to be determined, because of the relatively large penetration depth of the laser beam, the information obtained comes mainly from the bulk of the nanocrystals. Selective mapping of such ultrafast dynamical processes on the surfaces of nanocrystals remains a key challenge, so far out of reach of purely optical time-resolved laser techniques. In S-UEM, the optical pulse generated from a femtosecond (fs) laser system is used to generate electron packets from the tip of the scanning electron microscope, instead of the continuous electron beam used in the conventional setup. This pulse is synchronized with another optical excitation pulse that initiates carrier dynamics in the sample. The principle of S-UEM is to detect the secondary electrons (SEs) generated in the sample, which are emitted from the first few nanometers of the top surface. Constructed at different time delays between the optical and electron pulses, these SE images give direct and precise information about the carrier dynamics on the surface of the material of interest. In this work, we report selective mapping of the surface dynamics of CIGSe nanocrystals in real space and time using 4D S-UEM. We show that the trap states can be considerably passivated by ZnS shelling of the nanocrystals and that the carrier dynamics can be significantly slowed down. We also compare and discuss the S-UEM kinetics with the carrier dynamics obtained from conventional ultrafast time-resolved techniques. Additionally, a direct effect of the trap state removal can be observed in the enhanced photoresponse of the nanocrystals after shelling. Direct observation of surface dynamics will not only provide a profound understanding of the photo-physical mechanisms on nanocrystal surfaces but also make it possible to unlock their full potential for light-emitting and light-harvesting applications.

Keywords: 4D scanning ultrafast microscopy, charge carrier dynamics, nanocrystals, optoelectronics, surface passivation, trap states

Procedia PDF Downloads 295
5569 Variance-Aware Routing and Authentication Scheme for Harvesting Data in Cloud-Centric Wireless Sensor Networks

Authors: Olakanmi Oladayo Olufemi, Bamifewe Olusegun James, Badmus Yaya Opeyemi, Adegoke Kayode

Abstract:

The wireless sensor network (WSN) has made a significant contribution to the emergence of various intelligent services and cloud-based applications. Most of the time, the harvested data are stored on a cloud platform for efficient management and sharing among different services or users. However, the sensitivity of the data makes them prone to various confidentiality- and performance-related attacks during and after harvesting. Various security schemes have been developed to ensure the integrity and confidentiality of WSN data. However, their specificity towards particular attacks and the resource constraints and heterogeneity of WSNs make most of these schemes imperfect. In this paper, we propose a secure variance-aware routing and authentication scheme with two-tier verification to collect, share, and manage WSN data. The scheme is capable of classifying a WSN into different subnets, detecting any attempt at a wormhole or black hole attack during harvesting, and enforcing access control on the harvested data stored in the cloud. The results of the analysis showed that the proposed scheme has more security functionalities than other related schemes, solves most WSN and cloud security issues, prevents wormhole and black hole attacks, identifies the attackers during data harvesting, and enforces access control on the harvested data stored in the cloud at low computational, storage, and communication overheads.

Keywords: data block, heterogeneous IoT network, data harvesting, wormhole attack, blackhole attack, access control

Procedia PDF Downloads 84
5568 Quality Evaluation of Grape Seed Oils of the Ionian Islands Based on GC-MS and Other Spectroscopic Techniques

Authors: I. Oikonomou, I. Lappa, D. Daferera, C. Kanakis, L. Kiokakis, K. Skordilis, A. Avramouli, E. Kalli, C. Pappas, P. A. Tarantilis, E. Skotti

Abstract:

Grape seeds are waste products of wineries and are often referred to as an important agricultural and industrial waste product with the potential to be used in pharmaceutical, food, and cosmetic applications. In this study, grape seed oil from traditional Ionian varieties was examined to determine the quality and characteristics of each variety. Initially, the fatty acid methyl ester (FAME) profiles were analyzed using Gas Chromatography-Mass Spectrometry after transesterification. Furthermore, other quality parameters of the grape seed oils were determined by spectroscopic techniques, including UV-Vis and Raman. Moreover, the antioxidant capacity of the oils was measured by 2,2'-azino-bis-3-ethylbenzothiazoline-6-sulfonic acid (ABTS) and 2,2-diphenyl-1-picrylhydrazyl (DPPH) assays and expressed in Trolox equivalents. K and ΔΚ indices were measured at 232, 268 and 270 nm as oil quality indices. The results indicate that the total oil content of the air-dried grape seeds ranged from 5.26 to 8.77% w/w, which is in accordance with other grape seed varieties tested in similar studies. The composition of grape seed oil is dominated by linoleic and oleic fatty acids, with linoleic acid ranging from 53.68 to 69.95% and the linoleic and oleic acids together totaling 78-82% of the FAMEs, which is analogous to the fatty acid composition of safflower oil. The ABTS and DPPH antioxidant assays scored high, showing that the oils have potential in the cosmetic and culinary businesses. Beyond that, our results demonstrate that Ionian grape seed oils have prospects that can go further than cosmetic or culinary use, into the pharmaceutical industry. Finally, the reclamation of grape seeds from the wineries' waste stream is in accordance with the bio-economy strategic framework and contributes to environmental protection.

Keywords: antioxidant capacity, fatty acid methyl esters, grape seed oil, GC-MS

Procedia PDF Downloads 204
5567 Statistical Comparison of Ensemble Based Storm Surge Forecasting Models

Authors: Amin Salighehdar, Ziwen Ye, Mingzhe Liu, Ionut Florescu, Alan F. Blumberg

Abstract:

Storm surge is an abnormal rise in water level caused by a storm. Accurate prediction of storm surge is a challenging problem. Researchers have developed various ensemble modeling techniques to combine several individual forecasts and produce an overall, presumably better forecast. Some simple ensemble modeling techniques exist in the literature. For instance, Model Output Statistics (MOS) and running mean-bias removal are widely used techniques in the storm surge prediction domain. However, these methods have some drawbacks; for instance, MOS is based on multiple linear regression and needs a long period of training data. To overcome the shortcomings of these simple methods, researchers have proposed more advanced methods. For instance, ENSURF (Ensemble SURge Forecast) is a multi-model application for sea level forecasting; this application creates a better forecast of sea level using a combination of several instances of Bayesian Model Averaging (BMA). An ensemble dressing method is based on identifying the best member forecast and using it for prediction. Our contribution in this paper can be summarized as follows. First, we investigate whether ensemble models perform better than any single forecast. To do so, we need to identify the single best forecast, and we present a methodology based on a simple Bayesian selection method to select it. Second, we present several new and simple ways to construct ensemble models; we use correlation and standard deviation as weights in combining the different forecast models. Third, we use these ensembles and compare them with several existing models from the literature to forecast storm surge levels. We then investigate whether developing a complex ensemble model is indeed needed; to achieve this goal, we use a simple average (one of the simplest and most widely used ensemble models) as a benchmark. Predicting the peak surge level during a storm, as well as the precise time at which this peak occurs, is crucial; thus we develop a statistical platform to compare the performance of various ensemble methods. This statistical analysis is based on the root mean square error of the ensemble forecast during the testing period and on the magnitude and timing of the forecasted peak surge compared to the actual peak and its time. In this work, we analyze four hurricanes: hurricanes Irene and Lee in 2011, hurricane Sandy in 2012, and hurricane Joaquin in 2015. Since hurricane Irene developed at the end of August 2011 and hurricane Lee started just after Irene at the beginning of September 2011, in this study we consider them as a single contiguous hurricane event. The data set used for this study was generated by the New York Harbor Observing and Prediction System (NYHOPS). We find that even the simplest possible way of creating an ensemble produces results superior to any single forecast. We also show that the ensemble models we propose generally have better performance compared to the simple average ensemble technique.
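
One of the simple weighting ideas described above — weighting member models by their correlation with the observations and comparing against the simple-average benchmark by RMSE and peak timing — could be sketched as follows in Python; the arrays are synthetic placeholders for NYHOPS member forecasts and the observed surge, and in a real setting the weights would be fitted on a training period and evaluated on a separate test period.

```python
# Sketch: combine several surge forecasts with correlation-based weights and
# compare RMSE against the simple-average ensemble. Data are synthetic.
import numpy as np

rng = np.random.default_rng(0)
t = np.linspace(0, 4 * np.pi, 200)
obs = np.sin(t) + 0.3 * np.sin(3 * t)                  # "observed" surge
members = np.stack([obs + rng.normal(0, s, t.size)      # member forecasts with
                    for s in (0.1, 0.3, 0.6)])          # different error levels

def rmse(pred, obs):
    return np.sqrt(np.mean((pred - obs) ** 2))

# Weights from each member's correlation with the observations.
corr = np.array([np.corrcoef(m, obs)[0, 1] for m in members])
w = np.clip(corr, 0, None)
w /= w.sum()

simple_avg = members.mean(axis=0)
weighted = w @ members

print("RMSE simple average :", rmse(simple_avg, obs))
print("RMSE corr-weighted  :", rmse(weighted, obs))
print("peak timing error (steps):", abs(np.argmax(weighted) - np.argmax(obs)))
```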

Keywords: Bayesian learning, ensemble model, statistical analysis, storm surge prediction

Procedia PDF Downloads 309
5566 Geo-Collaboration Model between a City and Its Inhabitants to Develop Complementary Solutions for Better Household Waste Collection

Authors: Abdessalam Hijab, Hafida Boulekbache, Eric Henry

Abstract:

According to several research studies, the city as a whole is a complex, spatially organized system; its modeling must take into account several factors — socio-economic, political or geographical — acting at multiple scales of observation according to varied temporalities. Sustainable management and protection of the environment in this complex system require significant human and technical investment, particularly for monitoring and maintenance. The objective of this paper is to propose an intelligent approach based on the coupling of Geographic Information System (GIS) and Information and Communications Technology (ICT) tools in order to integrate the inhabitants into the processes of sustainable management and protection of the urban environment, specifically in the processes of household waste collection in urban areas. We are discussing a collaborative 'city/inhabitant' space. Indeed, it is a geo-collaborative approach, based on the spatialization and real-time geo-localization of topological and multimedia data captured by the 'active' inhabitant, in the form of geo-localized alerts related to household waste issues in their city. Our proposal provides a good understanding of the extent to which civil society (the inhabitants) can help and contribute to the development of complementary solutions for the collection of household waste and the protection of the urban environment. Moreover, it allows the inhabitants to contribute to the enrichment of a data bank for future uses. Our geo-collaborative model will be tested in the Lamkansa sampling district of the city of Casablanca in Morocco.

Keywords: geographic information system, GIS, information and communications technology, ICT, geo-collaboration, inhabitants, city

Procedia PDF Downloads 116
5565 Driver Take-Over Time When Resuming Control from Highly Automated Driving in Truck Platooning Scenarios

Authors: Bo Zhang, Ellen S. Wilschut, Dehlia M. C. Willemsen, Marieke H. Martens

Abstract:

With the rapid development of intelligent transportation systems, automated platooning of trucks is drawing increasing interest for its beneficial effects on safety, energy consumption and traffic flow efficiency. Nevertheless, one major challenge lies in the safe transition of control from the automated system back to the human driver, especially when drivers have been inattentive after a long period of highly automated driving. In this study, we investigated driver take-over time after a system-initiated request to leave the platooning system Virtual Tow Bar in a non-critical scenario. 22 professional truck drivers participated in the truck driving simulator experiment, and each was instructed to drive under three experimental conditions before the presentation of the take-over request (TOR): driver ready (drivers were instructed to monitor the road constantly), driver not-ready (drivers were provided with a tablet) and eyes-shut. The results showed significantly longer take-over times in both the driver not-ready and eyes-shut conditions compared with the driver ready condition. Further analysis revealed hand movement time as the main factor causing the long response time in the driver not-ready condition, while in the eyes-shut condition gaze reaction time also influenced the total take-over time considerably. In addition to the differences in means, large individual differences were found, especially in the two non-attentive driver conditions. We conclude that a personalized driver readiness predictor is important for a safe transition.

Keywords: driving simulation, highly automated driving, take-over time, transition of control, truck platooning

Procedia PDF Downloads 253
5564 Hyperspectral Imaging and Nonlinear Fukunaga-Koontz Transform Based Food Inspection

Authors: Hamidullah Binol, Abdullah Bal

Abstract:

Nowadays, food safety is a great public concern; therefore, robust and effective techniques are required for assessing the safety of goods. Hyperspectral Imaging (HSI) is an attractive tool for researchers inspecting food quality and safety, with applications such as meat quality assessment, automated poultry carcass inspection, quality evaluation of fish, bruise detection in apples, quality analysis and grading of citrus fruits, bruise detection in strawberries, visualization of the sugar distribution of melons, measuring the ripening of tomatoes, defect detection in pickling cucumbers, and classification of wheat kernels. HSI can be used to concurrently collect large amounts of spatial and spectral data on the objects being observed. This technique yields exceptional detection capabilities, which otherwise cannot be achieved with either imaging or spectroscopy alone. This paper presents a nonlinear technique based on the kernel Fukunaga-Koontz transform (KFKT) for the detection of fat content in ground meat using HSI. The KFKT, which is the nonlinear version of the FKT, is one of the most effective techniques for solving problems of a two-pattern nature. The conventional FKT method has been improved with kernel machines to increase the nonlinear discrimination ability and capture higher-order statistics of the data. The approach proposed in this paper aims to segment the fat content of the ground meat by regarding the fat as the target class, which is to be separated from the remaining classes (treated as clutter). We have applied the KFKT to visible and near-infrared (VNIR) hyperspectral images of ground meat to determine the fat percentage. The experimental studies indicate that the proposed technique produces high detection performance for the fat ratio in ground meat.
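
To illustrate the two-pattern structure that the Fukunaga-Koontz transform exploits, here is a minimal linear FKT sketch (the paper uses the kernelized version, which this does not reproduce) applied to synthetic "fat" and "clutter" spectra: after jointly whitening the two class second-moment matrices, the eigenvectors with the largest eigenvalues for one class are automatically the weakest for the other, which is what allows the target class to be separated.

```python
# Sketch: linear Fukunaga-Koontz transform (FKT) for a two-pattern
# (target = fat, clutter = rest) detection problem. The "spectra" are synthetic
# placeholders, not real VNIR hyperspectral data, and the kernel version (KFKT)
# used in the paper is not reproduced here.
import numpy as np

rng = np.random.default_rng(1)
bands = np.linspace(900, 1700, 64)                  # hypothetical band centers [nm]

def make_spectra(n, peak, width):
    base = np.exp(-((bands - peak) / width) ** 2)    # one broad spectral feature
    return base + 0.05 * rng.normal(size=(n, bands.size))

fat = make_spectra(300, 1210, 60)     # target class (hypothetical "fat" feature)
lean = make_spectra(600, 1450, 120)   # clutter class

def second_moment(X):
    return X.T @ X / len(X)

R1, R2 = second_moment(fat), second_moment(lean)

# Joint whitening of R1 + R2, then eigendecomposition of the whitened R1.
lam, Phi = np.linalg.eigh(R1 + R2)
W = Phi @ np.diag(1.0 / np.sqrt(lam))
e1, V = np.linalg.eigh(W.T @ R1 @ W)   # eigenvalues of whitened R2 are 1 - e1

order = np.argsort(e1)[::-1]
target_dirs = V[:, order[:3]]          # best for target, worst for clutter
clutter_dirs = V[:, order[-3:]]        # best for clutter, worst for target

def fat_score(x):
    z = W.T @ x
    return np.sum((target_dirs.T @ z) ** 2) - np.sum((clutter_dirs.T @ z) ** 2)

print("mean score, fat pixels :", np.mean([fat_score(x) for x in fat]))
print("mean score, lean pixels:", np.mean([fat_score(x) for x in lean]))
```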

Keywords: food (ground meat) inspection, Fukunaga-Koontz transform, hyperspectral imaging, kernel methods

Procedia PDF Downloads 431
5563 Elucidating Microstructural Evolution Mechanisms in Tungsten via Layerwise Rolling in Additive Manufacturing: An Integrated Simulation and Experimental Approach

Authors: Sadman Durlov, Aditya Ganesh-Ram, Hamidreza Hekmatjou, Md Najmus Salehin, Nora Shayesteh Ameri

Abstract:

In the field of additive manufacturing, tungsten stands out for its exceptional resistance to high temperatures, making it an ideal candidate for use in extreme conditions. However, its inherent brittleness and vulnerability to thermal cracking pose significant challenges to its manufacturability. This study explores the microstructural evolution of tungsten processed through layer-wise rolling in laser powder bed fusion additive manufacturing, utilizing a comprehensive approach that combines advanced simulation techniques with empirical research. We aim to uncover the complex processes of plastic deformation and microstructural transformations, with a particular focus on the dynamics of grain size, boundary evolution, and phase distribution. Our methodology employs a combination of simulation and experimental data, allowing for a detailed comparison that elucidates the key mechanisms influencing microstructural alterations during the rolling process. This approach facilitates a deeper understanding of the material's behavior under additive manufacturing conditions, specifically in terms of deformation and recrystallization. The insights derived from this research not only deepen our theoretical knowledge but also provide actionable strategies for refining manufacturing parameters to improve the tungsten components' mechanical properties and functional performance. By integrating simulation with practical experimentation, this study significantly enhances the field of materials science, offering a robust framework for the development of durable materials suited for challenging operational environments. Our findings pave the way for optimizing additive manufacturing techniques and expanding the use of tungsten across various demanding sectors.

Keywords: additive manufacturing, layer wise rolling, refractory materials, in-situ microstructure modifications

Procedia PDF Downloads 61
5562 Information Visualization Methods Applied to Nanostructured Biosensors

Authors: Osvaldo N. Oliveira Jr.

Abstract:

The control of molecular architecture inherent in some experimental methods to produce nanostructured films has had a great impact on devices of various types, including sensors and biosensors. The self-assembled monolayer (SAM) and electrostatic layer-by-layer (LbL) techniques, for example, are now routinely used to produce tailored architectures for biosensing in which biomolecules are immobilized with long-lasting preserved activity. Enzymes, antigens, antibodies, peptides and many other molecules serve as the molecular recognition elements for detecting an equally wide variety of analytes. The principles of detection are also varied, including electrochemical methods, fluorescence spectroscopy and impedance spectroscopy. In this presentation, an overview will be provided of biosensors made with nanostructured films to detect antibodies associated with tropical diseases and HIV, in addition to the detection of analytes of medical interest such as cholesterol and triglycerides. Because large amounts of data are generated in biosensing experiments, computational and statistical methods have been used to optimize performance. Multidimensional projection techniques such as Sammon's mapping have been shown to be more efficient than traditional multivariate statistical analysis in identifying small concentrations of anti-HIV antibodies and in distinguishing between blood serum samples of animals infected with two tropical diseases, namely Chagas' disease and Leishmaniasis. Optimization of biosensing may include a combination of another information visualization method, the parallel coordinates technique, with artificial intelligence methods in order to identify the most suitable frequencies for reaching higher sensitivity using impedance spectroscopy. Also discussed will be the possible convergence of technologies, through which machine learning and other computational methods may be used to treat data from biosensors within an expert system for clinical diagnosis.
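
As a small illustration of the multidimensional projection and parallel-coordinate ideas mentioned above (using generic tools, not the authors' own pipeline), the sketch below projects a labelled feature table to 2-D with metric MDS — a close relative of Sammon's mapping, used here only as a stand-in — and draws a parallel coordinates plot with pandas; the iris data set is a placeholder for real biosensing measurements.

```python
# Sketch: project labelled sensor-style features to 2-D with metric MDS
# (a stand-in for Sammon's mapping) and draw a parallel coordinates plot.
import matplotlib.pyplot as plt
import pandas as pd
from sklearn.datasets import load_iris
from sklearn.manifold import MDS

iris = load_iris(as_frame=True)
df = iris.frame.rename(columns={"target": "class"})

# Multidimensional projection of the feature vectors.
coords = MDS(n_components=2, random_state=0).fit_transform(df.drop(columns="class"))
plt.figure()
plt.scatter(coords[:, 0], coords[:, 1], c=df["class"])
plt.title("Metric MDS projection (stand-in for Sammon's mapping)")

# Parallel coordinates view of the same samples.
plt.figure()
pd.plotting.parallel_coordinates(df, "class", colormap="viridis")
plt.show()
```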

Keywords: clinical diagnosis, information visualization, nanostructured films, layer-by-layer technique

Procedia PDF Downloads 337
5561 Development and Implementation of a Business Technology Program Based on Techniques for Reusing Water in a Colombian Company

Authors: Miguel A. Jimenez Barros, Elyn L. Solano Charris, Luis E. Ramirez, Lauren Castro Bolano, Carlos Torres Barreto, Juliana Morales Cubillo

Abstract:

This project sought to mitigate the high levels of water consumption in industrial processes, in accordance with the water-rationing plans promoted at national and international levels in response to the water consumption projections published by the United Nations. Water consumption has three main uses — municipal (common use), agricultural and industrial — where the latter accounts for a minor percentage (around 20% of total consumption). Aware of world water scarcity, a Colombian company responsible for the production of mass-consumption products decided to implement policies and techniques for water treatment, recycling, and reuse. The project consisted of a business technology program that permits a better use of the wastewater produced by production operations. This approach reduces potable water consumption, improves the condition of the water discharged to the sewage system, generates a positive environmental impact for the region, and serves as a reference model at national and international levels. In order to achieve the objective, a process flow diagram was used to identify the industrial processes that required potable water. This strategy allowed the industry to determine a water reuse plan at the operational level without affecting the requirements associated with the manufacturing process and, moreover, to support the activities carried out in the administrative buildings. Afterwards, the company evaluated and selected the chemical and biological processes required for water reuse, in compliance with Colombian law. The implementation of the business technology program optimized the water use and recirculation rate by up to 70%, accomplishing an important reduction of the regional environmental impact.

Keywords: bio-reactor, potable water, reverse osmosis, water treatment

Procedia PDF Downloads 235
5560 Rejuvenation of Aged Kraft-Cellulose Insulating Paper Used in Transformers

Authors: Y. Jeon, A. Bissessur, J. Lin, P. Ndungu

Abstract:

Most transformers use cellulose paper that has been chemically modified through the Kraft process and acts as an effective insulator. Cellulose ageing and oil degradation are directly linked to fouling of the transformer and the accumulation of large quantities of waste insulating paper. In addition to technical difficulties, this proves costly for power utilities to deal with. Currently, no cost-effective method for the rejuvenation of cellulose paper has been documented or proposed, since renewal of the used insulating paper is implemented as the best option. This study proposes and contrasts different rejuvenation methods for accelerated-aged cellulose insulating paper based on chemical and bio-bleaching processes. Of the three bleaching methods investigated, two are chemical — conventional chlorine-based sodium hypochlorite (m/v) and chlorine-free hydrogen peroxide (v/v) — whilst the third is a bio-bleaching technique that uses a bacterial isolate, Acinetobacter strain V2. For chemical bleaching, reagent strengths of 0.3%, 0.6%, 0.9%, 1.2%, 1.5% and 1.8% were analyzed over 4 hrs. Bio-bleaching employed the bacterial isolate, Acinetobacter strain V2, to bleach the aged Kraft paper over 4 hrs. Determination of the alpha cellulose content, degree of polymerization and viscosity was carried out on the Kraft-cellulose insulating paper before and after bleaching. Overall, the investigated techniques of chemical and bio-bleaching were successful and effective in treating degraded and accelerated-aged Kraft-cellulose insulating paper, although to varying extents. Optimum conditions for chemical bleaching were attained at bleaching strengths of 1.2% (m/v) NaOCl and 1.5% (v/v) H2O2, yielding alpha cellulose contents of 82.4% and 80.7% and degrees of polymerization of 613 and 616, respectively. Bio-bleaching using Acinetobacter strain V2 proved to be the superior technique, with an alpha cellulose level of 89.0% and a degree of polymerization of 620. Chemical bleaching techniques require careful and controlled clean-up treatments as they are chlorine and hydrogen peroxide based, while bio-bleaching is an extremely eco-friendly technique.

Keywords: alpha cellulose, bio-bleaching, degree of polymerization, Kraft-cellulose insulating paper, transformer, viscosity

Procedia PDF Downloads 270
5559 The Prevalence of Organized Retail Crime in Riyadh, Saudi Arabia

Authors: Saleh Dabil

Abstract:

This study investigates the prevalence of organized retail crime in supermarkets in Riyadh, Saudi Arabia. Store managers, security managers and general employees were asked about the types of retail crime that occur in their stores. Three independent variables were related to reports of organized retail theft: (1) the supermarket profile (volume, location, standard and type of store), (2) the social and physical environment of the store (maintenance, cleanliness and overall organizational cooperation), and (3) the security techniques and electronic loss prevention techniques used. The theoretical framework of the study is based on social disorganization theory. The study concluded that organized retail theft is moderately apparent in Riyadh stores. The general results showed that the store environment affects the prevalence of organized retail theft in relation to the gender of the thieves, their age groups, the working shift, the type of stolen items and the number of thieves involved in a single case. Among other reasons, one factor in organized theft is the economic pressure on customers, which depends on the location of the store. How stores deal with theft was also investigated to obtain a clear picture of their responses to organized retail theft. The results showed that thieves are mostly sent away without any action and are sometimes given a written warning; very few cases are handed over to the police. Other factors examined in the study can be found in the full text. To address the problem of organized theft, the study suggests, first, distributing duties and responsibilities well among employees, especially for security purposes; second, installing a strong security system and designing the store layout well; and third, providing training for general employees and periodic security-skills training. Other suggestions can be found in the full text.
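
The abstract relates store attributes to reports of organized theft but does not name a statistical procedure. As a hedged illustration only, the sketch below shows one common way such a relationship could be examined, a cross-tabulation with a chi-square test; the records, the location categories and the outcome coding are all invented for the example and do not come from the Riyadh survey.

```python
import pandas as pd
from scipy.stats import chi2_contingency

# Hypothetical survey records, invented purely to illustrate the analysis idea.
records = pd.DataFrame({
    "store_location": ["downtown", "suburb", "downtown", "suburb", "downtown", "suburb"],
    "organized_theft_reported": ["yes", "no", "yes", "yes", "no", "no"],
})

# Cross-tabulate location against reported theft and test for independence.
table = pd.crosstab(records["store_location"], records["organized_theft_reported"])
chi2, p_value, dof, expected = chi2_contingency(table)
print(table)
print(f"chi2 = {chi2:.2f}, p = {p_value:.3f}")
```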

Keywords: organized crime, retail, theft, loss prevention, store environment

Procedia PDF Downloads 196
5558 The DAQ Debugger for iFDAQ of the COMPASS Experiment

Authors: Y. Bai, M. Bodlak, V. Frolov, S. Huber, V. Jary, I. Konorov, D. Levit, J. Novy, D. Steffen, O. Subrt, M. Virius

Abstract:

In general, state-of-the-art Data Acquisition Systems (DAQ) in high energy physics experiments must satisfy high requirements in terms of reliability, efficiency and data rate capability. This paper presents the development and deployment of a debugging tool named DAQ Debugger for the intelligent, FPGA-based Data Acquisition System (iFDAQ) of the COMPASS experiment at CERN. Utilizing a hardware event builder, the iFDAQ is designed to read out data at the experiment's average maximum rate of 1.5 GB/s. In complex software such as the iFDAQ, comprising thousands of lines of code, the debugging process is absolutely essential to reveal all software issues. Unfortunately, conventional debugging of the iFDAQ is not possible during real data taking. The DAQ Debugger is a tool for identifying a problem, isolating its source, and then either correcting the problem or determining a way to work around it. It provides a layer that is easy to integrate into any process and has no impact on process performance. Based on the handling of system signals, the DAQ Debugger represents an alternative to the conventional debuggers provided by most integrated development environments. Whenever a problem occurs, it generates reports containing all the information needed for deeper investigation and analysis. The DAQ Debugger was fully incorporated into all iFDAQ processes during the 2016 run. It helped to reveal the remaining software issues and significantly improved the stability of the system in comparison with the previous run. In the paper, we present the DAQ Debugger from several perspectives and discuss it in detail.
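
The abstract describes a debugger built on system-signal handling that writes diagnostic reports when a process misbehaves. The sketch below is a minimal, hedged illustration of that idea in Python; the actual iFDAQ tool is not described as Python-based (the keywords mention the Qt framework), and the report path, signal set and report format here are assumptions made only for the example.

```python
import faulthandler
import signal
import sys
import traceback
from datetime import datetime

REPORT_PATH = "daq_debug_report.txt"   # hypothetical report location

def write_report(signum, frame):
    """Append a timestamped stack trace to the report when a signal arrives,
    exiting only for SIGTERM so normal operation is disturbed as little as possible."""
    with open(REPORT_PATH, "a") as report:
        report.write(f"--- signal {signum} at {datetime.now().isoformat()} ---\n")
        traceback.print_stack(frame, file=report)
    if signum == signal.SIGTERM:
        sys.exit(0)

# Catch hard crashes (SIGSEGV, SIGABRT, ...) with the interpreter's fault handler,
# and termination/user signals with our own report writer (POSIX only).
fatal_log = open("daq_debug_report_fatal.txt", "a")
faulthandler.enable(fatal_log)
for sig in (signal.SIGTERM, signal.SIGUSR1):
    signal.signal(sig, write_report)

print("signal handlers installed; send SIGUSR1 to get a live stack report")
signal.pause()   # wait for a signal
```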

Keywords: DAQ Debugger, data acquisition system, FPGA, system signals, Qt framework

Procedia PDF Downloads 284
5557 Lightweight Ceramics from Clay and Ground Corncobs

Authors: N. Quaranta, M. Caligaris, R. Varoli, A. Cristobal, M. Unsen, H. López

Abstract:

Corncobs are an agricultural waste that can be used as fuel or as a raw material in different industrial processes such as cement manufacture, contaminant adsorption, and chemical compound synthesis. The aim of this work is to characterize this waste and analyze the feasibility of its use as a pore-forming material in the manufacture of lightweight ceramics for the civil construction industry. The raw materials are characterized using various techniques: X-ray diffraction analysis, differential and gravimetric thermal analyses, FTIR spectroscopy, and ecotoxicity evaluation, among others. The ground corncobs, with particle size below 2 mm, are mixed with clay at up to 30% by volume and shaped by uniaxial pressing at 25 MPa, with 6% humidity, in moulds of 70 mm x 40 mm x 18 mm. The green bodies are then heat treated at 950°C for two hours, following the treatment curves used in the ceramic industry. The ceramic specimens are characterized by several techniques: density, porosity and water absorption, permanent volumetric variation, loss on ignition, microscopy analysis, and mechanical properties. DTA-TGA analysis of the corncobs shows a small mass loss in the TGA curve in the 20-250°C range and exothermic peaks between 250°C and 500°C. The FTIR spectrum of the corncob sample shows the characteristic pattern of this kind of organic matter, with stretching vibration bands of adsorbed water, methyl groups, C–O and C–C bonds, and the complex form of the cellulose and hemicellulose glycosidic bonds. The obtained ceramic bodies present good external characteristics, without loose edges, and properties adequate for market requirements. The porosity values of the sintered pieces are higher than those of the reference sample without waste addition. The results generally indicate that it is possible to use corncobs as a pore former in ceramic bodies without modifying the sintering temperatures usually employed in the industry.
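
The abstract lists density, porosity and water absorption among the measured properties but does not state the measurement procedure. The sketch below shows the standard Archimedes-type calculation often used for fired ceramic bodies, with invented sample weights, purely to illustrate how the three quantities are related; it is not presented as the authors' protocol.

```python
WATER_DENSITY = 1.0  # g/cm^3, assumed for room-temperature water

def archimedes_properties(w_dry, w_sat, w_susp):
    """Bulk density, apparent porosity and water absorption of a fired body
    from its dry, water-saturated and suspended (immersed) weights in grams."""
    exterior_volume = (w_sat - w_susp) / WATER_DENSITY                # cm^3
    bulk_density = w_dry / exterior_volume                            # g/cm^3
    apparent_porosity = 100.0 * (w_sat - w_dry) / (w_sat - w_susp)    # %
    water_absorption = 100.0 * (w_sat - w_dry) / w_dry                # %
    return bulk_density, apparent_porosity, water_absorption

# Invented weights for a hypothetical corncob-clay specimen; not data from the study.
density, porosity, absorption = archimedes_properties(w_dry=85.0, w_sat=98.0, w_susp=47.0)
print(f"bulk density: {density:.2f} g/cm^3")
print(f"apparent porosity: {porosity:.1f} %")
print(f"water absorption: {absorption:.1f} %")
```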

Keywords: ceramic industry, biomass, recycling, hemicellulose glycosidic bonds

Procedia PDF Downloads 405