Search results for: automated microscope
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 1610


1310 Management Potentialities of Rice Blast Disease Caused by Magnaporthe grisea Using New Nanofungicides Derived from Chitosan

Authors: Abdulaziz Bashir Kutawa, Khairulmazmi Ahmad, Mohd Zobir Hussein, Asgar Ali, Mohd Aswad Abdul Wahab, Amara Rafi, Mahesh Tiran Gunasena, Muhammad Ziaur Rahman, Md Imam Hossain, and Syazwan Afif Mohd Zobir

Abstract:

Various abiotic and biotic stresses affect rice production all around the world. Rice blast, the most serious and prevalent disease of rice plants, is one of the major obstacles to rice production and among the diseases with the greatest negative effects on rice farming globally; it is caused by the fungus Magnaporthe grisea. Since nanoparticles have been shown to inhibit certain fungi, nanotechnology offers a novel approach to enhancing agriculture by combating plant diseases. Nanocarrier systems enable active chemicals to be absorbed, attached, and encapsulated to produce efficient nanodelivery formulations. The objectives of this research work were to determine the efficacy and mode of action of the nanofungicides in vitro and under field conditions (in vivo). The ionic gelation method was used to develop the nanofungicides. The in-vitro antifungal activity of the synthesized agronanofungicides against M. grisea was assessed using the poisoned media method: potato dextrose agar (PDA) was amended with the nanofungicides at several concentrations (0.001, 0.005, 0.01, 0.025, 0.05, 0.1, 0.15, 0.20, 0.25, 0.30, and 0.35 ppm), and medium with only the solvent served as a control. Mycelial growth was measured daily, and the percentage inhibition of radial growth (PIRG) was computed. Based on the zone-of-inhibition results, the chitosan-hexaconazole agronanofungicide (2 g/mL) was the most effective fungicide, fully inhibiting the growth of the fungus (100%) at 0.2, 0.25, 0.30, and 0.35 ppm. It was followed by the carbendazim analytical fungicide, which achieved 100% inhibition at 5, 10, 25, 50, and 100 ppm. Propiconazole and basamid were the least effective fungicides, reaching 100% inhibition only at 100 ppm.
Scanning electron microscopy (SEM), confocal laser scanning microscopy (CLSM), and transmission electron microscopy (TEM) were used to study the mechanisms of action on the M. grisea fungal cells. The results showed that carbendazim, chitosan-hexaconazole, and HXE were the most effective fungicides in disrupting the mycelia and the internal structures of the fungal cells. The field assessment showed that the CHDEN treatment (5 g/L, double dosage) was the most effective in reducing the intensity of rice blast disease, with a DSI of 17.56%, a lesion length of 0.43 cm, a DR of 82.44%, an AUDPC of 260.54 unit², and a PI of 65.33%. The least effective treatment was chitosan-hexaconazole-dazomet (2.5 g/L, MIC). The use of CHDEN and CHEN nanofungicides will significantly help lessen the severity of rice blast in the field, increasing output and profit for rice farmers.
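The daily growth measurements reduce to a single formula. As a minimal sketch (the radii below are illustrative, not the paper's data), PIRG compares the radial growth of treated and control colonies:

```python
def pirg(radial_growth_control_mm, radial_growth_treated_mm):
    """Percentage Inhibition of Radial Growth for a poisoned-media assay."""
    rc, rt = radial_growth_control_mm, radial_growth_treated_mm
    return (rc - rt) / rc * 100.0

# Example: control colony radius 45 mm, treated colony radius 9 mm.
print(round(pirg(45.0, 9.0), 1))  # -> 80.0
```

A treatment that fully suppresses growth (treated radius 0) gives the 100% inhibition reported for the most effective concentrations.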

Keywords: chitosan, hexaconazole, disease incidence, Magnaporthe grisea

Procedia PDF Downloads 41
1309 Innovative Technologies for Aeration and Feeding of Fish in Aquaculture with Minimal Impact on the Environment

Authors: Vasile Caunii, Andreea D. Serban, Mihaela Ivancia

Abstract:

The paper presents a new, circular-economy approach to technologies for feeding and aerating water basins and impoundments used for fish farming and aquaculture. Because fish is, and will remain, one of the main foods on the planet, the use of bio-eco-technologies is a priority for all producers. The technologies proposed in the paper aim to reduce the operating costs of ponds and water impoundments by a substantial percentage, using non-polluting technologies with minimal environmental impact. The paper proposes two innovative, fully automated intelligent systems that share a common, completely eco-friendly platform. One system aerates the water of the fish pond; the second feeds the fish by dispersing an optimal amount of fodder depending on population size, age, and habits. Both systems use a floating platform and regenerative energy sources, are equipped with intelligent and innovative subsystems, and, in addition to fully automated operation, significantly reduce the costs of aerating natural or artificial water bodies and feeding fish. The intelligent feeding system additionally optimizes the amount of food, preventing water pollution and the growth of bacteria and other microorganisms. The advantages of the systems are: increased fish production yield; green installations with zero pollutant emissions; placement anywhere on the water surface according to the user's needs; autonomous or remote-controlled operation; accurate fault reporting to the operator in case of component failure, significantly reducing maintenance costs; and transmission of data on the water's physical and chemical parameters.

Keywords: bio-eco-technologies, economy, environment, fish

Procedia PDF Downloads 117
1308 Cu3SbS3 as Anode Material for Sodium Batteries

Authors: Atef Y. Shenouda, Fei Xu

Abstract:

Cu₃SbS₃ (CAS) was synthesized by direct solid-state reaction from elemental Cu, Sb, and S, and by hydrothermal reaction using thioacetamide (TAM). The crystal structure and morphology of the prepared Cu₃SbS₃ phases were studied via X-ray diffraction (XRD) and field emission scanning electron microscopy (FESEM). The band gap energies are 2 and 2.2 eV for the prepared samples. The two samples were tested as anodes for Na-ion storage. They show a high initial capacity of up to 490 mAh/g. The Na cell prepared from the TAM sample retains 280 mAh/g after 25 cycles, versus 60 mAh/g for the cell from the elemental sample.

Keywords: Cu3SbS3, sodium batteries, thioacetamide, sulphur sources

Procedia PDF Downloads 38
1307 High-Frequency Acoustic Microscopy Imaging of Pellet/Cladding Interface in Nuclear Fuel Rods

Authors: H. Saikouk, D. Laux, Emmanuel Le Clézio, B. Lacroix, K. Audic, R. Largenton, E. Federici, G. Despaux

Abstract:

Pressurized Water Reactor (PWR) fuel rods are made of ceramic pellets (e.g., UO₂ or (U,Pu)O₂) assembled in a zirconium cladding tube. By design, an initial gap exists between these two elements. During irradiation, both undergo transformations that progressively lead to the closure of this gap. A local, non-destructive examination of the pellet/cladding interface could help identify the zones where the two materials are in contact, particularly at high burnups, when strong chemical bonding occurs under nominal operating conditions in PWR fuel rods. The evolution of the pellet/cladding bonding during irradiation is also of interest. In this context, the Institute of Electronics and Systems (IES, UMR CNRS 5214), in collaboration with the Alternative Energies and Atomic Energy Commission (CEA), is developing a high-frequency acoustic microscope adapted to the inspection and high-resolution imaging of the pellet/cladding interface. Because the geometrical, chemical, and mechanical nature of the contact interface is neither axially nor radially homogeneous, 2D images of this interface must be acquired with high-performance signal processing and controlled displacement of the sample rod along both its axis and its circumference. Modeling the multi-layer system (water, cladding, fuel, etc.) is necessary in this study to take into account all the parameters that influence the resolution of the acquired images. The first prototype of this microscope and the first results of the visualization of the inner face of the cladding will be presented in a poster, in order to highlight the potential of the system, whose final objective is to be integrated into the existing MEGAFOX bench dedicated to the non-destructive examination of irradiated fuel rods at the LECA-STAR facility in CEA Cadarache.
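The multi-layer modeling mentioned above starts from the acoustic impedance mismatch at each boundary. As a minimal sketch (the impedance values are illustrative, not the paper's), the normal-incidence pressure reflection coefficient at a water/cladding boundary is:

```python
def reflection_coefficient(z1, z2):
    """Normal-incidence pressure reflection coefficient between two media."""
    return (z2 - z1) / (z2 + z1)

# Illustrative acoustic impedances in MRayl (assumed, not the paper's values):
water, zircaloy = 1.48, 30.0
print(round(reflection_coefficient(water, zircaloy), 3))  # -> 0.906
```

The strong mismatch at metal interfaces is what makes the inner cladding face, and any pellet contact behind it, visible to the acoustic microscope.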

Keywords: high-frequency acoustic microscopy, multi-layer model, non-destructive testing, nuclear fuel rod, pellet/cladding interface, signal processing

Procedia PDF Downloads 163
1306 Robust Segmentation of Salient Features in Automatic Breast Ultrasound (ABUS) Images

Authors: Lamees Nasser, Yago Diez, Robert Martí, Joan Martí, Ibrahim Sadek

Abstract:

Automated 3D breast ultrasound (ABUS) screening is a novel modality in medical imaging: it shares common characteristics with other ultrasound modalities while providing three orthogonal planes (axial, sagittal, and coronal) that are useful in the analysis of tumors. In the literature, few automatic approaches exist for typical tasks such as segmentation or registration. In this work, we address two problems concerning ABUS images: nipple and rib detection. The nipple and ribs are the most visible and salient features in ABUS images. Determining the nipple position plays a key role in applications such as the evaluation of registration results or lesion follow-up. We present a nipple detection algorithm based on the color and shape of the nipple, as well as an automatic approach to detect the ribs; rib detection is, in fact, one of the main stages of chest wall segmentation. The rib detection approach consists of four steps. First, images are normalized in order to minimize the intensity variability across regions within the same image or across a set of images. Second, the normalized images are smoothed using an anisotropic diffusion filter. Next, the ribs are detected in each slice by analyzing the eigenvalues of the 3D Hessian matrix. Finally, a breast mask and a probability map of regions detected as ribs are used to remove false positives (FP). Qualitative and quantitative evaluations were performed on a total of 22 cases. Over all cases, the average and standard deviation of the root mean square error (RMSE) between manually annotated points placed on the rib surface and detected points on rib borders are 15.1188 mm and 14.7184 mm, respectively.
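The eigenvalue analysis in the third step can be illustrated in two dimensions. This is a simplified sketch, not the authors' implementation: the paper works on 3D volumes after anisotropic diffusion, whereas this toy version flags ridge-like pixels of a small synthetic image using the closed-form eigenvalues of the 2D Hessian (at a bright tubular structure, one eigenvalue is strongly negative across the ridge and the other is near zero along it):

```python
import math

def hessian_eigen_ridge(img):
    """Flag bright, ridge-like pixels via eigenvalues of the 2D Hessian."""
    h, w = len(img), len(img[0])
    ridge = [[False] * w for _ in range(h)]
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            # Second derivatives by central finite differences.
            dxx = img[y][x + 1] - 2 * img[y][x] + img[y][x - 1]
            dyy = img[y + 1][x] - 2 * img[y][x] + img[y - 1][x]
            dxy = (img[y + 1][x + 1] - img[y + 1][x - 1]
                   - img[y - 1][x + 1] + img[y - 1][x - 1]) / 4.0
            # Closed-form eigenvalues of the symmetric 2x2 Hessian.
            mean = (dxx + dyy) / 2.0
            dev = math.sqrt(((dxx - dyy) / 2.0) ** 2 + dxy ** 2)
            lo, hi = mean - dev, mean + dev
            # Bright ridge: dominant negative eigenvalue, other near zero.
            ridge[y][x] = lo < -0.5 and abs(hi) < abs(lo) / 2.0
    return ridge

# Synthetic 7x7 image with one bright horizontal line (a crude "rib").
img = [[1.0 if y == 3 else 0.0 for x in range(7)] for y in range(7)]
mask = hessian_eigen_ridge(img)
print([x for x in range(7) if mask[3][x]])  # -> [1, 2, 3, 4, 5]
```

In the paper's 3D setting, the same idea uses all three eigenvalues of the 3D Hessian to select tube-like structures before the mask-based false-positive removal.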

Keywords: automated 3D breast ultrasound, eigenvalues of Hessian matrix, nipple detection, rib detection

Procedia PDF Downloads 306
1305 STML: Service Type-Checking Markup Language for Services of Web Components

Authors: Saqib Rasool, Adnan N. Mian

Abstract:

Web components are introduced in the latest HTML5 standards for writing modular web interfaces, ensuring maintainability through the isolated scope of each web component. Reusability can also be achieved by sharing plug-and-play web components that other developers can use off the shelf. A web component encapsulates all the required HTML, CSS, and JavaScript code as a standalone package, which must be imported to integrate the component into an existing web interface; the component is then integrated with web services to dynamically populate its content. Since web components are reusable off-the-shelf components, they must be equipped with some mechanism for ensuring their proper integration with web services. The consistency of a service's behavior can be verified through type checking, one of the popular approaches for improving code quality in many programming languages. However, HTML does not provide type checking, as it is a markup language rather than a programming language. The contribution of this work is a new extension of HTML called Service Type-checking Markup Language (STML), which adds type-checking support to HTML for JSON-based REST services. STML can be used to define the expected data types of responses from JSON-based REST services that populate the content of a web component's HTML elements. Although JSON has five data types, viz. string, number, boolean, object, and array, STML supports only string, number, and boolean, because both object and array are treated as strings when populated in HTML elements. To define the data type of an HTML element, the developer simply adds the custom STML attribute st-string, st-number, or st-boolean for string, number, or boolean, respectively.
These STML annotations are written by the developer of a web component and enable other developers to use automated type checking to ensure the proper integration of their REST services with that component. Two utilities have been written for developers using STML-based web components. The first performs automated type checking during the development phase, using the browser console to show an error description whenever an integrated web service does not return a response with the expected data type. The second is a Gulp-based command-line utility that removes the STML attributes before deployment, ensuring the delivery of STML-free web pages in the production environment. Both utilities have been tested for type checking of REST services through STML-based web components, and the results confirm the feasibility of evaluating service behavior through HTML alone. Currently, STML is designed for automated type checking of integrated REST services, but it could be extended into a complete HTML-only service testing suite, transforming STML from a Service Type-checking Markup Language into a Service Testing Markup Language.
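The runtime check performed by the development-phase utility can be sketched as follows. This is a hypothetical re-implementation in Python, not the authors' browser utility; the declaration dictionary stands in for the st-* attributes a component's elements would carry:

```python
import json

# Map STML attributes to the Python types of the decoded JSON values.
STML_TO_PYTHON = {"st-string": str, "st-number": (int, float), "st-boolean": bool}

def check_service_response(declarations, response_text):
    """Return a list of type errors for a JSON REST response.

    `declarations` maps a JSON field name to its STML attribute
    (st-string / st-number / st-boolean), mimicking what a web
    component would declare on its HTML elements.
    """
    data = json.loads(response_text)
    errors = []
    for field, st_attr in declarations.items():
        expected = STML_TO_PYTHON[st_attr]
        value = data.get(field)
        # bool is a subclass of int in Python; keep the checks distinct.
        if st_attr == "st-number" and isinstance(value, bool):
            errors.append(f"{field}: expected st-number, got boolean")
        elif not isinstance(value, expected):
            errors.append(f"{field}: expected {st_attr}, got {type(value).__name__}")
    return errors

decl = {"name": "st-string", "age": "st-number", "active": "st-boolean"}
print(check_service_response(decl, '{"name": "Ada", "age": "41", "active": true}'))
# -> ['age: expected st-number, got str']
```

A conforming response produces an empty error list; in the real utility, the error description would be written to the browser console instead.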

Keywords: REST, STML, type checking, web component

Procedia PDF Downloads 226
1304 Highly Responsive p-NiO/n-rGO Heterojunction Based Self-Powered UV Photodetectors

Authors: P. Joshna, Souvik Kundu

Abstract:

Detection of ultraviolet (UV) radiation is very important, as UV exhibits a profound influence on humankind and other life, as well as on military equipment. In this work, a self-powered UV photodetector based on oxide heterojunctions is reported. Thin films of p-type nickel oxide (NiO) and n-type reduced graphene oxide (rGO) were used to form the p-n heterojunction. Low-cost, low-temperature chemical synthesis was utilized to prepare the oxides, and the spin coating technique was employed to deposit them onto indium tin oxide (ITO) coated glass substrates. The platinum top electrode was deposited using physical vapor evaporation. NiO offers strong UV absorption with high hole mobility, and rGO reduces the recombination rate by separating electrons from the photogenerated carriers. Structural characterization by X-ray diffraction, atomic force microscopy, and scanning electron microscopy was used to study the materials' crystallinity, microstructure, and surface roughness. On the one hand, the oxides were found to be polycrystalline in nature, with no secondary phases present; on the other hand, surface roughness was low, with no pit holes, indicating the formation of high-quality oxide thin films. X-ray photoelectron spectroscopy was employed to study the chemical compositions and oxidation states. Electrical characterization, including current-voltage and current-response measurements, was performed on the device to determine the responsivity, detectivity, and external quantum efficiency under dark and UV illumination. This p-n heterojunction device offered a fast photoresponse and a high on-off ratio under 365 nm UV illumination at zero bias.
The device based on the proposed architecture demonstrates the efficacy of oxide heterojunctions for efficient UV photodetection at zero bias, opening a new path toward the development of self-powered photodetectors for the environmental and health monitoring sectors.
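The figures of merit extracted from the current-voltage and current-response measurements follow from standard definitions. A minimal sketch with illustrative numbers (assumed, not the paper's measurements):

```python
import math

def responsivity(i_photo_a, i_dark_a, power_w):
    """Responsivity (A/W) from photo and dark currents under illumination."""
    return (i_photo_a - i_dark_a) / power_w

def eqe_percent(resp_a_per_w, wavelength_nm):
    """External quantum efficiency: EQE = R * h*c / (q * lambda)."""
    h, c, q = 6.626e-34, 3.0e8, 1.602e-19
    return resp_a_per_w * h * c / (q * wavelength_nm * 1e-9) * 100.0

# Illustrative numbers: 5 uA photocurrent, 0.5 uA dark current, 0.1 mW incident.
r = responsivity(5e-6, 0.5e-6, 1e-4)   # 0.045 A/W
print(round(eqe_percent(r, 365.0), 1))  # -> 15.3
```

The same quantities, evaluated at zero applied bias, are what distinguish a self-powered detector from one that needs an external supply.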

Keywords: chemical synthesis, oxides, photodetectors, spin coating

Procedia PDF Downloads 96
1303 Investigation of the Growth Kinetics of Phases in Ni–Sn System

Authors: Varun A Baheti, Sanjay Kashyap, Kamanio Chattopadhyay, Praveen Kumar, Aloke Paul

Abstract:

The Ni–Sn system finds applications in the microelectronics industry, especially in flip-chip and direct chip attach technology. Here the region of interest is the interface between the under bump metallization (UBM) and the solder bump (Sn), owing to the formation of brittle intermetallic phases there. Understanding the growth of these phases at the UBM/Sn interface is important, as in many cases it controls the electro-mechanical properties of the product. Cu and Ni are the commonly used UBM materials: Cu is used for good bonding because of its fast reaction with solder, while Ni often acts as a diffusion barrier layer owing to its inherently slower reaction kinetics with Sn-based solders. This study reports an investigation of the growth kinetics of phases in the Ni–Sn system. For simplicity, Sn, the major solder constituent, is chosen. Ni–Sn electroplated diffusion couples were prepared by electroplating pure Sn onto a Ni substrate; bulk diffusion couples prepared by the conventional method were studied alongside them. The diffusion couples were annealed for 25–1000 h at 50–215 °C to study the phase evolution and the growth kinetics of the various phases. The interdiffusion zone was imaged using a field emission gun scanning electron microscope (FE-SEM). Indexing of selected area diffraction (SAD) patterns obtained by transmission electron microscopy (TEM) and composition measurements by field emission electron probe micro-analysis (FE-EPMA) confirm the product phases grown across the interdiffusion zone. Time-dependent experiments indicate diffusion-controlled growth of the product phase. The activation energy estimated in the temperature range 125–215 °C for the parabolic growth constants (and hence the integrated interdiffusion coefficients) of the Ni₃Sn₄ phase sheds light on the growth mechanism of the phase, i.e., whether it is governed by grain boundary or lattice diffusion.
The location of the Kirkendall marker plane indicates that the Ni₃Sn₄ phase grows mainly by diffusion of Sn in the binary Ni–Sn system.
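The activation energy estimate from parabolic growth constants is an Arrhenius fit. A self-contained sketch with synthetic constants (the temperatures match the paper's 125–215 °C range, but the k values and the 60 kJ/mol target are fabricated for the sanity check):

```python
import math

def activation_energy_kj_per_mol(temps_c, k_parabolic):
    """Estimate activation energy Q from parabolic growth constants.

    Fits ln k = ln k0 - Q/(R T) by least squares on 1/T vs ln k.
    """
    R = 8.314  # J/(mol K)
    x = [1.0 / (t + 273.15) for t in temps_c]
    y = [math.log(k) for k in k_parabolic]
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    slope = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y)) / \
            sum((xi - mx) ** 2 for xi in x)
    return -slope * R / 1000.0  # kJ/mol

# Synthetic constants generated with Q = 60 kJ/mol for a sanity check.
temps = [125.0, 150.0, 175.0, 200.0]
Q_true = 60_000.0
ks = [1e-3 * math.exp(-Q_true / (8.314 * (t + 273.15))) for t in temps]
print(round(activation_energy_kj_per_mol(temps, ks), 1))  # -> 60.0
```

Comparing the fitted Q against typical values for lattice versus grain-boundary diffusion is what lets the authors discriminate between the two growth mechanisms.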

Keywords: diffusion, equilibrium phase, metastable phase, Ni–Sn system

Procedia PDF Downloads 279
1302 Investigating the Effect of Using Amorphous Silica Ash Obtained from Rice Husk as a Partial Replacement of Ordinary Portland Cement on the Mechanical and Microstructure Properties of Cement Paste and Mortar

Authors: Aliyu Usman, Muhaammed Bello Ibrahim, Yusuf D. Amartey, Jibrin M. Kaura

Abstract:

This research investigates the effect of using amorphous silica ash (ASA) obtained from rice husk as a partial replacement for ordinary Portland cement (OPC) on the mechanical and microstructural properties of cement paste and mortar. ASA was used to replace 3%, 5%, 8%, and 10% of the OPC, producing Cement-ASA paste and Cement-ASA mortar. ASA was found to contain all the major chemical compounds found in cement except alumina, namely SiO₂ (91.5%), CaO (2.84%), and Fe₂O₃ (1.96%), with a loss on ignition (LOI) of 9.18%; it also contains other minor oxides found in cement. The consistency of the Cement-ASA paste was found to increase with increasing ASA replacement, as were its setting time and soundness. The tests on hardened mortar were destructive in nature and comprised flexural strength tests on prismatic beams (40 mm × 40 mm × 160 mm) at 2, 7, 14, and 28 days of curing, and compressive strength tests on the cube size (40 mm × 40 mm, using the auxiliary steel platens) at 2, 7, 14, and 28 days of curing. The flexural and compressive strengths of the Cement-ASA mortar were found to increase with curing time and decrease with cement replacement by ASA. The 5% replacement of cement with ASA attained the highest strength at all curing ages, and all the percentage replacements attained the target compressive strength of 6 N/mm² at 28 days. The drying shrinkage of Cement-ASA mortar increased with curing time; the drying shrinkages at all curing ages were greater than that of the control specimen, and all exceeded the code recommendation of less than 0.03%.
Scanning electron microscopy (SEM) was used to study the microstructure of the Cement-ASA mortar and to examine the hydration products and morphology.

Keywords: amorphous silica ash, cement mortar, cement paste, scanning electron microscope

Procedia PDF Downloads 407
1301 Morphology Analysis of Apple-Carrot Juice Treated by Manothermosonication (MTS) and High Temperature Short Time (HTST) Processes

Authors: Ozan Kahraman, Hao Feng

Abstract:

Manothermosonication (MTS), the simultaneous application of heat and ultrasound under moderate pressure (100-700 kPa), is one of the technologies used to destroy microorganisms and inactivate enzymes. Transmission electron microscopy (TEM) is a microscopy technique in which a beam of electrons is transmitted through an ultra-thin specimen, interacting with the specimen as it passes through. The environmental scanning electron microscope (ESEM) is a scanning electron microscope (SEM) that allows electron micrographs to be collected from specimens that are wet and uncoated. These microscopy techniques allow us to observe the effects of processing on the samples. This study was conducted to investigate the effects of MTS and HTST treatments on the morphology of apple-carrot juices using TEM and ESEM microscopy. Apple-carrot juices treated with HTST (72 °C, 15 s), MTS at 50 °C (60 s, 200 kPa), and MTS at 60 °C (30 s, 200 kPa) were observed by both ESEM and TEM. For TEM analysis, a drop of the solution dispersed in fixative was placed onto a Parafilm® sheet. The copper-coated side of the TEM sample holder grid was gently laid on top of the droplet and incubated for 15 min. A drop of 7% uranyl acetate solution was added and held for 2 min. The grid was then removed from the droplet, allowed to dry at room temperature, and introduced into the TEM. For ESEM analysis, critical point drying of the filters was performed using a critical point dryer (CPD) (Samdri PVT-3D, Tousimis Research Corp., Rockville, MD, USA). After CPD, each filter was mounted onto a stub and coated with gold/palladium using a sputter coater (Desk II TSC, Denton Vacuum, Moorestown, NJ, USA). E. coli O157:H7 cells on the filters were observed with an ESEM (Philips XL30 ESEM-FEG, FEI Co., Eindhoven, The Netherlands).
The ESEM and TEM images showed extensive damage in the samples treated with MTS at 50 and 60 °C, such as ruptured cells and breakage of cell membranes; the damage increased with increasing exposure time.

Keywords: MTS, HTST, ESEM, TEM, E. coli O157:H7

Procedia PDF Downloads 252
1300 Immature Palm Tree Detection Using Morphological Filter for Palm Counting with High Resolution Satellite Image

Authors: Nur Nadhirah Rusyda Rosnan, Nursuhaili Najwa Masrol, Nurul Fatiha MD Nor, Mohammad Zafrullah Mohammad Salim, Sim Choon Cheak

Abstract:

Accurate inventories of oil palm planted areas are crucial for plantation management, as they impact the overall economy and production of oil. One of the technological advancements in the oil palm industry is semi-automated palm counting, which is replacing conventional manual palm counting via digitized aerial imagery. Most semi-automated palm counting methods developed to date have been limited to mature palms, whose ideal canopy size is well represented in satellite imagery; immature palms are often left out, since their canopies are barely visible in satellite images. In this paper, an approach using a morphological filter and high-resolution satellite imagery is proposed to detect, and thereby count, immature oil palm trees. The method begins with an erosion filter with an appropriate window size of 3 m applied to the high-resolution satellite image. The eroded image is then segmented using watershed segmentation to delineate immature palm tree regions. Local minimum detection is then applied, on the hypothesis that, in a grayscale image of an oil palm field, immature oil palm trees are located at local minima. The detection points generated from the local minima are displaced to the center of each immature palm region and thinned, leaving one detection point per tree. The performance of the proposed method was evaluated on three subsets with slopes ranging from 0 to 20° and different planting designs, i.e., straight and terrace. The proposed method achieved more than 90% accuracy when compared with the ground truth, with an overall F-measure score of up to 0.91.
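The erosion and local-minimum steps can be sketched on a toy image. This simplification omits the watershed segmentation and the displacement/thinning stages of the actual pipeline; it shows greyscale erosion followed by picking pixels that equal their neighbourhood minimum (flat background is rejected by requiring the pixel to sit strictly below the brightest pixel in its window):

```python
def grey_erosion(img, win=1):
    """Greyscale erosion: each pixel becomes the minimum over its window."""
    h, w = len(img), len(img[0])
    return [[min(img[yy][xx]
                 for yy in range(max(0, y - win), min(h, y + win + 1))
                 for xx in range(max(0, x - win), min(w, x + win + 1)))
             for x in range(w)] for y in range(h)]

def palm_candidates(img):
    """Pixels equal to their neighbourhood minimum: candidate immature palms."""
    eroded = grey_erosion(img)
    h, w = len(img), len(img[0])
    cands = []
    for y in range(h):
        for x in range(w):
            window_max = max(img[yy][xx]
                             for yy in range(max(0, y - 1), min(h, y + 2))
                             for xx in range(max(0, x - 1), min(w, x + 2)))
            # Keep genuine dips; skip flat regions (pixel == window maximum).
            if img[y][x] == eroded[y][x] and img[y][x] < window_max:
                cands.append((y, x))
    return cands

# Toy 7x7 greyscale patch: two dark dips (young canopies) on a bright field.
img = [[9.0] * 7 for _ in range(7)]
img[2][2] = 1.0
img[4][5] = 2.0
print(palm_candidates(img))  # -> [(2, 2), (4, 5)]
```

In the real workflow the erosion window corresponds to roughly 3 m on the ground, matching the small canopy of an immature palm.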

Keywords: immature palm count, oil palm, precision agriculture, remote sensing

Procedia PDF Downloads 47
1299 A Philosophical Investigation into African Conceptions of Personhood in the Fourth Industrial Revolution

Authors: Sanelisiwe Ndlovu

Abstract:

Cities have become testbeds for automation and for experimenting with artificial intelligence (AI) in managing urban services and public spaces. Smart Cities and AI systems are changing most human experiences, from health and education to personal relations. In healthcare, for instance, social robots are being implemented as tools to assist patients; in education, social robots are being used as tutors or co-learners to promote cognitive and affective outcomes. With that general picture in mind, one can ask a further question about Smart Cities and artificial agents and their moral standing in the African context of personhood. There is a wealth of literature on personhood; however, there is an absence of literature on African personhood in highly automated environments. Personhood in African philosophy is defined by the role one can and should play in the community. In today's technologically advanced world, however, there is a risk that machines become more capable of accomplishing the tasks that humans would otherwise do. Further, on many African communitarian accounts, personhood and moral standing are associated with active relationality with the community, yet in the Smart City human closeness is gradually diminishing; humans already engage and identify with robotic entities, sometimes even romantically. The primary aim of this study is to investigate how African conceptions of personhood and community interact in a highly automated environment such as the Smart City. Accordingly, the contribution of this study lies in presenting a rarely discussed African perspective that emphasizes the necessity and importance of relationality in handling Smart Cities and AI ethically. The proposed approach can thus be seen as a sub-Saharan African contribution to the growing debates on personhood and AI, one that takes the reality of the interconnectedness of society seriously.
It will also open up new opportunities to tackle old problems and to use existing resources to confront new problems in the Fourth Industrial Revolution.

Keywords: smart city, artificial intelligence, personhood, community

Procedia PDF Downloads 174
1298 Field Emission Scanning Electron Microscope Image Analysis for Porosity Characterization of Autoclaved Aerated Concrete

Authors: Venuka Kuruwita Arachchige Don, Mohamed Shaheen, Chris Goodier

Abstract:

Autoclaved aerated concrete (AAC) is known for its light weight, easy handling, high thermal insulation, and extremely porous structure. Investigating pore behavior in AAC is crucial for characterizing the material, standardizing design and production techniques, enhancing mechanical, durability, and thermal performance, studying the effectiveness of protective measures, and analyzing the effects of weather conditions. The significant details of the pores are difficult to observe with acceptable accuracy. High-resolution field emission scanning electron microscope (FESEM) image analysis is a promising technique for investigating the pore behavior and density of AAC, and it is adopted in this study. A mercury intrusion porosimeter and a gas pycnometer were employed to characterize the porosity distribution and density parameters. The analysis considered three different densities of AAC block and three layers in the altitude direction within each block. A set of procedures is presented to extract and analyze the details of pore shape, size, connectivity, and percentage from FESEM images of AAC, and average pore behavior per unit area is reported. Comparison of the porosity distribution and density parameters revealed significant variations. FESEM imaging offered insights into porosity behavior surpassing the capabilities of the other techniques. The multi-staged analysis provides the porosity percentage occupied by each pore category; the total porosity; the variation of pore distribution with AAC density and layer; the numbers of two-dimensional and three-dimensional pores; the variation of apparent and matrix densities with pore behavior; the variation of pore behavior with aluminum content; and the relationships among shape, diameter, connectivity, and percentage within each pore classification.
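The porosity percentage and per-pore statistics extracted from micrographs can be sketched with a thresholded toy image. This is an illustrative simplification (a 4-connected flood fill on a synthetic 6×6 patch), not the authors' multi-staged workflow:

```python
from collections import deque

def pore_stats(img, threshold):
    """Porosity fraction (%) and per-pore pixel areas from a greyscale image.

    Pixels darker than `threshold` are treated as pore space; connected
    pore pixels (4-neighbourhood) are grouped into individual pores.
    """
    h, w = len(img), len(img[0])
    pore = [[img[y][x] < threshold for x in range(w)] for y in range(h)]
    seen = [[False] * w for _ in range(h)]
    areas = []
    for y in range(h):
        for x in range(w):
            if pore[y][x] and not seen[y][x]:
                # Flood-fill one pore.
                q, area = deque([(y, x)]), 0
                seen[y][x] = True
                while q:
                    cy, cx = q.popleft()
                    area += 1
                    for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        ny, nx = cy + dy, cx + dx
                        if 0 <= ny < h and 0 <= nx < w and pore[ny][nx] and not seen[ny][nx]:
                            seen[ny][nx] = True
                            q.append((ny, nx))
                areas.append(area)
    porosity = sum(areas) / (h * w) * 100.0
    return porosity, sorted(areas)

# Toy 6x6 "micrograph": one 2x2 pore and one single-pixel pore.
img = [[200] * 6 for _ in range(6)]
img[1][1] = img[1][2] = img[2][1] = img[2][2] = 30
img[4][4] = 25
print(pore_stats(img, 100))  # five pore pixels out of 36
```

The per-pore area list is the starting point for the shape, diameter, and connectivity classifications discussed in the abstract.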

Keywords: autoclaved aerated concrete, density, imaging technique, microstructure, porosity behavior

Procedia PDF Downloads 28
1297 Understanding the Information in Principal Component Analysis of Raman Spectroscopic Data during Healing of Subcritical Calvarial Defects

Authors: Rafay Ahmed, Condon Lau

Abstract:

Bone healing is a complex, sequential process involving changes at the molecular level. Raman spectroscopy is a promising technique for studying the bone mineral and matrix environments simultaneously. In this study, subcritical calvarial defects are used to study bone composition during healing without disturbing the fracture; the model allows the natural healing of bone to be monitored without mechanical harm to the callus. Calvarial defects were created with a 1 mm burr drill in the parietal bones of Sprague-Dawley rats (n=8) and served as the in vivo defects. After 7 days, the skulls were harvested following euthanasia, and one additional defect per sample was created in the opposite parietal bone using the same procedure, to serve as a control defect. Raman spectroscopy (785 nm) was used to investigate bone parameters on three different skull surfaces: in vivo defects, control defects, and normal surface. Principal component analysis (PCA) was utilized for the analysis and interpretation of the Raman spectra and helped in the classification of the groups; PCA was able to distinguish the in vivo defects from the normal surface and the control defects. PC1 shows the major variation at 958 cm⁻¹, which corresponds to the ν₁ phosphate mineral band, while PC2 shows the major variation at 1448 cm⁻¹, the characteristic band of CH₂ deformation, which corresponds to collagens. The Raman parameters mineral-to-matrix ratio and crystallinity were found to be significantly decreased in the in vivo defects compared with the normal surface and the controls. Scanning electron microscope and optical microscope images show newly generated matrix in the form of bony bridges of collagen, and an optical profiler shows that surface roughness increased by 30% from the controls to the in vivo defects after 7 days. These results agree with the Raman assessment parameters and confirm new collagen formation during healing.
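The role PCA plays here can be illustrated with the closed-form first principal component of two band intensities. The intensities below are fabricated for illustration; the real analysis runs PCA on full spectra, and 958 and 1448 cm⁻¹ are simply the bands the abstract highlights:

```python
import math

def first_principal_component(points):
    """First PC of 2-D data via the closed-form eigenvector of the covariance."""
    n = len(points)
    mx = sum(p[0] for p in points) / n
    my = sum(p[1] for p in points) / n
    sxx = sum((p[0] - mx) ** 2 for p in points) / n
    syy = sum((p[1] - my) ** 2 for p in points) / n
    sxy = sum((p[0] - mx) * (p[1] - my) for p in points) / n
    # Largest eigenvalue of [[sxx, sxy], [sxy, syy]].
    lam = (sxx + syy) / 2 + math.sqrt(((sxx - syy) / 2) ** 2 + sxy ** 2)
    # Its eigenvector (lam - syy, sxy), normalised.
    vx, vy = lam - syy, sxy
    norm = math.hypot(vx, vy) or 1.0
    return vx / norm, vy / norm

# Fabricated band intensities (958 cm^-1, 1448 cm^-1) per measured spot:
# defects have a weaker phosphate band, as the abstract reports.
normal = [(10.0, 5.0), (10.5, 5.2), (9.8, 4.9)]
defect = [(6.0, 5.1), (6.3, 5.0), (5.9, 4.8)]
pc1 = first_principal_component(normal + defect)
print(abs(pc1[0]) > abs(pc1[1]))  # variance dominated by the 958 band -> True
```

Because the defect/normal contrast lives mainly in the phosphate band, PC1 loads heavily on it, which is exactly how PCA separates the groups in the study.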

Keywords: Raman spectroscopy, principal component analysis, calvarial defects, tissue characterization

Procedia PDF Downloads 197
1296 Human-Machine Cooperation in Facial Comparison Based on Likelihood Scores

Authors: Lanchi Xie, Zhihui Li, Zhigang Li, Guiqiang Wang, Lei Xu, Yuwen Yan

Abstract:

Image-based facial features can be classified into category recognition features and individual recognition features. Current automated face recognition systems extract a feature vector of specific dimensions from a facial image according to their pre-trained neural network. However, to improve the efficiency of parameter calculation, such algorithms generally reduce image detail by pooling. This operation overlooks the details of greatest concern to forensic experts. In our experiment, we adopted a variety of face recognition algorithms based on deep learning and compared a large number of naturally collected face images with known frontal ID photos of the same persons. Downscaling and manual handling were performed on the testing images. The results supported that facial recognition algorithms based on deep learning detect structural and morphological information and rarely focus on specific markers such as stains and moles. Overall performance, the distributions of genuine and impostor scores, and likelihood ratios were tested to evaluate the accuracy of the biometric systems and the forensic experts. Experiments showed that the biometric systems were skilled at distinguishing category features, while forensic experts were better at discovering the individual features of human faces. In the proposed approach, fusion was performed at the score level. At a specified false accept rate, the framework achieved a lower false reject rate. This paper contributes to improving the interpretability of objective methods of facial comparison and provides a novel method for human-machine collaboration in this field.
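A score-level fusion of the kind described can be sketched by summing log-likelihood ratios from the two sources; the Gaussian score models and every number below are illustrative assumptions, not the paper's fitted distributions:

```python
import numpy as np

def log_lr(score, genuine_mean, genuine_std, impostor_mean, impostor_std):
    """Log-likelihood ratio of a score under Gaussian genuine/impostor models."""
    def logpdf(x, mu, sd):
        return -0.5 * np.log(2 * np.pi * sd**2) - (x - mu) ** 2 / (2 * sd**2)
    return (logpdf(score, genuine_mean, genuine_std)
            - logpdf(score, impostor_mean, impostor_std))

# Hypothetical score distributions for the automated system and the examiner
llr_system = log_lr(0.82, 0.85, 0.05, 0.40, 0.10)
llr_expert = log_lr(0.70, 0.75, 0.08, 0.45, 0.12)

# Naive-Bayes fusion: sum the log-LRs; the accept threshold would be set
# by the chosen operating point (target false accept rate)
fused = llr_system + llr_expert
decision = "match" if fused > 0 else "non-match"
```

Because the two sources attend to different features (category vs. individual), their log-LRs carry partly independent evidence, which is why summing them can lower the false reject rate at a fixed false accept rate.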

Keywords: likelihood ratio, automated facial recognition, facial comparison, biometrics

Procedia PDF Downloads 103
1295 Austempered Compacted Graphite Irons: Influence of Austempering Temperature on Microstructure and Microscratch Behavior

Authors: Rohollah Ghasemi, Arvin Ghorbani

Abstract:

This study investigates the effect of austempering temperature on the microstructure and scratch behavior of austempered heat-treated compacted graphite irons. The as-cast material was used as the base for the heat treatment practices. Samples were extracted from as-cast ferritic CGI pieces and heat treated at an austenitising temperature of 900°C for 60 minutes, followed by quenching in a salt bath at austempering temperatures of 275°C, 325°C, and 375°C. An austempering holding time of 30 minutes was selected for all heat treatments. Light optical microscopy (LOM), scanning electron microscopy (SEM), and electron backscattered diffraction (EBSD) analysis confirmed that an ausferritic matrix formed in all heat-treated samples. Microscratches were performed under loads of 200, 600, and 1000 mN using a sphero-conical diamond indenter with a tip radius of 50 μm and an included cone angle of 90°, at a speed of 10 μm/s and room temperature (~25°C). An instrumented nanoindentation machine was used for the nanoindentation hardness measurements and microscratch testing. Hardness measurements and scratch testing showed a significant increase in Brinell, Vickers, and nanoindentation hardness values, as well as in the microscratch resistance of the heat-treated samples, compared to the as-cast ferritic sample. The increase in hardness and the improvement in microscratch resistance are associated with the formation of an ausferrite matrix consisting of carbon-saturated retained austenite and acicular ferrite. The maximum hardness was observed for samples austempered at 275°C, which resulted in the formation of very fine acicular ferrite. In addition, nanohardness values showed quite significant variation within the matrix due to the presence of acicular ferrite and carbon-saturated retained austenite.
It was also observed that increasing the austempering temperature increased the volume of carbon-saturated retained austenite and decreased the hardness values.

Keywords: austempered CGI, austempering, scratch testing, scratch plastic deformation, scratch hardness

Procedia PDF Downloads 113
1294 Evaluation of Automated Analyzers of Polycyclic Aromatic Hydrocarbons and Black Carbon in a Coke Oven Plant by Comparison with Analytical Methods

Authors: L. Angiuli, L. Trizio, R. Giua, A. Digilio, M. Tutino, P. Dambruoso, F. Mazzone, C. M. Placentino

Abstract:

In the winter of 2014, a series of measurements was performed to evaluate the behavior of real-time PAH and black carbon analyzers in a coke oven plant located in Taranto, a city in Southern Italy. Data were collected both inside and outside the plant, at air quality monitoring sites. Simultaneous measurements of PM2.5 and PM1 were performed. Particle-bound PAHs were measured by two methods: (1) aerosol photoionization using an Ecochem PAS 2000 analyzer, and (2) collection on PM2.5 and PM1 quartz filters followed by analysis by gas chromatography/mass spectrometry (GC/MS). Black carbon was determined both in real time by a Magee Aethalometer AE22 analyzer and by a semi-continuous Sunset Lab EC/OC instrument. Detected PM2.5 and PM1 levels were higher inside than outside the plant, while real-time PAH values were higher outside than inside. Inside the plant, the Ecochem PAS 2000 gave concentrations not significantly different from those determined on the filters during low-pollution days, but at increasing concentrations the automated instrument underestimated PAH levels. At the external site, Ecochem PAS 2000 real-time concentrations were steadily higher than those on the filters. Similarly, real-time black carbon values were constantly lower than the EC concentrations obtained by the Sunset EC/OC instrument at the inner site, while outside the plant real-time values were comparable to the Sunset EC values. The results showed that, in a coke plant, real-time PAH and black carbon analyzers in the factory configuration provide only qualitative information, with limited accuracy, leading to underestimation of the concentrations. A site-specific calibration is needed for these instruments before their installation at highly polluted sites.
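The site-specific calibration the authors call for might, in its simplest form, be a linear fit of the real-time readings against the reference filter values; all numbers below are invented to mimic the reported flattening at high concentrations:

```python
import numpy as np

# Hypothetical paired measurements: reference filter/GC-MS PAH values (x)
# vs real-time PAS readings (y) that flatten at high concentrations
reference = np.array([2.0, 5.0, 10.0, 20.0, 40.0, 80.0])
realtime  = np.array([2.1, 4.8,  8.9, 15.0, 24.0, 38.0])

# Fit realtime = a * reference + b on the low range, where the analyzer
# is still roughly linear, then invert it to correct the readings
a, b = np.polyfit(reference[:3], realtime[:3], 1)
corrected = (realtime - b) / a
```

Note that a single linear fit only recovers the linear range: in this sketch `corrected[-1]` still falls well below the reference value, which is why a site-specific (and, at high concentrations, nonlinear) calibration is needed.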

Keywords: black carbon, coke oven plant, PAH, PAS, aethalometer

Procedia PDF Downloads 315
1293 Phenomena-Based Approach for Automated Generation of Process Options and Process Models

Authors: Parminder Kaur Heer, Alexei Lapkin

Abstract:

Due to the global challenges of increased competition and demand for more sustainable products and processes, there is rising pressure on industry to develop innovative processes. Through Process Intensification (PI), existing and new processes may be able to attain higher efficiency. However, very few PI options are generally considered. This is because processes are typically analysed at the unit operation level, limiting the search space for potential process options. PI performed at more detailed levels of a process can increase the size of the search space. The different levels at which PI can be achieved are the unit operation, functional, and phenomena levels. Physical/chemical phenomena form the lowest level of aggregation and are thus expected to give the highest impact, because all intensification options can be described by their enhancement. The objective of the current work is therefore the generation of numerous process alternatives based on phenomena and the development of their corresponding computer-aided models. The methodology comprises: a) automated generation of process options, and b) automated generation of process models. The process under investigation is disintegrated into functions, viz. reaction, separation, etc., and these functions are further broken down into the phenomena required to perform them. For example, separation may be performed via vapour-liquid or liquid-liquid equilibrium. A list of phenomena for the process is formed, and new phenomena, which can overcome the difficulties and drawbacks of the current process or enhance its effectiveness, are added to the list. For instance, a catalyst separation issue can be handled by using solid catalysts; the corresponding phenomena are identified and added. The phenomena are then combined to generate all possible combinations. However, not all combinations make sense, and hence screening is carried out to discard those that are meaningless.
For example, phase-change phenomena need the co-presence of energy-transfer phenomena. Feasible combinations of phenomena are then assigned to the functions they execute. A combination may accomplish a single function or multiple functions, i.e., it might perform reaction alone or reaction with separation. The combinations are then allotted to the functions needed for the process. This creates a series of options for carrying out each function. Combining these options for the different functions in the process leads to the generation of a superstructure of process options. These process options, each defined by a list of phenomena per function, are passed to the model generation algorithm in the form of binaries (1, 0). The algorithm gathers the active phenomena and couples them to generate the model. A series of models is generated for the functions, which are combined to obtain the process model. The most promising process options are then chosen subject to a performance criterion, for example product purity, or via a multi-objective Pareto optimisation. The methodology was applied to a two-step process, and the best route was determined based on the higher product yield. The current methodology can identify, produce, and evaluate process intensification options from which the optimal process can be determined. It can be applied to any chemical or biochemical process because of its generic nature.
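The combine-then-screen step can be sketched with a toy phenomena list; the phenomena names and the single feasibility rule below are illustrative stand-ins, not the authors' full rule set:

```python
from itertools import combinations

# Hypothetical phenomena list; the screening rule encodes the example above:
# phase-change phenomena require the co-presence of energy transfer
phenomena = ["mixing", "reaction", "phase_change",
             "energy_transfer", "vl_equilibrium"]

def feasible(combo):
    return "phase_change" not in combo or "energy_transfer" in combo

# Generate all non-empty combinations, discarding the meaningless ones
options = [c for r in range(1, len(phenomena) + 1)
             for c in combinations(phenomena, r)
             if feasible(c)]

# Encode each surviving option as a binary vector over the phenomena list,
# the form in which options are passed to the model generation algorithm
binaries = [[int(p in c) for p in phenomena] for c in options]
```

With five phenomena and one screening rule, 23 of the 31 non-empty combinations survive; in a real process the phenomena list and rule set are much larger, which is what makes automated generation worthwhile.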

Keywords: phenomena, process intensification, process models, process options

Procedia PDF Downloads 208
1292 Alumina Nanoparticles in One-Pot Synthesis of Pyrazolopyranopyrimidinones

Authors: Saeed Khodabakhshi, Alimorad Rashidi, Ziba Tavakoli, Sajad Kiani, Sadegh Dastkhoon

Abstract:

Alumina nanoparticles (γ-Al2O3 NPs) were prepared via a new and simple synthetic route and characterized by field emission scanning electron microscopy, X-ray diffraction, and Fourier transform infrared spectroscopy. The catalytic activity of the prepared γ-Al2O3 NPs was investigated for the one-pot, four-component synthesis of fused tri-heterocyclic compounds containing pyrazole, pyran, and pyrimidine. The procedure offers advantages such as high efficiency, simplicity, high rate, and environmental safety.

Keywords: alumina nanoparticles, one-pot, fused tri-heterocyclic compounds, pyran

Procedia PDF Downloads 298
1291 Eosinopenia: Marker for Early Diagnosis of Enteric Fever

Authors: Swati Kapoor, Rajeev Upreti, Monica Mahajan, Abhaya Indrayan, Dinesh Srivastava

Abstract:

Enteric fever is caused by the gram-negative bacilli Salmonella typhi and paratyphi. It is associated with high morbidity and mortality worldwide. Timely initiation of treatment is a crucial step for the prevention of complications. Cultures of body fluids are diagnostic, but not always conclusive or practically feasible in most centers. Moreover, the results of cultures delay treatment initiation. Serological tests lack diagnostic value. Blood counts can offer a promising diagnostic option. A retrospective study of the relevance of leucopenia and eosinopenia was conducted on 203 culture-proven enteric fever patients and 159 culture-proven non-enteric fever patients in a tertiary care hospital in New Delhi. Patient details were retrieved from the electronic medical records section of the hospital. Absolute eosinopenia was defined as an absolute eosinophil count (AEC) of less than 40/mm³ (normal level: 40-400/mm³) measured with an LH-750 Beckman Coulter automated machine. Leucopenia was defined as a total leucocyte count (TLC) of less than 4 x 10⁹/l. Blood cultures were done using the BacT/ALERT FA Plus automated blood culture system before the first antibiotic dose was given. Case and control groups were compared using Pearson's chi-square test. It was observed that an AEC of 0-19/mm³ was a significant finding (p < 0.001) in enteric fever patients, whereas leucopenia was not (p = 0.096). Using Receiver Operating Characteristic (ROC) curves, it was observed that patients with both AEC < 14/mm³ and TLC < 8 x 10⁹/l had a 95.6% chance of being diagnosed with enteric fever and only a 4.4% chance of being diagnosed as non-enteric fever. This result was highly significant, with p < 0.001. This association of AEC and TLC found in the enteric fever patients of this study can be used for the early initiation of treatment in clinically suspected enteric fever patients.
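The ROC-based cutoff selection can be sketched as follows; the simulated eosinophil counts and the Youden's J criterion are illustrative assumptions, not the study's data or exact method:

```python
import numpy as np
from sklearn.metrics import roc_curve, auc

rng = np.random.default_rng(1)
# Hypothetical absolute eosinophil counts (/mm^3): enteric fever cases
# cluster near zero, controls spread across the normal range (40-400)
aec_cases = rng.uniform(0, 30, 200)
aec_controls = rng.uniform(20, 400, 160)

y_true = np.r_[np.ones(200), np.zeros(160)]
# roc_curve treats higher scores as more positive, so negate the counts
y_score = -np.r_[aec_cases, aec_controls]

fpr, tpr, thresholds = roc_curve(y_true, y_score)
roc_auc = auc(fpr, tpr)

# Youden's J statistic picks the cutoff balancing sensitivity/specificity
best = np.argmax(tpr - fpr)
cutoff = -thresholds[best]  # back to an AEC threshold in /mm^3
```

On real data the same procedure yields the clinical cutoffs (here, AEC < 14/mm³ combined with TLC < 8 x 10⁹/l) along with the corresponding sensitivity and specificity.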

Keywords: absolute eosinopenia, absolute eosinophil count, enteric fever, leucopenia, total leucocyte count

Procedia PDF Downloads 149
1290 An Automated Magnetic Dispersive Solid-Phase Extraction Method for Detection of Cocaine in Human Urine

Authors: Feiyu Yang, Chunfang Ni, Rong Wang, Yun Zou, Wenbin Liu, Chenggong Zhang, Fenjin Sun, Chun Wang

Abstract:

Cocaine is the most frequently used illegal drug globally, with an annual prevalence of cocaine use of 0.3% to 0.4% of the adult population aged 15-64 years. The growing consumption of cocaine and associated drug crimes are a great concern; urine testing has therefore become an important noninvasive sampling method, as cocaine and its metabolites (COCs) are usually present in urine in high concentrations and with relatively long detection windows. However, direct analysis of urine samples is not feasible, because the complex urine matrix often causes low sensitivity and selectivity in the determination. On the other hand, the presence of low doses of analytes in urine makes an extraction and pretreatment step important before determination. Especially in group drug-taking cases, the pretreatment step becomes more tedious and time-consuming. Developing a sensitive, rapid, and high-throughput method for the detection of COCs in the human body is therefore indispensable for law enforcement officers, treatment specialists, and health officials. In this work, a new automated magnetic dispersive solid-phase extraction (MDSPE) sampling method followed by high performance liquid chromatography-mass spectrometry (HPLC-MS) was developed for the quantitative enrichment of COCs from human urine, using prepared magnetic nanoparticles as adsorbents. The nanoparticles were prepared by silanizing magnetic Fe3O4 nanoparticles and modifying them with divinylbenzene and vinylpyrrolidone, which gives them the ability to specifically adsorb COCs. This kind of magnetic particle facilitates the pretreatment steps through electromagnetically controlled extraction, achieving full automation. The proposed device significantly improved sample preparation efficiency, processing 32 samples in one batch within 40 min.
The preparation procedure for the magnetic nanoparticles was optimized, and their performance was characterized by scanning electron microscopy, vibrating sample magnetometry, and infrared spectroscopy. Several analytical parameters were studied, including the amount of particles, adsorption time, elution solvent, and extraction and desorption kinetics, and the proposed method was verified. The limits of detection for cocaine and its metabolites were 0.09-1.1 ng·mL⁻¹, with recoveries ranging from 75.1% to 105.7%. Compared to traditional sampling methods, this method is time-saving and environmentally friendly. It was confirmed that the proposed automated method is a highly effective way to analyze trace cocaine and cocaine metabolites in human urine.

Keywords: automatic magnetic dispersive solid-phase extraction, cocaine detection, magnetic nanoparticles, urine sample testing

Procedia PDF Downloads 177
1289 Development of Composition and Technology of Vincristine Nanoparticles Using High-Molecular Carbohydrates of Plant Origin

Authors: L. Ebralidze, A. Tsertsvadze, D. Berashvili, A. Bakuridze

Abstract:

Current cancer therapy strategies are based on surgery, radiotherapy, and chemotherapy. The problems associated with chemotherapy are among the biggest challenges for clinical medicine. These include low specificity, a broad spectrum of side effects, toxicity, and the development of cellular resistance. Improved anti-cancer drugs therefore need to be developed urgently. In particular, in order to increase the efficiency of anti-cancer drugs and reduce their side effects, scientists are working on the formulation of nano-drugs. The objective of this study was to develop the composition and technology of vincristine nanoparticles using high-molecular carbohydrates of plant origin. Plant polysaccharides, particularly soy bean seed polysaccharides, flaxseed polysaccharides, citrus pectin, gum arabic, and sodium alginate, were used as objects. Based on biopharmaceutical research, vincristine-containing nanoparticle formulations were prepared. High-energy emulsification and solvent evaporation methods were used for the preparation of the nanosystems. Polysorbate 80, polysorbate 60, sodium dodecyl sulfate, glycerol, and polyvinyl alcohol were used in the formulations as emulsifying agents and system stabilizers. The ratio of API to polysaccharides, as well as the type of stabilizing and emulsifying agents, strongly affects the particle size of the final product. The influence of the preparation technology and of the type and concentration of stabilizing agents on the properties of the nanoparticles was evaluated. In the next stage of research, the nanosystems were characterized. Physicochemical characterization of the nanoparticles (their size, shape, and distribution) was performed using atomic force microscopy and scanning electron microscopy. The present study explored the possibility of producing NPs using plant polysaccharides. The optimal ratio of active pharmaceutical ingredient to plant polysaccharides and the best stabilizer and emulsifying agent were determined. The average size and shape of the nanoparticles were visualized by SEM.

Keywords: nanoparticles, target delivery, natural high molecule carbohydrates, surfactants

Procedia PDF Downloads 244
1288 Revolutionizing Autonomous Trucking Logistics with Customer Relationship Management Cloud

Authors: Sharda Kumari, Saiman Shetty

Abstract:

Autonomous trucking is just one of the numerous significant shifts impacting fleet management services. The Society of Automotive Engineers (SAE) has defined six levels of vehicle automation, which have been adopted internationally, including by the United States Department of Transportation. On public highways in the United States, organizations are testing driverless vehicles with at least Level 4 automation, with a human present in the vehicle who can disable the automation; this is usually done while the trucks are not engaged in highway driving. However, completely driverless vehicles are presently being tested in the state of California. While autonomous trucking can increase safety, decrease trucking costs, provide solutions to trucker shortages, and improve efficiencies, logistics, too, requires advancements to keep up with trucking innovations. Given that artificial intelligence, machine learning, and automated procedures enable people to do their duties in other sectors with fewer resources, CRM (Customer Relationship Management) can be applied to the autonomous trucking business to provide the same level of efficiency. In a society witnessing significant digital disruptions, fleet management is likewise being transformed by technology. Utilizing strategic alliances to enhance core services is an effective technique for capitalizing on innovations and delivering enhanced services. Applying analytics to CRM systems improves cost control of fuel strategy, fleet maintenance, driver behavior, route planning, road safety compliance, and capacity utilization. Integration of autonomous trucks with automated fleet management, yard/terminal management, and customer service is possible, and thus has significant power to redraw the lines between the public and private spheres in autonomous trucking logistics.

Keywords: autonomous vehicles, customer relationship management, customer experience, autonomous trucking, digital transformation

Procedia PDF Downloads 75
1287 Analysis of Structural and Photocatalytical Properties of Anatase, Rutile and Mixed Phase TiO2 Films Deposited by Pulsed-Direct Current and Radio Frequency Magnetron Co-Sputtering

Authors: S. Varnagiris, M. Urbonavicius, S. Tuckute, M. Lelis, K. Bockute

Abstract:

Amongst the many water purification techniques, TiO2 photocatalysis is recognized as one of the most promising sustainable methods. It is known that anatase is the most suitable TiO2 phase for photocatalytic applications; however, a heterojunction of anatase and rutile phases can improve the photocatalytic activity of TiO2 even further. Despite the relative simplicity of TiO2, different synthesis methods lead to highly dispersed crystal phases and photocatalytic activities in the corresponding samples. Accordingly, suggestions and investigations of innovative methods of TiO2 synthesis are still needed. In this work, the structural and photocatalytic properties of TiO2 films deposited by the unconventional method of simultaneous co-sputtering from two magnetrons, powered by pulsed direct current (pDC) and radio frequency (RF) power sources with negative bias voltage, have been studied. More specifically, the film thickness, microstructure, surface roughness, crystal structure, optical transmittance, and photocatalytic properties were investigated by profilometry, scanning electron microscopy, atomic force microscopy, X-ray diffraction, and UV-Vis spectrophotometry, respectively. The proposed unconventional two-magnetron co-sputtering method showed very promising results for crystalline TiO2 film formation while keeping process temperatures below 100 °C. XRD analysis revealed that, by using the proper combination of power source type and bias voltage, various TiO2 phases (amorphous, anatase, rutile, or their mixtures) can be synthesized selectively. Moreover, a strong dependency between power source type and surface roughness, as well as between the bias voltage and the band gap value of the TiO2 films, was observed. Interestingly, TiO2 films deposited by two-magnetron co-sputtering without bias voltage had one of the highest band gap values among the investigated films, yet their photocatalytic activity was superior to that of all other samples.
It is suggested that this is due to the dominating nanocrystalline anatase phase with various exposed surfaces, including the photocatalytically most active {001} facets.

Keywords: films, magnetron co-sputtering, photocatalysis, TiO₂

Procedia PDF Downloads 98
1286 The Integration of Digital Humanities into the Sociology of Knowledge Approach to Discourse Analysis

Authors: Gertraud Koch, Teresa Stumpf, Alejandra Tijerina García

Abstract:

Discourse analysis research approaches belong to the central research strategies applied throughout the humanities; they focus on the countless forms and ways digital texts and images shape present-day notions of the world. Despite the constantly growing number of relevant digital, multimodal discourse resources, digital humanities (DH) methods are thus far not systematically developed and accessible for discourse analysis approaches. Specifically, the significance of multimodality and meaning plurality modelling are yet to be sufficiently addressed. In order to address this research gap, the D-WISE project aims to develop a prototypical working environment as digital support for the sociology of knowledge approach to discourse analysis and new IT-analysis approaches for the use of context-oriented embedding representations. Playing an essential role throughout our research endeavor is the constant optimization of hermeneutical methodology in the use of (semi)automated processes and their corresponding epistemological reflection. Among the discourse analyses, the sociology of knowledge approach to discourse analysis is characterised by the reconstructive and accompanying research into the formation of knowledge systems in social negotiation processes. The approach analyses how dominant understandings of a phenomenon develop, i.e., the way they are expressed and consolidated by various actors in specific arenas of discourse until a specific understanding of the phenomenon and its socially accepted structure are established. This article presents insights and initial findings from D-WISE, a joint research project running since 2021 between the Institute of Anthropological Studies in Culture and History and the Language Technology Group of the Department of Informatics at the University of Hamburg. 
As an interdisciplinary team, we develop central innovations with regard to the availability of relevant DH applications by building up a uniform working environment, which supports the procedure of the sociology of knowledge approach to discourse analysis within open corpora and heterogeneous, multimodal data sources for researchers in the humanities. We are hereby expanding the existing range of DH methods by developing contextualized embeddings for improved modelling of the plurality of meaning and the integrated processing of multimodal data. The alignment of this methodological and technical innovation is based on the epistemological working methods according to grounded theory as a hermeneutic methodology. In order to systematically relate, compare, and reflect the approaches of structural-IT and hermeneutic-interpretative analysis, the discourse analysis is carried out both manually and digitally. Using the example of current discourses on digitization in the healthcare sector and the associated issues regarding data protection, we have manually built an initial data corpus of which the relevant actors and discourse positions are analysed in conventional qualitative discourse analysis. At the same time, we are building an extensive digital corpus on the same topic based on the use and further development of entity-centered research tools such as topic crawlers and automated newsreaders. In addition to the text material, this consists of multimodal sources such as images, video sequences, and apps. In a blended reading process, the data material is filtered, annotated, and finally coded with the help of NLP tools such as dependency parsing, named entity recognition, co-reference resolution, entity linking, sentiment analysis, and other project-specific tools that are being adapted and developed. The coding process is carried out (semi-)automated by programs that propose coding paradigms based on the calculated entities and their relationships. 
Simultaneously, these can be specifically trained by manual coding in a closed reading process and specified according to the content issues. Overall, this approach enables purely qualitative, fully automated, and semi-automated analyses to be compared and reflected upon.
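The (semi-)automated coding-proposal step can be sketched in miniature; the gazetteer below stands in for the project's real NER and entity-linking components, and all terms and code names are invented for illustration:

```python
# Toy gazetteer mapping discourse-relevant terms to candidate codes;
# in D-WISE these candidates come from trained NLP components, not a
# hand-written dictionary
GAZETTEER = {
    "GDPR": "data_protection",
    "electronic patient record": "digitization_healthcare",
    "health insurer": "actor",
}

def propose_codes(sentence: str) -> list[str]:
    """Return candidate codes for one sentence; an annotator confirms or
    rejects them in the closed-reading pass, which refines the proposer."""
    return sorted({code for term, code in GAZETTEER.items()
                   if term.lower() in sentence.lower()})

codes = propose_codes("The health insurer cited the GDPR when introducing "
                      "the electronic patient record.")
```

The key design point is the human-in-the-loop cycle: automatically proposed codes are cheap to generate at corpus scale, while manual confirmation in close reading keeps the coding hermeneutically grounded.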

Keywords: entanglement of structural IT and hermeneutic-interpretative analysis, multimodality, plurality of meaning, sociology of knowledge approach to discourse analysis

Procedia PDF Downloads 204
1285 Sexual Dimorphism in the Sensorial Structures of the Antenna of Thygater aethiops (Hymenoptera: Apidae) and Its Relation with Some Corporal Parameters

Authors: Wendy Carolina Gomez Ramirez, Rodulfo Ospina Torres

Abstract:

Thygater aethiops is a species of solitary bee with a neotropical distribution that has adapted to urban environments. This species presents marked sexual dimorphism, since the males have antennae almost as long as their body, unlike the females, whose antennae are smaller. In this work, placoid sensilla were studied; these are structures on the antenna involved in the detection of substances both for reproduction and for the search for food. The aim of this study was to evaluate the differences in these sensory structures between the sexes, for which males and females were captured. Some body measures were then taken, such as fresh weight with and without the abdomen (since the weight could be modified by the stomach content), as well as the total antenna length and the lengths of the flagellum and flagellomeres. Negative imprints of the antennae were made using nail polish; each imprint was cut with a microblade and mounted on a microscope slide. The placoid sensilla were visible on the imprint, so they were counted manually under the 100x objective lens of an optical microscope. The males presented a specific distribution pattern in two types of sensilla, trichoid and placoid: the trichoid sensilla were aligned on the dorsal face of the antenna, and the placoid sensilla were distributed along the entire antenna. This differed from the females, which did not present a distribution pattern; their sensilla were randomly organized. Because the males have longer antennae, they have a greater number of sensilla than the females. Additionally, it was found that there was no relationship between weight and the number of sensilla, but there was a positive relationship between antenna length, flagellum length, and the number of sensilla.
The number of sensilla per unit area was also calculated for each sex, showing that males have on average 4.2 ± 0.38 sensilla per unit area and females 2.2 ± 0.20, likewise a significant difference between the sexes. The dimorphism found may be related to the sexual behavior of the species, since it has been demonstrated that males are more adapted to the perception of substances related to reproduction than to the search for food.

Keywords: antenna, olfactory organ, sensilla, sexual dimorphism, solitary bees

Procedia PDF Downloads 142
1284 Innovative Screening Tool Based on Physical Properties of Blood

Authors: Basant Singh Sikarwar, Mukesh Roy, Ayush Goyal, Priya Ranjan

Abstract:

This work combines two bodies of knowledge: the biomedical basis of blood stain formation and the fluid mechanics community's understanding that such stain formation depends heavily on physical properties. Moreover, biomedical research indicates that different patterns in blood stains are robust indicators of a blood donor's health or lack thereof. Based on these valuable insights, an innovative screening tool is proposed which can act as an aid in the diagnosis of diseases such as anemia, hyperlipidaemia, tuberculosis, blood cancer, leukemia, and malaria, with enhanced confidence in the proposed analysis. To realize this technique, simple, robust, and low-cost micro-fluidic devices, a micro-capillary viscometer, and a pendant drop tensiometer are designed and proposed to be fabricated to measure the viscosity, surface tension, and wettability of various blood samples. Once prognosis and diagnosis data have been generated, automated linear and nonlinear classifiers are applied for automated reasoning and presentation of results. A support vector machine (SVM) classifies the data in a linear fashion. Discriminant analysis and nonlinear embeddings are coupled with nonlinear manifold detection in the data, and decisions are made accordingly. In this way, physical properties can be used, via linear and nonlinear classification techniques, for the screening of various diseases in humans and cattle. Experiments are carried out to validate the physical property measurement devices. This framework can be further developed into a portable, real-life disease screening and diagnostics tool. Small-scale production of the screening and diagnostic devices is proposed to carry out independent tests.
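The linear SVM stage can be sketched as follows; the feature values and class separations are invented placeholders, since the real decision boundaries must come from measured prognosis and diagnosis data:

```python
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(2)
# Hypothetical features per sample: (viscosity mPa*s, surface tension mN/m,
# contact angle deg) for "healthy" (0) vs "anemic" (1) blood
healthy = rng.normal([4.5, 55.0, 70.0], [0.4, 2.0, 5.0], size=(40, 3))
anemic  = rng.normal([3.2, 60.0, 60.0], [0.4, 2.0, 5.0], size=(40, 3))

X = np.vstack([healthy, anemic])
y = np.array([0] * 40 + [1] * 40)

# Linear SVM on standardized physical properties; swapping in
# SVC(kernel="rbf") gives a nonlinear counterpart to pair with the
# manifold-based methods mentioned above
clf = make_pipeline(StandardScaler(), SVC(kernel="linear")).fit(X, y)
pred = clf.predict([[3.1, 61.0, 58.0]])  # a strongly anemic-like sample
```

Standardizing first matters here because viscosity, surface tension, and contact angle live on very different scales, and an unscaled linear SVM would be dominated by the largest-magnitude feature.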

Keywords: blood, physical properties, diagnostic, nonlinear, classifier, device, surface tension, viscosity, wettability

Procedia PDF Downloads 355
1283 Designing Automated Embedded Assessment to Assess Student Learning in a 3D Educational Video Game

Authors: Mehmet Oren, Susan Pedersen, Sevket C. Cetin

Abstract:

Despite the frequently criticized disadvantages of traditional paper-and-pencil assessment, it remains the most widely used method in our schools. Although such assessments provide an acceptable measurement, they cannot capture all aspects and the richness of learning and knowledge. Moreover, many assessments used in schools decontextualize assessment from learning; they focus on a learner's standing on a particular topic but not on how student learning changes over time. For these reasons, many scholars argue that using simulations and games (S&G) as assessment tools has significant potential to overcome the problems of traditional methods. S&G can benefit from advances in technology and provide a contextualized medium for assessment and teaching. Furthermore, S&G can serve as an instructional tool rather than merely a method to test student learning at a particular point in time. To investigate the potential of educational games as assessment and teaching tools, this study presents the implementation and validation of an automated embedded assessment (AEA) that can continuously monitor student learning in the game and assess performance without interrupting learning. The experiment was conducted in an undergraduate engineering course (Digital Circuit Design) with 99 participating students over a period of five weeks in the Spring 2016 semester. The purpose of this study is to examine whether the proposed AEA is valid for assessing student learning in a 3D educational game, and to present the implementation steps. To address this question, the study inspects three aspects of the AEA for validation. First, the evidence-centered design model was used to lay out the design and measurement steps of the assessment. Then, a confirmatory factor analysis was conducted to test whether the assessment measures the targeted latent constructs. Finally, the assessment scores were compared with an external measure (a validated test of student learning in digital circuit design) to evaluate the convergent validity of the assessment. The confirmatory factor analysis showed that the fit of a model with three latent factors and one higher-order factor was acceptable (RMSEA < 0.001, CFI = 1, TLI = 1.013, WRMR = 0.390). All observed variables loaded significantly on the latent factors. In the second analysis, multiple regression was used to test whether the external measure significantly predicts students' performance in the game. The regression indicated that the two predictors explained 36.3% of the variance (R² = .36, F(2, 96) = 27.42, p < .001). Students' posttest scores significantly predicted game performance (β = .60, p < .001). These statistical results show that the AEA can distinctly measure the three major components of the digital circuit design course. It is hoped that this study helps researchers understand how to design an AEA and showcases an implementation by providing an example methodology for validating this type of assessment.
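The convergent-validity regression described above can be sketched in a few lines. The data here are synthetic and the effect sizes are invented; only the analysis shape (two predictors, ordinary least squares, R² as variance explained) mirrors the study.

```python
# Sketch (synthetic data, not the study's): regressing in-game performance on an
# external posttest score plus a second predictor, then computing R².
import numpy as np

rng = np.random.default_rng(42)
n = 99  # same sample size as the study

posttest = rng.uniform(40, 100, n)  # hypothetical external validated test score
pretest = rng.uniform(20, 80, n)    # hypothetical second predictor
# Assumed data-generating process: posttest drives game performance, plus noise
game = 0.6 * posttest + 0.1 * pretest + rng.normal(0, 10, n)

# Ordinary least squares with an intercept column
X = np.column_stack([np.ones(n), posttest, pretest])
beta, *_ = np.linalg.lstsq(X, game, rcond=None)

# R²: fraction of variance in game performance explained by the predictors
pred = X @ beta
r2 = 1 - np.sum((game - pred) ** 2) / np.sum((game - game.mean()) ** 2)
print(f"R^2 = {r2:.3f}, posttest coefficient = {beta[1]:.2f}")
```

A significant positive coefficient on the posttest, with a substantial R², is what supports the convergent-validity claim: the embedded assessment and the external test are measuring overlapping constructs.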

Keywords: educational video games, automated embedded assessment, assessment validation, game-based assessment, assessment design

Procedia PDF Downloads 401
1282 Automated Evaluation Approach for Time-Dependent Question Answering Pairs on Web Crawler Based Question Answering System

Authors: Shraddha Chaudhary, Raksha Agarwal, Niladri Chatterjee

Abstract:

This work demonstrates a web-crawler-based, generalized, end-to-end open-domain Question Answering (QA) system. An efficient QA system requires a significant amount of domain knowledge to answer any question, the aim being to find an exact and correct answer in the form of a number, a noun, a short phrase, or a brief piece of text for the user's question. Analyzing the question, searching for relevant documents, and choosing an answer are the three main steps in a QA system. This work uses a web scraper (Beautiful Soup) to extract K documents from the web; the value of K can be calibrated as a trade-off between time and accuracy. This is followed by a passage-ranking step, using a model trained on the MS MARCO dataset of 500K queries, to extract the most relevant text passage and thereby shorten lengthy documents. A QA model then extracts answers from the shortened documents based on the query and returns the top three answers. In evaluating such systems, accuracy is judged by exact match between predicted answers and gold answers, but automatic evaluation methods fail because of linguistic ambiguities inherent in the questions. Moreover, reference answers are often not exhaustive, or are out of date, so correct answers predicted by the system are often judged incorrect by automated metrics. One such scenario arises with the original Google Natural Questions (GNQ) dataset, which was collected and made available in 2016. Any such dataset proves inadequate for questions whose answers vary over time. For illustration, consider the query "Where will the next Olympics be?" The gold answer given in the GNQ dataset is "Tokyo". Since the dataset was collected in 2016, and the next Olympics after 2016 were the 2020 Games held in Tokyo, this is correct. But if the same question is asked in 2022, the answer is "Paris, 2024".
Consequently, any evaluation based on the GNQ dataset will be incorrect. Such erroneous predictions are usually passed to human evaluators for further validation, which is expensive and time-consuming. To address this erroneous evaluation, the present work proposes an automated approach for evaluating time-dependent question-answer pairs. In particular, it proposes a metric that uses the current timestamp together with the top-n predicted answers from a given QA system. To test the proposed approach, the GNQ dataset was used, and the system achieved an accuracy of 78% on a test set of 100 QA pairs, automatically extracted via an analysis-based approach from 10K QA pairs of the GNQ dataset. The results obtained are encouraging: the proposed technique appears capable of developing into a useful scheme for gathering precise, reliable, and specific information in a real-time and efficient manner. Subsequent experiments will aim to establish the efficacy of the system on a larger set of time-dependent QA pairs.
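One way such a timestamp-aware metric could work is sketched below. This is an assumed scheme, not the authors' exact metric: it supposes a small table of gold answers with validity windows for a time-varying question, and credits the system if any of its top-n predictions matches the gold answer that is valid at evaluation time.

```python
# Sketch (assumed scheme): time-aware exact-match credit for a time-varying
# question, using the Olympics example from the abstract.
from datetime import date

# Hypothetical gold answers with validity windows for
# "Where will the next Olympics be?"
gold_by_period = [
    (date(2016, 1, 1), date(2021, 8, 8), "tokyo"),
    (date(2021, 8, 9), date(2024, 8, 11), "paris"),
]

def current_gold(today):
    """Return the gold answer whose validity window contains `today`."""
    for start, end, answer in gold_by_period:
        if start <= today <= end:
            return answer
    return None

def is_correct(top_n_predictions, today):
    """Credit the system if any top-n answer contains the time-valid gold."""
    gold = current_gold(today)
    return gold is not None and any(gold in p.lower() for p in top_n_predictions)

# Asked in 2022, "Paris, 2024" is correct even though GNQ's gold says "Tokyo"
print(is_correct(["Paris, 2024", "Tokyo"], date(2022, 6, 1)))  # True
```

Maintaining the validity windows is the hard part in practice; the abstract's proposal of combining the current timestamp with top-n predictions is one way to automate what would otherwise fall to human evaluators.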

Keywords: web-based information retrieval, open domain question answering system, time-varying QA, QA evaluation

Procedia PDF Downloads 76
1281 Zinc Oxide Nanowires: Device Fabrication and Optical Properties

Authors: Igori Wallace

Abstract:

Zinc oxide (ZnO) nanowires with a hexagonal structure were successfully synthesized by the chemical bath deposition technique. The obtained nanowires were characterized by scanning electron microscopy (SEM) and energy-dispersive X-ray analysis (EDX). The SEM micrographs revealed the morphology of the ZnO nanowires, with diameters between 170.3 and 481 nm, and showed that the normal pH of the bath solution, 8.1, is the optimal value for forming hexagonal ZnO nanowires. The compositional (EDX) analysis revealed the elemental composition of the samples and confirmed the presence of Zn and O.

Keywords: crystallite, chemical bath deposition technique, hexagonal, morphology, nanowire

Procedia PDF Downloads 286