Search results for: software reuse
4074 Cache Analysis and Software Optimizations for Faster on-Chip Network Simulations
Authors: Khyamling Parane, B. M. Prabhu Prasad, Basavaraj Talawar
Abstract:
Fast simulations are critical in reducing time to market for CMPs and SoCs. Several simulators have been used to evaluate the performance and power consumed by Networks-on-Chip. Researchers and designers rely upon these simulators for design space exploration of NoC architectures. Our experiments show that simulating large NoC topologies takes hours to several days to complete. To speed up the simulations, it is necessary to investigate and optimize the hotspots in the simulator source code. Among the several simulators available, we chose Booksim2.0, as it is extensively used in the NoC community. In this paper, we analyze the cache and memory system behaviour of Booksim2.0 to accurately monitor input-dependent performance bottlenecks. Our measurements show that cache and memory usage patterns vary widely based on the input parameters given to Booksim2.0. Based on these measurements, the cache configuration with the fewest misses has been identified. To further reduce cache misses, we use software optimization techniques such as removal of unused functions, loop interchange and replacement of the post-increment operator with the pre-increment operator for non-primitive data types. The cache misses were reduced by 18.52%, 5.34% and 3.91% by employing the above techniques, respectively. We also employ thread parallelization and vectorization to improve the overall performance of Booksim2.0. The OpenMP programming model and SIMD are used for parallelizing and vectorizing the more time-consuming portions of Booksim2.0. Speedups of 2.93x and 3.97x were observed for the Mesh topology with a 30 × 30 network size by employing thread parallelization and vectorization, respectively.
Keywords: cache behaviour, network-on-chip, performance profiling, vectorization
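Booksim2.0 itself is C++, so the exact optimizations above (pre-increment on iterators, OpenMP pragmas) are not reproduced here; the sketch below only illustrates the loop-interchange idea in Python/NumPy, with an arbitrary array size, and in an interpreted language the cache effect can be partly masked by interpreter overhead.

```python
import time
import numpy as np

def column_major_sum(a):
    """Traverse a row-major array column by column (strided, cache-unfriendly order)."""
    total = 0.0
    rows, cols = a.shape
    for j in range(cols):
        for i in range(rows):
            total += a[i, j]
    return total

def row_major_sum(a):
    """Same computation after loop interchange: traversal matches the row-major layout.
    In C++ this ordering strongly reduces cache misses; in Python the interpreter
    overhead dominates, so treat the timing only as an illustration of the idea."""
    total = 0.0
    rows, cols = a.shape
    for i in range(rows):
        for j in range(cols):
            total += a[i, j]
    return total

if __name__ == "__main__":
    a = np.random.rand(2000, 2000)
    for fn in (column_major_sum, row_major_sum):
        t0 = time.perf_counter()
        fn(a)
        print(f"{fn.__name__}: {time.perf_counter() - t0:.2f} s")
```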
Procedia PDF Downloads 200
4073 Revitalization of the Chinese Residential at Lasem, Indonesia
Authors: Nurtati Soewarno, Dian Duhita
Abstract:
The existence of past civilizations is recognized through the objects they left behind, such as monuments, buildings or even entire towns. These relics were designed and built well, using good-quality materials, so they could persist for a long period of time. Today, those relics are cultural heritage that must be preserved and whose authenticity must be maintained. Indonesia, a country consisting of various ethnic groups, holds many cultural heritages; one of them is the city of Lasem. Lasem lies in the northern part of Central Java and, since the Majapahit kingdom era (13th century), served as a busy harbor city and trading center. Lasem is one of the settlements of Chinese immigrants in Java, as seen in the dominance of Chinese architectural building styles. The residences were built from the 15th century onwards, and the buildings have courtyards, which distinguishes them from other Chinese buildings in other parts of Java. The city lost ground when trading activity declined during the Japanese colonial era, a decline that continued after Indonesian independence. Many Chinese residents left Lasem and abandoned the buildings, leaving them empty and unmaintained. This paper presents the results of observations of the Chinese-style buildings in Lasem that still survive today. Using the typo-morphology method, the case studies were chosen based on the type of transformation. The occurring transformation is in line with the adaptive reuse concept as an effort to revitalize the buildings. With this concept, it is expected that the buildings can be given new functions and the glory of old Lasem can be experienced again. Intervention from the local government is expected in the form of regulations, so that the new building functions do not ruin the cultural heritage but instead enhance it.
Keywords: adaptive re-use, brownfield area, building transformation, Lasem city
Procedia PDF Downloads 365
4072 Theoretical Modal Analysis of Freely and Simply Supported RC Slabs
Authors: M. S. Ahmed, F. A. Mohammad
Abstract:
This paper focuses on the dynamic behavior of reinforced concrete (RC) slabs. Theoretical modal analysis was performed using two different types of boundary conditions. Modal analysis is one of the most important dynamic analyses; the analysis is a modal case when there is no external force acting on the structure. Using this method, the effects of freely and simply supported boundary conditions on the frequencies and mode shapes of RC square slabs are studied. ANSYS software was employed to derive the finite element model and determine the natural frequencies and mode shapes of the slabs. The results obtained through numerical analysis (finite element analysis) were then compared with an exact solution. The main goal of the research is to predict how the boundary conditions change the behavior of the slab structures prior to performing experimental modal analysis. Based on the results, it is concluded that the simply supported boundary condition clearly increases the natural frequencies and changes the mode shapes compared with the freely supported boundary condition. This means that such support conditions have a direct influence on the dynamic behavior of the slabs. It is therefore suggested to use the free-free boundary condition in experimental modal analysis to precisely reflect the properties of the structure, since with free-free boundary conditions the influence of poorly defined supports is eliminated.
Keywords: natural frequencies, mode shapes, modal analysis, ANSYS software, RC slabs
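For the simply supported case, the finite element frequencies can be cross-checked against the classical closed-form solution for a thin (Kirchhoff) plate. The sketch below implements that textbook formula; the concrete-like material values are illustrative assumptions, not the slab data used in the paper.

```python
import math

def natural_frequencies_ss_plate(E, nu, rho, h, a, b, modes=3):
    """Classical thin-plate (Kirchhoff) natural frequencies, all edges simply supported.
    E [Pa], nu [-], rho [kg/m^3], h, a, b [m]; returns (m, n, frequency in Hz)."""
    D = E * h**3 / (12.0 * (1.0 - nu**2))      # flexural rigidity
    freqs = []
    for m in range(1, modes + 1):
        for n in range(1, modes + 1):
            omega = math.pi**2 * ((m / a)**2 + (n / b)**2) * math.sqrt(D / (rho * h))
            freqs.append(((m, n), omega / (2.0 * math.pi)))
    return sorted(freqs, key=lambda x: x[1])

if __name__ == "__main__":
    # Illustrative concrete-like properties and slab size; not the values from the study.
    for (m, n), f in natural_frequencies_ss_plate(E=30e9, nu=0.2, rho=2400.0,
                                                  h=0.15, a=4.0, b=4.0)[:5]:
        print(f"mode ({m},{n}): {f:.1f} Hz")
```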
Procedia PDF Downloads 457
4071 Development of Star Image Simulator for Star Tracker Algorithm Validation
Authors: Zoubida Mahi
Abstract:
A successful satellite mission in space requires a reliable attitude and orbit control system to command, control and position the satellite in the appropriate orbit. Several sensors are used for attitude control, such as magnetic sensors, earth sensors, horizon sensors, gyroscopes, and solar sensors. The star tracker is the most accurate of these sensors and can provide high-accuracy attitude control without the need for prior attitude information. There are mainly three approaches in star sensor research: digital simulation, hardware-in-the-loop simulation, and field tests of star observation. In the digital simulation approach, all of the processing is done in software, including star image simulation. Hence, it is necessary to develop star image simulation software that can reproduce real space environments and various star sensor configurations. In this paper, we present a new star image simulation tool used to test and validate star sensor algorithms; the developed tool allows the simulation of star images with several types of noise, such as background noise, Gaussian noise, Poisson noise and multiplicative noise, and several scenarios that occur in space, such as the presence of the moon, optical system problems, illumination and false objects. We also present a new star extraction algorithm based on a new centroid calculation method. We compared our algorithm with other star extraction algorithms from the literature, and the results obtained show the star extraction capability of the proposed algorithm.
Keywords: star tracker, star simulation, star detection, centroid, noise, scenario
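The sketch below is not the authors' simulator or their new centroid method; it is a toy illustration, under assumed image sizes and star positions, of two ingredients named in the abstract: background, Gaussian and Poisson noise, and a standard intensity-weighted centroid.

```python
import numpy as np

def add_star(image, x0, y0, flux, sigma=1.5):
    """Render a star as a 2-D Gaussian point-spread function."""
    yy, xx = np.indices(image.shape)
    image += flux * np.exp(-((xx - x0)**2 + (yy - y0)**2) / (2.0 * sigma**2))

def simulate_frame(shape=(128, 128), background=20.0, read_noise=3.0, seed=0):
    """Toy star image with background level, Poisson (shot) and Gaussian (read) noise."""
    rng = np.random.default_rng(seed)
    img = np.full(shape, background, dtype=float)
    add_star(img, 40.3, 70.8, flux=800.0)
    add_star(img, 90.1, 30.5, flux=500.0)
    img = rng.poisson(img).astype(float)          # photon shot noise
    img += rng.normal(0.0, read_noise, shape)     # sensor read noise
    return img

def centroid(window):
    """Intensity-weighted centroid of a star window (a common centroiding scheme)."""
    w = np.clip(window - np.median(window), 0.0, None)   # crude background removal
    yy, xx = np.indices(w.shape)
    return (xx * w).sum() / w.sum(), (yy * w).sum() / w.sum()

if __name__ == "__main__":
    frame = simulate_frame()
    win = frame[60:82, 30:52]                     # window around the star placed at (40.3, 70.8)
    cx, cy = centroid(win)
    print(f"centroid estimate: x={cx + 30:.2f}, y={cy + 60:.2f} (true 40.30, 70.80)")
```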
Procedia PDF Downloads 97
4070 Simulation and Controller Tuning in a Photo-Bioreactor Applying the Taguchi Method
Authors: Hosein Ghahremani, MohammadReza Khoshchehre, Pejman Hakemi
Abstract:
This study involves numerical simulations of a vertical plate-type photo-bioreactor to investigate the performance of the microalga Spirulina, together with control and optimization of the digital controller parameters by the Taguchi method, using MATLAB and Qualitek-4 software. Because, in addition to parameters such as temperature, dissolved carbon dioxide and biomass, new physical parameters such as light intensity and physiological conditions such as photosynthetic efficiency and light inhibition are involved in the biological process, control faces many challenges. Photo-bioreactors not only facilitate the efficient commercial production of microalgae as feed for aquaculture and as food supplements, but are also used as a possible platform for the production of active molecules such as antibiotics or innovative anti-tumor agents, for carbon dioxide removal and for the removal of heavy metals from wastewater. A digital controller is designed to control the light of the bioreactor, and the microalgae growth rate and carbon dioxide concentration inside the bioreactor are investigated. The optimal values of the controller parameters obtained from the S/N and ANOVA analyses in Qualitek-4 were compared with those obtained by the reaction curve, Cohen-Coon and Ziegler-Nichols methods. Based on the sum of squared errors obtained for each of the control methods mentioned, the Taguchi method was selected as the best method for controlling the light intensity of the photo-bioreactor. Compared with the listed control methods, it showed higher stability and a shorter response time.
Keywords: photo-bioreactor, control and optimization, light intensity, Taguchi method
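A minimal sketch of the Taguchi signal-to-noise ratio for a smaller-is-better response (such as controller tracking error), which is the kind of criterion compared above; the error samples and method labels are hypothetical, not the study's data.

```python
import math

def sn_smaller_is_better(errors):
    """Taguchi S/N ratio for a smaller-is-better characteristic:
    S/N = -10 * log10(mean of squared values)."""
    n = len(errors)
    return -10.0 * math.log10(sum(e * e for e in errors) / n)

if __name__ == "__main__":
    # Hypothetical tracking-error samples for three tuning methods (illustration only).
    runs = {
        "Taguchi":          [0.8, 0.6, 0.7],
        "Cohen-Coon":       [1.4, 1.6, 1.5],
        "Ziegler-Nichols":  [1.2, 1.1, 1.3],
    }
    for method, err in runs.items():
        # A higher S/N (in dB) means a smaller, more consistent error.
        print(f"{method:16s} S/N = {sn_smaller_is_better(err):6.2f} dB")
```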
Procedia PDF Downloads 395
4069 Modal Analysis of Functionally Graded Materials Plates Using Finite Element Method
Authors: S. J. Shahidzadeh Tabatabaei, A. M. Fattahi
Abstract:
Modal analysis of an FGM plate composed of an Al2O3 ceramic phase and a 304 stainless steel metal phase was performed in this paper with ABAQUS software, under the assumption that the material behavior is elastic and the mechanical properties (Young's modulus and density) vary in the thickness direction of the plate. A subroutine was therefore written in the FORTRAN programming language and linked with ABAQUS. For the modal analysis, a finite element analysis was carried out similar to the models of other researchers, and the accuracy of the results was evaluated by comparison. The comparison of natural frequencies and mode shapes reflected the compatibility of the results and the proper performance of the FORTRAN subroutine, as well as the high accuracy of the finite element model used in this research. After validation of the results, the effect of the material (the n parameter) on the natural frequency was evaluated. In this regard, finite element analyses were carried out for different values of n under simply supported conditions. Regarding the effect of the n parameter, which expresses the influence of the material composition on the natural frequency, it was observed that the natural frequency decreased as n increased; by increasing n, the share of the ceramic phase in the FGM plate decreases and the share of the steel phase increases, which reduces the stiffness of the FGM plate and thereby the natural frequency. This is because the Young's modulus of Al2O3 ceramic is 380 GPa while that of SUS304 steel is 207 GPa.
Keywords: FGM plates, modal analysis, natural frequency, finite element method
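The n parameter discussed above is commonly implemented as a power-law grading of the properties through the thickness. Assuming that form (the paper's FORTRAN subroutine is not reproduced here), a minimal sketch:

```python
def power_law_property(P_ceramic, P_metal, z, h, n):
    """Through-thickness property of an FGM plate under the common power-law rule:
    P(z) = (Pc - Pm) * (z/h + 1/2)**n + Pm,  with z in [-h/2, +h/2]."""
    V_c = (z / h + 0.5) ** n          # ceramic volume fraction at height z
    return (P_ceramic - P_metal) * V_c + P_metal

if __name__ == "__main__":
    E_c, E_m = 380e9, 207e9           # Al2O3 and SUS304 Young's moduli [Pa], from the abstract
    h = 0.02                          # assumed plate thickness [m], illustration only
    for n in (0.5, 1.0, 5.0):
        E_mid = power_law_property(E_c, E_m, z=0.0, h=h, n=n)
        print(f"n={n}: E at mid-plane = {E_mid/1e9:.1f} GPa")
    # Larger n -> smaller ceramic fraction -> lower stiffness, hence lower natural frequency,
    # consistent with the trend reported in the abstract.
```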
Procedia PDF Downloads 391
4068 Molecular Comparison of HEV Isolates from Sewage & Humans at Western India
Authors: Nidhi S. Chandra, Veena Agrawal, Debprasad Chattopadhyay
Abstract:
Background: Hepatitis E virus (HEV) is a major cause of acute viral hepatitis in developing countries. It spreads via the feco-oral route, mainly through contamination of drinking water by sewage. There are limited data on the genotypic comparison of HEV isolates from sewage water and humans. The aim of this study was to identify the genotype and conduct a phylogenetic analysis of HEV isolates from sewage water and humans. Materials and Methods: 14 sewage water and 60 serum samples from acute sporadic hepatitis E cases (negative for hepatitis A, B, C) were tested for HEV RNA by nested polymerase chain reaction (RT-nPCR) using primers designed within the RdRp (RNA-dependent RNA polymerase) region of open reading frame-1 (ORF-1). Sequencing was done on an ABI Prism 310. The sequences (343 nucleotides) were compared with each other and aligned with previously reported HEV sequences obtained from GenBank using Clustal W software. A phylogenetic tree was constructed using PHYLIP version 3.67 software. Results: HEV RNA was detected in 49/60 (81.67%) serum and 5/14 (35.71%) sewage samples. The sequences obtained from 17 serum and 2 sewage specimens belonged to genotype I with 85% similarity, clustering with previously reported human HEV sequences from India. HEV isolates from humans and sewage in northwest India are genetically closely related to each other. Conclusion: These findings suggest that sewage acts as a reservoir of HEV. It is therefore important that measures are taken for proper waste disposal and treatment of drinking water to prevent outbreaks and epidemics due to HEV.
Keywords: hepatitis E virus, nested polymerase chain reaction, open reading frame-1, nucleotides
Procedia PDF Downloads 379
4067 Effects of Foliar Application of Glycine Betaine under Nickel Toxicity of Oat (Avena Sativa L.)
Authors: Khizar Hayat Bhatti, Fiza Javed, Misbah Zafar
Abstract:
Oat (Avena sativa L.) is a major cereal crop belonging to the family Poaceae. It is a very important source of carbohydrates, starch, minerals, vitamins and proteins that are beneficial for general health. Plants growing in heavy-metal-contaminated soils show a decline in growth. Glycine betaine application may improve plant growth, survival and resistance to the metabolic disturbances caused by stress. Heavy metals, such as nickel, have accumulated in soils over a long period because of industrial waste and sewage. The experiment was intended to alleviate the detrimental effects of nickel stress on two oat varieties, ‘Sgd-2011’ and ‘Hay’, using glycine betaine (GB). Nickel was applied through the soil, while GB was applied as a foliar spray. After 10 days of nickel treatment, an exogenous spray of glycine betaine was applied to the intact plant leaves. Data analysis was carried out using a Completely Randomized Design (CRD) with three replications. Minitab 19 software was used to compare the mean values of all treatments, and Microsoft Excel was used to generate the bar graphs. Significantly accelerated plant growth was recorded when Ni-exposed plants were treated with GB. Based on the findings, 3 mM GB caused significant recovery from the Ni stress doses. Overall, the results also demonstrated that the Sgd-2011 oat variety showed the best outcomes for all parameters.
Keywords: CRD, foliar spray method, glycine betaine, heavy metals, nickel, ROS
Procedia PDF Downloads 11
4066 Efficiency of a Molecularly Imprinted Polymer for Selective Removal of Chlorpyrifos from Water Samples
Authors: Oya A. Urucu, Aslı B. Çiğil, Hatice Birtane, Ece K. Yetimoğlu, Memet Vezir Kahraman
Abstract:
Chlorpyrifos is an organophosphorus pesticide which can be found in environmental water samples. The efficiency and reusability of a molecularly imprinted polymer (chlorpyrifos-MIP) were investigated for the selective removal of chlorpyrifos residues. The MIP was prepared with UV-curing thiol-ene polymerization technology using multifunctional thiol and ene monomers. The thiol-ene curing reaction is a radical-induced process; however, unlike other photoinitiated polymerization processes, it is a free-radical reaction that proceeds by a step-growth mechanism involving two main steps: a free-radical addition followed by a chain transfer reaction. It assures a very rapid formation of a uniform crosslinked network with low shrinkage, reduced oxygen inhibition during curing and excellent adhesion. In this study, thiol-ene based UV-curable polymeric materials were prepared by mixing pentaerythritol tetrakis(3-mercaptopropionate), glyoxal bis diallyl acetal, polyethylene glycol diacrylate (PEGDA) and photoinitiator. Chlorpyrifos was added at a definite ratio to the prepared formulation. The chemical structure and thermal properties were characterized by FTIR and thermogravimetric analysis (TGA), respectively. The pesticide analysis was performed by gas chromatography-mass spectrometry (GC-MS). The influence of analytical parameters such as pH, sample volume and analyte concentration was studied for the quantitative recovery of the analyte. The proposed MIP method was applied to the determination of chlorpyrifos in river and tap water samples. The use of the MIP provided a selective and easy solution for removing chlorpyrifos from water.
Keywords: molecularly imprinted polymers, selective removal, thiol-ene, UV-curable polymer
Procedia PDF Downloads 302
4065 Integration of Technology through Instructional Systems Design
Authors: C. Salis, D. Zedda, M. F. Wilson
Abstract:
The IDEA project was conceived for teachers who are interested in enhancing their capacity to effectively implement specific technologies in their teaching practice. Participating teachers are coached and supported as they explore technologies applied to the educational context. They access tools such as the technological platform developed by our team. Among the platform's functionalities, teachers access an instructional systems design (ISD) tool (a learning designer) that was adapted to the needs of our project. The tool is accessible from computers or mobile devices and is used in association with other technologies to create new, meaningful learning environments. The objective of instructional systems design is to guarantee the quality and effectiveness of education and to enhance learning. This goal involves both teachers who want to become more efficient in transferring knowledge or skills and students as the final recipients of their teaching. The use of Bloom's taxonomy enables teachers to classify learning objectives into levels of complexity and specificity, thus making it possible to highlight the kind of knowledge teachers would like their students to reach. The fact that the instructional design features can be visualized through the IDEA platform is a guarantee for those who are looking for specific educational materials to be used in their lessons. Despite the benefits offered, a number of teachers are reluctant to use ISD because the preparatory work of thoroughly analyzing the teaching/learning objectives and planning the learning material, assessment activities, etc., is long and felt to be time-consuming. This drawback is minimized by using a learning designer, as the tool facilitates the reuse of didactic contents while giving a clear view of the processes of analysis, planning and production of educational or testing materials uploaded to our platform. In this paper, we present the feedback of the teachers who used our tool in their teaching.
Keywords: educational benefits, educational quality, educational technology, ISD tool
Procedia PDF Downloads 188
4064 Novel Bioinspired Design to Capture Smoky CO2 by Reactive Absorption with Aqueous Scrubber
Authors: J. E. O. Hernandez
Abstract:
In the next 20 years, energy production by burning fuels will increase, and so will the atmospheric concentration of CO2 and its well-known threats to life on Earth. The technologies available for capturing CO2 are still dubious, and this keeps fostering interest in bio-inspired approaches. The leading one is the application of carbonic anhydrase (CA), a superfast biocatalyst able to convert up to one million molecules of CO2 into carbonates in water. However, natural CA underperforms when applied to real smoky CO2 in chimneys and, so far, the efforts to create superior CAs in the lab rely on screening methods running under pristine conditions at the micro level, which are far from resembling those in chimneys. For the evolution of man-made enzymes, selection rather than screening would be ideal, but this is challenging because of the need for a suitable artificial environment that is also sustainable for our society. Herein we present the stepwise design and construction of a bioprocess (from bench scale to semi-pilot) for evolutionary selection experiments. In this bioprocess, reaction and absorption took place simultaneously at atmospheric pressure in a spray tower. The scrubbing solution was fed countercurrently by reusing municipal water pressure and was mainly prepared with water, carbonic anhydrase and calcium chloride. This bioprocess allowed the enzymatic carbonation of smoky CO2, the reuse of process water and the recovery of solid carbonates without cooling of the smoke, pretreatments, amine solvents or compression of CO2. The average yield of solid carbonates was 0.54 g min-1, or 12-fold the amount produced in serum bottles at lab bench scale. This bioprocess could be used as a tailor-made environment for driving the selection of superior CAs. The bioprocess and its matching CA could be sustainably used to reduce global warming caused by CO2 emissions from exhausts.
Keywords: biological carbon capture and sequestration, carbonic anhydrase, directed evolution, global warming
Procedia PDF Downloads 193
4063 The DAQ Debugger for iFDAQ of the COMPASS Experiment
Authors: Y. Bai, M. Bodlak, V. Frolov, S. Huber, V. Jary, I. Konorov, D. Levit, J. Novy, D. Steffen, O. Subrt, M. Virius
Abstract:
In general, state-of-the-art data acquisition systems (DAQ) in high energy physics experiments must satisfy high requirements in terms of reliability, efficiency and data rate capability. This paper presents the development and deployment of a debugging tool named DAQ Debugger for the intelligent, FPGA-based Data Acquisition System (iFDAQ) of the COMPASS experiment at CERN. Utilizing a hardware event builder, the iFDAQ is designed to read out data at the experiment's average maximum rate of 1.5 GB/s. In complex software such as the iFDAQ, comprising thousands of lines of code, the debugging process is absolutely essential to reveal all software issues. Unfortunately, conventional debugging of the iFDAQ is not possible during real data taking. The DAQ Debugger is a tool for identifying a problem, isolating its source, and then either correcting the problem or determining a way to work around it. It provides a layer that is easy to integrate into any process and has no impact on process performance. Based on the handling of system signals, the DAQ Debugger represents an alternative to the conventional debuggers provided by most integrated development environments. Whenever a problem occurs, it generates reports containing all the information needed for deeper investigation and analysis. The DAQ Debugger was fully incorporated into all processes in the iFDAQ during the 2016 run. It helped to reveal remaining software issues and significantly improved the stability of the system in comparison with the previous run. In the paper, we present the DAQ Debugger from several perspectives and discuss it in detail.
Keywords: DAQ Debugger, data acquisition system, FPGA, system signals, Qt framework
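The iFDAQ and its DAQ Debugger are C++/Qt; purely as an illustration of the signal-handling idea described above, here is a minimal Python analogue that dumps a report with all thread stacks when a (POSIX) signal arrives. The report file name and signal choice are assumptions.

```python
import datetime
import faulthandler
import signal
import sys
import threading
import traceback

REPORT_FILE = "daq_debug_report.txt"   # hypothetical report location

def write_report(signum, frame):
    """On a signal, append the stack traces of all threads to a report file and keep running.
    This mimics the report-on-problem idea; it is not the iFDAQ implementation."""
    with open(REPORT_FILE, "a") as f:
        f.write(f"\n=== report {datetime.datetime.now().isoformat()} (signal {signum}) ===\n")
        for thread_id, stack in sys._current_frames().items():
            f.write(f"-- thread {thread_id} --\n")
            f.write("".join(traceback.format_stack(stack)))

def install_debugger():
    global _report_fh
    _report_fh = open(REPORT_FILE, "a")            # keep the file open for faulthandler
    faulthandler.enable(file=_report_fh)           # low-level traceback on fatal signals (e.g. SIGSEGV)
    signal.signal(signal.SIGUSR1, write_report)    # user-triggered report: kill -USR1 <pid> (POSIX only)

if __name__ == "__main__":
    install_debugger()
    print("debugger installed; send SIGUSR1 to dump a report")
    threading.Event().wait(60)                     # stand-in for the real process loop
```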
Procedia PDF Downloads 284
4062 Valorization Cascade Approach of Fish By-Products towards a Zero-Waste Future: A Review
Authors: Joana Carvalho, Margarida Soares, André Ribeiro, Lucas Nascimento, Nádia Valério, Zlatina Genisheva
Abstract:
Following the exponential growth of the human population, a remarkable increase in the amount of fish waste has been produced worldwide. The fish processing industry generates a considerable amount of by-products, which represent a considerable environmental problem. Accordingly, the reuse and valorisation of these by-products is a key process for marine resource preservation. The significant volume of fish waste produced worldwide, along with its environmental impact, underscores the urgent need for the adoption of sustainable practices. The transformative potential of utilizing fish processing waste to create industrial value is gaining recognition. The substantial amounts of waste generated by the fish processing industry present both environmental challenges and economic inefficiencies. Different added-value products can be recovered by valorisation industries, whereas fishing companies can save the costs associated with the management of those wastes, with advantages not only in terms of economic income but also in terms of environmental impact. Fish processing by-products have numerous applications; the target portfolio of products includes fish oil, fish protein hydrolysates, bacteriocins, pigments, vitamins, collagen, and calcium-rich powder, targeting food products, additives, supplements, and nutraceuticals. This literature review focuses on the main valorisation routes for fish wastes and the different compounds with high commercial value obtained from fish by-products, together with their possible applications in different fields. Highlighting this potential in sustainable resource management strategies can play an important role in reshaping the fish processing industry, driving it towards a circular economy and consequently a more sustainable future.
Keywords: fish process industry, fish wastes, by-products, circular economy, sustainability
Procedia PDF Downloads 20
4061 Reuse of Wastewater After Pretreatment Under Teril and Sand in Bechar City
Authors: Sara Seddiki, Maazouzi Abdelhak
Abstract:
The main objective of this work is to follow the physicochemical and bacteriological evolution of wastewater from the town of Bechar subjected to purification by filtration through various local media, namely sand and Terrill, thereby reducing the nuisances borne by the receiving environment (Oued Bechar) and making this water source reusable in different areas. The study first characterized the urban wastewater of the Bechar wadi, which presents an environmental threat, allowing an estimation of the pollutant load: the chemical oxygen demand COD (145 mg/l) and the biological oxygen demand BOD5 (72 mg/l) revealed that these waters are poorly biodegradable (COD/BOD5 ratio = 0.62), have a fairly high conductivity (2.76 mS/cm) and high levels of mineral matter represented by chlorides and sulphates (390 and 596.1 mg/l, respectively), with a pH of 8.1. The characterization of the dune sand (Beni Abbes) shows that quartz (97%) is the dominant mineral. The granular analysis allowed us to determine certain parameters such as the uniformity coefficient (CU) and the equivalent diameter, and scanning electron microscope (SEM) observations and X-ray analysis were performed. The study of the filtered wastewater shows satisfactory and very encouraging treatment results, with complete elimination of total coliforms and streptococci and a good reduction of total aerobic germs in the sand and clay-sand filters. A good yield was reported for the sand-Terrill filter in terms of turbidity reduction. The recorded reduction rates for organic matter, in terms of biological and chemical oxygen demand, are of the order of 60%. The elimination of sulphates is 40% for the sand filter.
Keywords: urban wastewater, filtration, bacteriological and physicochemical parameters, sand, Terrill, Oued Bechar
Procedia PDF Downloads 97
4060 Linkage Disequilibrium and Haplotype Blocks Study from Two High-Density Panels and a Combined Panel in Nelore Beef Cattle
Authors: Priscila A. Bernardes, Marcos E. Buzanskas, Luciana C. A. Regitano, Ricardo V. Ventura, Danisio P. Munari
Abstract:
Genotype imputation has been used to reduce genomic selection costs. In order to increase haplotype detection accuracy in methods that consider linkage disequilibrium, another approach could be used, such as combining genotype data from different panels. Therefore, this study aimed to evaluate the linkage disequilibrium and haplotype blocks in two high-density panels before and after imputation to a combined panel in Nelore beef cattle. A total of 814 animals were genotyped with the Illumina BovineHD BeadChip (IHD), of which 93 animals (23 bulls and 70 progenies) were also genotyped with the Affymetrix Axiom Genome-Wide BOS 1 Array Plate (AHD). After quality control, 809 IHD animals (509,107 SNPs) and 93 AHD animals (427,875 SNPs) remained for the analyses. The combined genotype panel (CP) was constructed by merging both panels after quality control, resulting in 880,336 SNPs. Imputation analysis was conducted using the software FImpute v.2.2b. The reference (CP) and target (IHD) populations consisted of 23 bulls and 786 animals, respectively. The linkage disequilibrium and haplotype block studies were carried out for IHD, AHD, and the imputed CP. Two linkage disequilibrium measures were considered: the correlation coefficient between alleles from two loci (r²) and |D'|. Both measures were calculated using the software PLINK. The haplotype blocks were estimated using the software Haploview. The r² measurement presented a different decay compared with |D'|, whereas AHD and IHD had almost the same decay. For r², even with possible overestimation due to the sample size of AHD (93 animals), IHD presented higher values than AHD for shorter distances, but with increasing distance both panels presented similar values. The r² measurement is influenced by the minor allele frequencies of the pair of SNPs, which can cause the observed difference between the r² decay and the |D'| decay. As a sum of the combinations of the Illumina and Affymetrix panels, the CP presented a decay equivalent to the mean of these combinations. The numbers of haplotype blocks detected for IHD, AHD, and CP were 84,529, 63,967, and 140,336, respectively. IHD was composed of haplotype blocks with a mean of 137.70 ± 219.05 kb, AHD with a mean of 102.10 ± 155.47 kb, and CP with a mean of 107.10 ± 169.14 kb. The majority of the haplotype blocks of these three panels were composed of fewer than 10 SNPs, with only 3,882 (IHD), 193 (AHD) and 8,462 (CP) haplotype blocks composed of 10 SNPs or more. There was an increase in the number of chromosomes covered with long haplotypes when CP was used, as well as an increase in haplotype coverage for short chromosomes (23-29), which can contribute to studies that explore haplotype blocks. In general, using the CP could be an alternative to increase density and the number of haplotype blocks, increasing the probability of obtaining a marker close to a quantitative trait locus of interest.
Keywords: Bos taurus indicus, decay, genotype imputation, single nucleotide polymorphism
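For reference, the two LD measures named above can be computed from allele and haplotype frequencies as in the sketch below; the input frequencies are hypothetical, not values from the Nelore panels.

```python
def ld_measures(p_a, p_b, p_ab):
    """Pairwise linkage disequilibrium for two biallelic loci.
    p_a, p_b: frequencies of allele A at locus 1 and allele B at locus 2;
    p_ab: frequency of the A-B haplotype."""
    d = p_ab - p_a * p_b                                  # raw disequilibrium coefficient D
    r2 = d * d / (p_a * (1 - p_a) * p_b * (1 - p_b))      # depends on allele frequencies (MAF)
    if d >= 0:
        d_max = min(p_a * (1 - p_b), (1 - p_a) * p_b)
    else:
        d_max = min(p_a * p_b, (1 - p_a) * (1 - p_b))
    d_prime = abs(d) / d_max if d_max > 0 else 0.0        # |D'| is normalized to [0, 1]
    return r2, d_prime

if __name__ == "__main__":
    # Hypothetical frequencies for one SNP pair, illustration only.
    r2, d_prime = ld_measures(p_a=0.6, p_b=0.3, p_ab=0.25)
    print(f"r^2 = {r2:.3f}, |D'| = {d_prime:.3f}")
```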
Procedia PDF Downloads 281
4059 A Programming Assessment Software Artefact Enhanced with the Help of Learners
Authors: Romeo A. Botes, Imelda Smit
Abstract:
The demands of an ever-changing and complex higher education environment, along with the profile of modern learners, challenge current approaches to assessment and feedback. More learners enter the education system every year. The younger generation expects immediate feedback. At the same time, feedback should be meaningful. The assessment of practical activities in programming poses a particular problem, since both lecturers and learners in the information and computer science discipline acknowledge that paper-based assessment for programming subjects lacks meaningful real-life testing. At the same time, feedback lacks promptness, consistency, comprehensiveness and individualisation. Most of these aspects may be addressed by modern, technology-assisted assessment. The focus of this paper is the continuous development of an artefact that is used to assist the lecturer in the assessment of, and feedback on, practical programming activities in a senior database programming class. The artefact was developed using three Design Science Research cycles. The first implementation allowed one programming activity submission per assessment intervention. This pilot provided valuable insight into the obstacles regarding the implementation of this type of assessment tool. A second implementation improved the initial version to allow multiple programming activity submissions per assessment. The focus of this version is on providing scaffolded feedback to the learner, allowing improvement with each subsequent submission. It also has a built-in capability to provide the lecturer with information regarding the key problem areas of each assessment intervention.
Keywords: programming, computer-aided assessment, technology-assisted assessment, programming assessment software, design science research, mixed-method
Procedia PDF Downloads 296
4058 Greywater Treatment Using Activated Biochar Produced from Agricultural Waste
Authors: Pascal Mwenge, Tumisang Seodigeng
Abstract:
The increase in urbanisation in South Africa has led to an increase in water demand and a decline in freshwater supply. Despite this, poor water usage is still a major challenge in South Africa; for instance, freshwater is still used for non-drinking applications. The freshwater shortage can be alleviated by using other sources of water for non-potable purposes, such as greywater treated with activated biochar produced from agricultural waste. The success of activated biochar produced from agricultural waste in treating greywater can be both economically and environmentally beneficial. Greywater treated with activated biochar produced from agricultural waste is considered a cost-effective wastewater treatment. This work was aimed at determining the ability of activated biochar to remove Total Suspended Solids (TSS), ammonium (NH4-N), nitrate (NO3-N), and Chemical Oxygen Demand (COD) from greywater. The experiments were carried out in 800 ml laboratory plastic cylinders used as filter columns. A 2.5 cm layer of gravel was placed at the bottom and top of each column to sandwich the activated biochar material. Activated biochar (200 g and 400 g) was loaded into a column and used as a filter medium for greywater. Samples were collected after a week and sent for analysis. Four types of greywater were treated: kitchen, floor-cleaning, shower and laundry water. The findings showed 95% removal of TSS, 76% of NO3-N and 63% of COD in kitchen greywater, and 85% removal of NH4-N in bathroom greywater, as the highest removal efficiencies for the studied pollutants. The results showed that activated biochar produced from agricultural waste removes a certain amount of pollutants from greywater. The results also indicated the ability of activated biochar to treat greywater for onsite non-potable reuse purposes.
Keywords: activated biochar produced from agriculture waste, ammonium, NH₄-N, chemical oxygen demand, COD, greywater, nitrate, NO₃-N, total suspended solids, TSS
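The removal percentages above follow from the usual influent/effluent comparison; a minimal sketch (the concentrations used are hypothetical, chosen only to reproduce the order of magnitude of the reported efficiencies):

```python
def removal_efficiency(c_in, c_out):
    """Percent removal of a pollutant across the biochar filter column."""
    return 100.0 * (c_in - c_out) / c_in

if __name__ == "__main__":
    # Hypothetical influent/effluent concentrations in mg/L, not the measured data.
    for name, c_in, c_out in [("TSS", 200.0, 10.0), ("COD", 450.0, 166.5)]:
        print(f"{name}: {removal_efficiency(c_in, c_out):.1f}% removed")
```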
Procedia PDF Downloads 204
4057 Reconstruction Spectral Reflectance Cube Based on Artificial Neural Network for Multispectral Imaging System
Authors: Iwan Cony Setiadi, Aulia M. T. Nasution
Abstract:
The multispectral imaging (MSI) technique has been used for skin analysis, especially for distant mapping of in-vivo skin chromophores by analyzing spectral data at each reflected image pixel. For ergonomic purposes, our multispectral imaging system is decomposed into two parts: a light source compartment based on LEDs with 11 different wavelengths and a monochromatic 8-bit CCD camera with a C-mount objective lens. Software based on a MATLAB GUI to control the system was also developed. Our system provides 11 monoband images and is coupled with software reconstructing hyperspectral cubes from these multispectral images. In this paper, we propose a new method to build a hyperspectral reflectance cube based on an artificial neural network algorithm. After preliminary corrections, a neural network is trained using the 32 natural colors of the X-Rite ColorChecker Passport. The learning procedure involves the acquisition of reference spectra by a spectrophotometer. This neural network is then used to retrieve a megapixel multispectral cube between 380 and 880 nm with a 5 nm resolution from a low-spectral-resolution multispectral acquisition. As hyperspectral cubes contain spectra for each pixel, a comparison should be made between the theoretical values from the spectrophotometer and the reconstructed spectrum. To evaluate the performance of the reconstruction, we used the Goodness-of-Fit Coefficient (GFC) and the Root Mean Squared Error (RMSE). To validate the reconstruction, a set of 8 colour patches reconstructed by our MSI system and the set recorded by the spectrophotometer were compared. The average GFC was 0.9990 (standard deviation = 0.0010) and the average RMSE was 0.2167 (standard deviation = 0.064).
Keywords: multispectral imaging, reflectance cube, spectral reconstruction, artificial neural network
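A sketch of the two reconstruction metrics used above, GFC and RMSE, applied to a synthetic spectrum sampled at 380-880 nm in 5 nm steps; the spectra themselves are made up for illustration and do not come from the system described.

```python
import numpy as np

def gfc(measured, reconstructed):
    """Goodness-of-Fit Coefficient between a measured and a reconstructed spectrum:
    |<R_m, R_r>| / (||R_m|| * ||R_r||); 1.0 means identical shape."""
    measured, reconstructed = np.asarray(measured), np.asarray(reconstructed)
    num = abs(np.dot(measured, reconstructed))
    return num / (np.linalg.norm(measured) * np.linalg.norm(reconstructed))

def rmse(measured, reconstructed):
    """Root Mean Squared Error between the two spectra."""
    diff = np.asarray(measured) - np.asarray(reconstructed)
    return float(np.sqrt(np.mean(diff ** 2)))

if __name__ == "__main__":
    wl = np.arange(380, 885, 5)                          # 380-880 nm at 5 nm, as in the abstract
    truth = 0.4 + 0.3 * np.sin(wl / 90.0)                # synthetic reflectance, illustration only
    recon = truth + np.random.default_rng(1).normal(0, 0.01, wl.size)
    print(f"GFC  = {gfc(truth, recon):.4f}")
    print(f"RMSE = {rmse(truth, recon):.4f}")
```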
Procedia PDF Downloads 323
4056 The Role of Information Technology in Supply Chain Management
Authors: V. Jagadeesh, K. Venkata Subbaiah, P. Govinda Rao
Abstract:
This paper explains the significance of information technology tools and software packages in supply chain management (SCM) for managing the entire supply chain. The aim is to manage material, financial and information flows effectively and efficiently with the aid of information technology tools and packages, in order to deliver the right quantity and quality of goods at the right time, using the right methods and technology. Information technology plays a vital role in streamlining sales forecasting, demand planning, inventory control and transportation in supply networks, and finally in production planning and scheduling. It achieves these objectives by streamlining business processes and integrating them within the enterprise and its extended enterprise. SCM starts with the customer and involves a sequence of activities from customer, retailer, distributor, manufacturer and supplier within the supply chain framework. It is the process of integrating demand planning, supply network planning, and production planning and control. Forecasting indicates the direction for planning raw materials in order to meet the production planning requirements. Inventory control and transportation planning allocate the optimal or economic order quantity, utilizing the shortest possible routes to deliver the goods to the customer. Production planning and control utilize the optimal resource mix in order to meet the capacity requirements planning. The above operations can be achieved by using appropriate information technology tools and software packages for supply chain management.
Keywords: supply chain management, information technology, business process, extended enterprise
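As a small example of the kind of calculation such packages automate, the classical economic order quantity mentioned above can be computed as follows; the demand and cost figures are hypothetical.

```python
import math

def economic_order_quantity(annual_demand, order_cost, holding_cost):
    """Classical EOQ: the order size that balances ordering and holding costs,
    EOQ = sqrt(2 * D * S / H)."""
    return math.sqrt(2.0 * annual_demand * order_cost / holding_cost)

if __name__ == "__main__":
    # Hypothetical figures: 12,000 units/year demand, $75 per order, $3 holding cost/unit/year.
    q = economic_order_quantity(12000, 75.0, 3.0)
    print(f"EOQ = {q:.0f} units per order")
```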
Procedia PDF Downloads 378
4055 Effective Dose and Size Specific Dose Estimation with and without Tube Current Modulation for Thoracic Computed Tomography Examinations: A Phantom Study
Authors: S. Gharbi, S. Labidi, M. Mars, M. Chelli, F. Ladeb
Abstract:
The purpose of this study is to reduce the radiation dose for chest CT examinations by including tube current modulation (TCM) in a standard CT protocol. A scan of an anthropomorphic male Alderson phantom was performed on a 128-slice scanner. The effective dose (ED) in both scans, with and without mAs modulation, was estimated by multiplying the dose length product (DLP) by a conversion factor. Results were compared to those obtained with the CT-Expo software. The size-specific dose estimate (SSDE) values were obtained by multiplying the volume CT dose index (CTDIvol) by a conversion factor related to the phantom's effective diameter. Objective assessment of image quality was performed with signal-to-noise ratio (SNR) measurements in the phantom. SPSS software was used for data analysis. The results showed that, with CARE Dose 4D included, ED was lowered by 48.35% and 51.51% using DLP and CT-Expo, respectively. In addition, ED ranges between 7.01 mSv and 6.6 mSv for the standard protocol, while it ranges between 3.62 mSv and 3.2 mSv with TCM. Similar results were found for SSDE; the dose was higher without TCM (16.25 mGy) and was lower by 48.8% with TCM. The calculated SNR values were significantly different (p=0.03<0.05). The highest value was measured on images acquired with TCM and reconstructed with filtered back projection (FBP). In conclusion, this study proves the potential of the TCM technique for SSDE and ED reduction while conserving image quality with a high diagnostic reference level for thoracic CT examinations.
Keywords: anthropomorphic phantom, computed tomography, CT-expo, radiation dose
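A sketch of the two dose conversions described above; the DLP and CTDIw values are taken from the abstract, but the k factor and size conversion factor below are illustrative assumptions, so the outputs will not exactly reproduce the reported 35.04 mSv and 16.25 mGy.

```python
def effective_dose(dlp_mgy_cm, k_factor):
    """Effective dose [mSv] = DLP [mGy*cm] * region-specific conversion factor k [mSv/(mGy*cm)]."""
    return dlp_mgy_cm * k_factor

def ssde(ctdi_vol_mgy, size_factor):
    """Size-specific dose estimate [mGy] = CTDIvol [mGy] * conversion factor for the
    patient's (or phantom's) effective diameter."""
    return ctdi_vol_mgy * size_factor

if __name__ == "__main__":
    # DLP and CTDIw from the abstract; k and size factor are assumed for illustration only.
    print(f"ED   ~ {effective_dose(2320.0, 0.014):.1f} mSv (chest k assumed to be 0.014)")
    print(f"SSDE ~ {ssde(9.67, 1.55):.1f} mGy (size conversion factor assumed to be 1.55)")
```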
Procedia PDF Downloads 222
4054 Development of a Matlab® Program for the Bi-Dimensional Truss Analysis Using the Stiffness Matrix Method
Authors: Angel G. De Leon Hernandez
Abstract:
A structure is defined as a physical system or, in certain cases, an arrangement of connected elements capable of bearing certain loads. Structures are present in every part of daily life, e.g., in the design of buildings, vehicles and mechanisms. The main goal of a structural designer is to develop a secure, aesthetic and maintainable system, considering the constraints imposed in every case. With the advances in technology during the last decades, the capability of solving engineering problems has increased enormously. Nowadays, computers play a critical role in structural analysis; unfortunately, for university students the vast majority of these software packages are inaccessible due to their high complexity and cost, even when the software manufacturers offer student versions. This is exactly the motivation for developing a more accessible and easy-to-use computing tool. The program is designed as a tool for university students enrolled in courses related to structural analysis and design, as a complementary instrument to achieve a better understanding of this area and to avoid tedious calculations. The program can also be useful for graduate engineers in the field of structural design and analysis. A graphical user interface is included in the program to make it even simpler to operate and to understand the information requested and the results obtained. The present document includes the theoretical basis on which the program relies to solve the structural analysis, the logical path followed in developing the program, the theoretical results, a discussion of the results, and their validation.
Keywords: stiffness matrix method, structural analysis, Matlab® applications, programming
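For readers unfamiliar with the stiffness matrix method the program is built on, the sketch below assembles and solves a small 2-D truss. It is written in Python rather than MATLAB, is not the program described above, and its geometry, section and load are made up for illustration.

```python
import numpy as np

def bar_stiffness(E, A, xi, yi, xj, yj):
    """2-D truss element stiffness matrix in global coordinates (DOF order: xi, yi, xj, yj)."""
    L = np.hypot(xj - xi, yj - yi)
    c, s = (xj - xi) / L, (yj - yi) / L
    return (E * A / L) * np.array([[ c*c,  c*s, -c*c, -c*s],
                                   [ c*s,  s*s, -c*s, -s*s],
                                   [-c*c, -c*s,  c*c,  c*s],
                                   [-c*s, -s*s,  c*s,  s*s]])

def solve_truss(nodes, elements, E, A, loads, fixed_dofs):
    """Assemble the global stiffness matrix and solve K_ff * u_f = F_f for the free DOFs."""
    ndof = 2 * len(nodes)
    K = np.zeros((ndof, ndof))
    for i, j in elements:
        dofs = [2*i, 2*i + 1, 2*j, 2*j + 1]
        K[np.ix_(dofs, dofs)] += bar_stiffness(E, A, *nodes[i], *nodes[j])
    free = [d for d in range(ndof) if d not in fixed_dofs]
    u = np.zeros(ndof)
    u[free] = np.linalg.solve(K[np.ix_(free, free)], loads[free])
    return u

if __name__ == "__main__":
    # Three-bar example (units: N, m, Pa); values chosen only for illustration.
    nodes = [(0.0, 0.0), (2.0, 0.0), (1.0, 1.5)]
    elements = [(0, 1), (0, 2), (1, 2)]
    loads = np.zeros(6); loads[5] = -10e3            # 10 kN downward at node 2
    u = solve_truss(nodes, elements, E=200e9, A=5e-4,
                    loads=loads, fixed_dofs={0, 1, 2, 3})
    print("node 2 displacement [m]:", u[4], u[5])
```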
Procedia PDF Downloads 123
4053 Spatial Variation of Nitrogen, Phosphorus and Potassium Contents of Tomato (Solanum lycopersicum L.) Plants Grown in Greenhouses (Springs) in Elmali-Antalya Region
Authors: Namik Kemal Sonmez, Sahriye Sonmez, Hasan Rasit Turkkan, Hatice Tuba Selcuk
Abstract:
In this study, the spatial variation of plant and soil nutrient contents of tomato plants grown in greenhouses was investigated in the Elmali region of Antalya. For this purpose, a total of 19 sampling points were determined. The coordinates of each sampling point were recorded using a hand-held GPS device and transferred onto satellite data in GIS. Soil samples were collected from two different depths, 0-20 and 20-40 cm, and leaves were taken from different tomato greenhouses. The soil and plant samples were analyzed for N, P and K. Attribute tables were then created with the analysis results using GIS. The data were analyzed, and the semivariogram models and parameters (nugget, sill and range) of the variables were determined using GIS software. Kriged maps of the variables were created using the nugget, sill and range values with the geostatistical extension of the ArcGIS software. The kriged maps of the N, P and K contents of the plant and soil samples showed a patchy or relatively smooth distribution in the study area. As a result, the N content of the plants was sufficient in approximately 66% of the tomato production area. It was determined that the P and K contents were sufficient in 70% and 80% of the area, respectively. On the other hand, soil total K contents were generally adequate, and available N and P contents were found to be more than sufficient at both depths (0-20 and 20-40 cm) in 90% of the area.
Keywords: Elmali, nutrients, springs greenhouses, spatial variation, tomato
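A sketch of the spherical semivariogram model behind the nugget/sill/range parameters mentioned above; the parameter values are hypothetical, not the fitted ones from this study.

```python
def spherical_semivariogram(h, nugget, sill, range_a):
    """Spherical model used in kriging: gamma(h) rises from the nugget toward the sill
    and stays flat beyond the range (range_a)."""
    if h == 0:
        return 0.0
    if h >= range_a:
        return sill
    x = h / range_a
    return nugget + (sill - nugget) * (1.5 * x - 0.5 * x**3)

if __name__ == "__main__":
    # Hypothetical parameters for a leaf N semivariogram; illustration only.
    for h in (0, 50, 100, 200, 400):
        g = spherical_semivariogram(h, nugget=0.05, sill=0.30, range_a=300.0)
        print(f"gamma({h:3d} m) = {g:.3f}")
```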
Procedia PDF Downloads 243
4052 Navigating Construction Project Outcomes: Synergy Through the Evolution of Digital Innovation and Strategic Management
Authors: Derrick Mirindi, Frederic Mirindi, Oluwakemi Oshineye
Abstract:
The ongoing high rate of construction project failures worldwide is often blamed on the difficulties of managing stakeholders. This highlights the crucial role of strategic management (SM) in achieving project success. This study investigates how integrating digital tools into the SM framework can effectively address stakeholder-related challenges. The work specifically focuses on the impact of evolving digital tools, such as Project Management Software (PMS) (e.g., Basecamp and Wrike), Building Information Modeling (BIM) (e.g., Tekla BIMsight and Autodesk Navisworks), Virtual and Augmented Reality (VR/AR) (e.g., Microsoft HoloLens), drones and remote monitoring, and social media and web-based platforms, in improving stakeholder engagement and project outcomes. Through the existing literature, with examples of failed projects, the study highlights how evolving digital tools serve as facilitators within the strategic management process. These tools offer benefits such as real-time data access, enhanced visualization, and more efficient workflows to mitigate stakeholder challenges in construction projects. The findings indicate that integrating digital tools with SM principles effectively addresses stakeholder challenges, resulting in improved project outcomes and stakeholder satisfaction. The research advocates for a combined approach that embraces both strategic management and digital innovation to navigate the complex stakeholder landscape in construction projects.
Keywords: strategic management, digital tools, virtual and augmented reality, stakeholder management, building information modeling, project management software
Procedia PDF Downloads 84
4051 The Impact of Regulatory Changes on the Development of Mobile Medical Apps
Abstract:
Mobile applications are being used to perform a wide variety of tasks in day-to-day life, ranging from checking email to controlling home heating. Application developers have recognized the potential to transform a smart device, i.e. a mobile phone or a tablet, into a medical device by using a mobile medical application. When initially conceived, these mobile medical applications performed basic functions, e.g. a BMI calculator or access to reference material; however, increasing complexity now offers clinicians and patients a range of functionality. As this complexity and functionality increases, so too does the potential risk associated with using such an application. Examples include applications that provide the ability to inflate and deflate blood pressure cuffs, as well as applications that use patient-specific parameters and calculate dosage or create a dosage plan for radiation therapy. If an unapproved mobile medical application is marketed by a medical device organization, then it faces significant penalties, such as receiving an FDA warning letter to cease the prohibited activity, fines and the possibility of a criminal conviction. Regulatory bodies have finalized guidance intended to help mobile application developers establish whether their applications are subject to regulatory scrutiny. However, regulatory controls appear contradictory to the approaches taken by mobile application developers, who generally work with short development cycles and very little documentation, and as such there is the potential to stifle further improvement due to these regulations. The research presented in this paper details how, by adopting development techniques such as agile software development, mobile medical application developers can meet regulatory requirements whilst still fostering innovation.
Keywords: agile, applications, FDA, medical, mobile, regulations, software engineering, standards
Procedia PDF Downloads 361
4050 Modeling Palm Oil Quality During the Ripening Process of Fresh Fruits
Authors: Afshin Keshvadi, Johari Endan, Haniff Harun, Desa Ahmad, Farah Saleena
Abstract:
Experiments were conducted to develop a model for analyzing the ripening process of oil palm fresh fruits in relation to the oil yield and oil quality of the palm oil produced. This research was carried out on 8-year-old Tenera (Dura × Pisifera) palms planted in 2003 at the Malaysian Palm Oil Board Research Station. Fresh fruit bunches were harvested from designated palms from January to May 2010. The bunches were divided into three regions (top, middle and bottom), and fruits from the outer and inner layers were randomly sampled for analysis at 8, 12, 16 and 20 weeks after anthesis to establish relationships between maturity and oil development in the mesocarp and kernel. Computations on data related to ripening time, oil content and oil quality were performed using several computer software programs (MSTAT-C, SAS and Microsoft Excel). Nine nonlinear mathematical models were fitted to the collected data using MATLAB software. The results showed that the mean mesocarp oil percentage increased from 1.24% at 8 weeks after anthesis to 29.6% at 20 weeks after anthesis. Fruits from the top part of the bunch had the highest mesocarp oil content, at 10.09%. The lowest kernel oil percentage, 0.03%, was recorded at 12 weeks after anthesis. Palmitic acid and oleic acid comprised more than 73% of the total mesocarp fatty acids at 8 weeks after anthesis, increasing to more than 80% at fruit maturity at 20 weeks. The logistic model, with the highest R² and the lowest root mean square error, was found to be the best-fit model.
Keywords: oil palm, oil yield, ripening process, anthesis, fatty acids, modeling
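A sketch of fitting the logistic model referred to above; only the 8- and 20-week mesocarp oil values come from the abstract, while the intermediate points, initial guesses and SciPy-based fitting routine are assumptions for illustration.

```python
import numpy as np
from scipy.optimize import curve_fit

def logistic(t, a, b, k):
    """Logistic growth curve: y(t) = a / (1 + b*exp(-k*t))."""
    return a / (1.0 + b * np.exp(-k * t))

if __name__ == "__main__":
    # Weeks after anthesis vs. mesocarp oil content [%]; the 12- and 16-week values
    # are made up, since the abstract reports only the 8- and 20-week figures.
    weeks = np.array([8.0, 12.0, 16.0, 20.0])
    oil = np.array([1.24, 8.0, 22.0, 29.6])
    popt, _ = curve_fit(logistic, weeks, oil, p0=[30.0, 100.0, 0.5], maxfev=10000)
    a, b, k = popt
    print(f"fitted: a={a:.1f}%, b={b:.1f}, k={k:.2f} per week")
    print("predicted oil content at 18 weeks:", round(logistic(18.0, *popt), 1), "%")
```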
Procedia PDF Downloads 316
4049 Development of Methods for Plastic Injection Mold Weight Reduction
Authors: Bita Mohajernia, R. J. Urbanic
Abstract:
Mold making techniques have focused on meeting the customer's functional and process requirements; however, today, molds are increasing in size and sophistication and are difficult to manufacture, transport, and set up due to their size and mass. Presently, mold weight saving techniques focus on pockets to reduce the mass of the mold, but the overall size is still large, which introduces costs related to stock material purchase, processing time for process planning, machining and validation, and excess waste material. Reducing the overall size of the mold is desirable for many reasons, but the functional requirements, tool life, and durability cannot be compromised in the process. It is proposed to use finite element analysis simulation tools to model the forces and pressures in order to determine where material can be removed. The potential results of this project will reduce manufacturing costs. In this study, a lightweight structure is defined by an optimal distribution of material to carry external loads. The optimization objective of this research is to determine methods that provide the optimum layout for the mold structure. The topology optimization method is utilized to improve structural stiffness while decreasing the weight, using the OptiStruct software. The optimized CAD model is compared with the original geometry of the mold from the NX software. The optimization results show an 8% weight reduction, while the actual performance of the optimized structure, validated by physical testing, is similar to that of the original structure.
Keywords: finite element analysis, plastic injection molding, topology optimization, weight reduction
Procedia PDF Downloads 290
4048 Digital Platform for Psychological Assessment Supported by Sensors and Efficiency Algorithms
Authors: Francisco M. Silva
Abstract:
Technology is evolving, creating an impact on our everyday lives and on the telehealth industry. Telehealth encapsulates the provision of healthcare services and information via a technological approach. There are several benefits of using web-based methods to provide healthcare help. Nonetheless, few health and psychological help approaches combine this method with wearable sensors. This paper aims to create an online platform for users to receive self-care help and information using wearable sensors. In addition, researchers developing a similar project obtain a solid foundation as a reference. This study provides descriptions and analyses of the software and hardware architecture. It exhibits and explains a dynamic and efficient heart rate algorithm that continuously calculates the desired sensor values, and presents diagrams that illustrate the website deployment process and how the web server handles the sensors' data. The goal is to create a working project using Arduino-compatible hardware. Heart rate sensors send their data values to an online platform. A microcontroller board uses an algorithm to calculate the sensor's heart rate values and outputs them to a web server. The platform visualizes the sensor data, summarizes it in a report, and creates alerts for the user. Results showed a solid project structure and communication between the hardware and software. The web server displays the conveyed heart rate sensor data on the online platform, presenting observations and evaluations.
Keywords: Arduino, heart rate BPM, microcontroller board, telehealth, wearable sensors, web-based healthcare
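A toy version of a continuously calculated heart-rate value from beat timestamps, purely to illustrate the kind of algorithm described above; it is not the project's Arduino firmware, and the window size and beat times are made up.

```python
from collections import deque

class HeartRateMonitor:
    """Rolling heart-rate estimate from beat timestamps (in seconds)."""

    def __init__(self, window=8):
        self.beats = deque(maxlen=window)   # keep only the most recent beat times

    def add_beat(self, timestamp_s):
        self.beats.append(timestamp_s)

    def bpm(self):
        if len(self.beats) < 2:
            return None
        # Mean inter-beat interval over the window, converted to beats per minute.
        times = list(self.beats)
        intervals = [b - a for a, b in zip(times, times[1:])]
        return 60.0 / (sum(intervals) / len(intervals))

if __name__ == "__main__":
    hrm = HeartRateMonitor()
    for t in [0.0, 0.82, 1.65, 2.46, 3.30]:     # simulated beat times, roughly 73 bpm
        hrm.add_beat(t)
    print(f"heart rate ~ {hrm.bpm():.0f} bpm")
```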
Procedia PDF Downloads 127
4047 A Review on Cloud Computing and Internet of Things
Authors: Sahar S. Tabrizi, Dogan Ibrahim
Abstract:
Cloud computing is a convenient model for on-demand networks that uses shared pools of virtual, configurable computing resources, such as servers, networks, storage devices, applications, etc. The cloud serves as an environment for companies and organizations to use infrastructure resources without making any purchases, and they can access such resources wherever and whenever they need. Cloud computing is useful for overcoming a number of problems in various Information Technology (IT) domains such as Geographical Information Systems (GIS), scientific research, e-governance systems, decision support systems, ERP, web application development, mobile technology, etc. Companies can use cloud computing services to store large amounts of data that can be accessed from anywhere on Earth and at any time. Such services are rented by the client companies, where the actual rent depends upon the amount of data stored on the cloud and the amount of processing power used in a given time period. The resources offered by cloud service companies are flexible in the sense that the user companies can increase or decrease their storage or processing power requirements at any time, thus minimizing the overall rental cost of the service they receive. In addition, cloud computing service providers offer fast processors and application software that can be shared by their clients. This is especially important for small companies with limited budgets which cannot afford to purchase their own expensive hardware and software. This paper is an overview of cloud computing, giving its types, principles, advantages, and disadvantages. In addition, the paper gives some example engineering applications of cloud computing and makes suggestions for possible future applications in the field of engineering.
Keywords: cloud computing, cloud systems, cloud services, IaaS, PaaS, SaaS
Procedia PDF Downloads 234
4046 Fire Protection Performance of Different Industrial Intumescent Coatings for Steel Beams
Authors: Serkan Kocapinar, Gülay Altay
Abstract:
This study investigates the efficiency of two industrial intumescent coatings, which hold different types of certification, in terms of fire protection performance for steel beams under ISO 834 fire exposure for 2 hours. A better understanding of industrial intumescent coatings, which assure structural integrity and prevent the collapse of steel structures, is needed to minimize fire risks in steel structures. Two fire-protective intumescent coatings, Product A and Product B, were compared and examined as thermal barriers between the steel components and the fire. Product A is tested according to EN 13381-8 and BS 476-20,22 and is certified to ISO standards. Product B is tested according to EN 13381-8 and ASTM UL-94 and is certified by the Turkish Standards Institute (TSE). Generally, fire tests to evaluate the fire performance of steel components are done numerically with commercial software instead of experiments, due to the high cost of an ISO 834 furnace fire test. Hence, there is a gap in the literature regarding the comparison of differently certified intumescent coatings in an ISO 834 furnace fire experiment lasting 2 hours. The experiment was carried out using two 1-meter UPN 200 steel sections, each coated with a different industrial intumescent coating. A furnace operated by the Turkish Standards Institute (TSE) was used for the experiment. The temperatures of the protected steels and of the furnace interior were measured over the two hours with the help of 24 thermocouples applied before the intumescent coatings, giving temperature-time curves of the steel components for assessing the performance of the coatings. FIN EC software was used to determine the critical temperatures of the protected steels, and ABAQUS was used for thermal analysis to obtain theoretical results for comparison with the experimental results.
Keywords: fire safety, structural steel, ABAQUS, thermal analysis, FIN EC, intumescent coatings
Procedia PDF Downloads 103
4045 Estimation of Effective Radiation Dose Following Computed Tomography Urography at Aminu Kano Teaching Hospital, Kano Nigeria
Authors: Idris Garba, Aisha Rabiu Abdullahi, Mansur Yahuza, Akintade Dare
Abstract:
Background: CT urography (CTU) is an efficient radiological examination for the evaluation of urinary system disorders. However, patients are exposed to a significant radiation dose, which is associated with increased cancer risk. Objectives: To determine the computed tomography dose index following CTU and to evaluate organ equivalent doses. Materials and Methods: A prospective cohort study was carried out at a tertiary institution located in Kano, northwestern Nigeria. Ethical clearance was sought and obtained from the research ethics board of the institution. Demographic data, scan parameters and CT radiation dose data were obtained from patients who had the CTU procedure. Effective dose, organ equivalent doses, and cancer risks were estimated using SPSS statistical software version 16 and CT dose calculator software. Results: A total of 56 patients were included in the study, consisting of 29 males and 27 females. The most common indication for CTU examination was found to be renal cysts, seen commonly among young adults (15-44 yrs). The CT radiation dose values in DLP, CTDI and effective dose for CTU were 2320 mGy cm, CTDIw 9.67 mGy and 35.04 mSv, respectively. The probability of cancer risk was estimated to be 600 per million CTU examinations. Conclusion: In this study, the radiation dose for CTU is considered significantly high, with an increased probability of cancer risk. The wide variation between patient doses suggests that optimization has not yet been achieved. Patient radiation dose estimates should be taken into consideration when imaging protocols are established for CT urography.
Keywords: CT urography, cancer risks, effective dose, radiation exposure
Procedia PDF Downloads 345