Search results for: parallel processing
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 4760

1520 Value Chain Analysis of Melon “Egusi” (Citrullus lanatus Thunb. Mansf) among Rural Farm Enterprises in South East, Nigeria

Authors: Chigozirim Onwusiribe, Jude Mbanasor

Abstract:

Egusi melon (Citrullus lanatus Thunb. Mansf.) is a very important oil seed that serves as a major ingredient in the diet of most households in Nigeria. Egusi melon is very nutritious and very important in meeting the food security needs of Nigerians. It is cultivated in most farm enterprises in South East Nigeria, but the profitability of its value chain needs to be investigated. This study analyzed the profitability of the Egusi melon value chain. Specifically, the study developed a value chain map for Egusi melon, analyzed the profitability of each stage of the value chain, and analyzed the determinants of profitability at each stage of the value chain. A multi-stage sampling technique was used to select 125 farm enterprises with similar capacity and characteristics. Questionnaires and interviews were used to elicit the required data, while descriptive statistics, the Food and Agriculture Organization Value Chain Analysis Tool, profitability ratios, and multiple regression analysis were used for the data analysis. One of the findings showed that the stages of the Egusi melon value chain are very profitable. Based on the findings, we recommend the provision of grants by government and donor agencies to the farm enterprises through their cooperative societies; this will provide the necessary funds for the local fabrication of value-addition and processing equipment to suit the unique value-addition needs not met by imported equipment.

Keywords: value, chain, melon, farm, enterprises

Procedia PDF Downloads 135
1519 Integrating of Multi-Criteria Decision Making and Spatial Data Warehouse in Geographic Information System

Authors: Zohra Mekranfar, Ahmed Saidi, Abdellah Mebrek

Abstract:

This work aims to develop multi-criteria decision making (MCDM) and spatial data warehouse (SDW) methods, which will be integrated into a GIS according to a ‘GIS dominant’ approach. The GIS operating tools will be used to operate the SDW. MCDM methods can provide many solutions to a set of problems with various and multiple criteria. When the problem is complex and integrates a spatial dimension, it makes sense to combine the MCDM process with other approaches such as data mining and ascending analyses. We present in this paper an experiment showing a geo-decisional methodology of SDW construction. On-line analytical processing (OLAP) technology, which combines basic multidimensional analysis with the concepts of data mining, provides powerful tools to highlight inductions and information that are not obvious with traditional tools. However, these OLAP tools become more complex in the presence of the spatial dimension. The integration of OLAP with a GIS is the future geographic and spatial information solution. GIS offers advanced functions for the acquisition, storage, analysis, and display of geographic information. However, its effectiveness for complex spatial analysis is questionable due to its determinism and decisional rigor. A prerequisite for the implementation of any analysis or exploration of spatial data is the construction and structuring of a spatial data warehouse (SDW). This SDW must be easily usable by the GIS and by the tools offered by an OLAP system.

Keywords: data warehouse, GIS, MCDM, SOLAP

Procedia PDF Downloads 178
1518 ATR-IR Study of the Mechanism of Aluminum Chloride Induced Alzheimer Disease - Curative and Protective Effect of Lepidium sativum Water Extract on Hippocampus Rats Brain Tissue

Authors: Maha J. Balgoon, Gehan A. Raouf, Safaa Y. Qusti, Soad S. Ali

Abstract:

The main cause of Alzheimer's disease (AD) is believed to be the accumulation of free radicals owing to oxidative stress (OS) in brain tissue. The mechanism of the neurotoxicity of aluminum chloride (AlCl3)-induced AD in hippocampal Albino Wistar rat brain tissue, and the curative and protective effects of a Lepidium sativum (LS) water extract, were assessed after 8 weeks by attenuated total reflection infrared spectroscopy (ATR-IR) and histologically by light microscopy. The ATR-IR results revealed that the membrane phospholipids undergo free radical attacks, mediated by AlCl3, which primarily affect the polyunsaturated fatty acids, as indicated by the increase of the olefinic -C=CH sub-band area around 3012 cm-1 obtained from curve fitting analysis. The narrowing in the half band width (HBW) of the sνCH2 sub-band around 2852 cm-1 due to Al intoxication indicates the presence of trans-form fatty acids rather than the gauche rotamer. The degradation of the hydrocarbon chain to shorter chain lengths, the increase in membrane fluidity and disorder, and the decrease in lipid polarity in the AlCl3 group were indicated by the detected changes in certain calculated area ratios compared to the control. Administration of LS greatly improved these parameters compared to the AlCl3 group. Al influences Aβ aggregation and plaque formation, which in turn interferes with and disrupts the membrane structure. The results also showed a marked increase in the β-parallel and antiparallel structures that characterize Aβ formation in Al-induced AD hippocampal brain tissue, indicated by the detected increase in both amide I sub-bands around 1674 and 1692 cm-1. This drastic increase in Aβ formation was greatly reduced in the curative and protective groups compared to the AlCl3 group and approached the control values. These results were also supported by light microscopy. The AlCl3 group showed marked degenerative changes in hippocampal neurons; most cells appeared small, shrunken, and deformed. Interestingly, the administration of LS in the curative and protective groups markedly decreased the number of degenerated cells compared to the non-treated group, and the intensity of Congo red-stained cells decreased. Hippocampal neurons looked more or less similar to those of the control. This study showed a promising therapeutic effect of Lepidium sativum (LS) on the AD rat model, counteracting the signs of oxidative stress on membrane lipids and the protein misfolding.

Keywords: aluminum chloride, Alzheimer's disease, ATR-IR, Lepidium sativum

Procedia PDF Downloads 367
1517 Computer-Aided Classification of Liver Lesions Using Contrasting Features Difference

Authors: Hussein Alahmer, Amr Ahmed

Abstract:

Liver cancer is one of the common diseases that cause death. Early detection is important for diagnosis and for reducing the incidence of death. Improvements in medical imaging and image processing techniques have significantly enhanced the interpretation of medical images. Computer-Aided Diagnosis (CAD) systems based on these techniques play a vital role in the early detection of liver disease and hence reduce the liver cancer death rate. This paper presents an automated CAD system consisting of three stages: firstly, automatic liver segmentation and lesion detection; secondly, feature extraction; and finally, classification of liver lesions into benign and malignant using the novel contrasting feature-difference approach. Several types of intensity and texture features are extracted from both the lesion area and its surrounding normal liver tissue. The difference between the features of the two areas is then used as the new lesion descriptor. Machine learning classifiers are then trained on the new descriptors to automatically classify liver lesions as benign or malignant. The experimental results show promising improvements. Moreover, the proposed approach can overcome the problems of varying ranges of intensity and texture between patients, demographics, and imaging devices and settings.
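
As an illustration only (not the authors' code), a minimal scikit-learn sketch of the contrasting feature-difference idea is given below; the feature choices, masks, and toy data are hypothetical.

    import numpy as np
    from sklearn.svm import SVC
    from sklearn.model_selection import cross_val_score

    def region_features(pixels):
        # Simple intensity statistics for one region (an illustrative feature choice).
        return np.array([pixels.mean(), pixels.std(),
                         np.percentile(pixels, 90), np.percentile(pixels, 10)])

    def contrast_descriptor(image, lesion_mask, surround_mask):
        # The descriptor is the difference between lesion and surrounding-tissue
        # features, which cancels patient- and scanner-dependent intensity offsets.
        return region_features(image[lesion_mask]) - region_features(image[surround_mask])

    # Toy data standing in for per-lesion descriptors (0 = benign, 1 = malignant).
    rng = np.random.default_rng(0)
    X = rng.normal(size=(40, 4))
    y = rng.integers(0, 2, 40)
    print(cross_val_score(SVC(kernel="rbf"), X, y, cv=5).mean())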

Keywords: CAD system, difference of feature, fuzzy c means, lesion detection, liver segmentation

Procedia PDF Downloads 325
1516 TerraEnhance: High-Resolution Digital Elevation Model Generation using GANs

Authors: Siddharth Sarma, Ayush Majumdar, Nidhi Sabu, Mufaddal Jiruwaala, Shilpa Paygude

Abstract:

Digital Elevation Models (DEMs) are digital representations of the Earth’s topography, which include information about the elevation, slope, aspect, and other terrain attributes. DEMs play a crucial role in various applications, including terrain analysis, urban planning, and environmental modeling. In this paper, TerraEnhance is proposed, a distinct approach for high-resolution DEM generation using Generative Adversarial Networks (GANs) combined with Real-ESRGANs. By learning from a dataset of low-resolution DEMs, the GANs are trained to upscale the data by 10 times, resulting in significantly enhanced DEMs with improved resolution and finer details. The integration of Real-ESRGANs further enhances visual quality, leading to more accurate representations of the terrain. A post-processing layer is introduced, employing high-pass filtering to refine the generated DEMs, preserving important details while reducing noise and artifacts. The results demonstrate that TerraEnhance outperforms existing methods, producing high-fidelity DEMs with intricate terrain features and exceptional accuracy. These advancements make TerraEnhance suitable for various applications, such as terrain analysis and precise environmental modeling.
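
As an illustration of the post-processing step described above, a minimal sketch of high-pass refinement of an upscaled DEM follows; the filter choice (Gaussian low-pass) and parameters are assumptions, not the authors' implementation.

    import numpy as np
    from scipy.ndimage import gaussian_filter

    def highpass_refine(dem_upscaled, sigma=2.0, strength=0.5):
        low = gaussian_filter(dem_upscaled, sigma=sigma)   # low-pass: smooth terrain trend
        high = dem_upscaled - low                          # high-pass: ridges, valleys, edges
        return dem_upscaled + strength * high              # re-emphasize fine terrain detail

    dem = np.random.rand(256, 256).astype(np.float32)      # stand-in for a GAN-upscaled DEM
    print(highpass_refine(dem).shape)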

Keywords: DEM, ESRGAN, image upscaling, super resolution, computer vision

Procedia PDF Downloads 10
1515 Financial Analysis of Foreign Direct Investment in Mexico

Authors: Juan Peña Aguilar, Lilia Villasana, Rodrigo Valencia, Alberto Pastrana, Martin Vivanco, Juan Peña C

Abstract:

Each year a growing number of companies enter Mexico in search of domestic market share. These activities, including retail, long-distance and local telephony, raw materials and energy, and particularly the financial sector, have managed to significantly increase their weight in the flows of FDI into Mexico. However, one should consider whether these FDI trends are positive for the Mexican economy and whether these activities increase Mexican exports in the medium term, as well as their share in GDP, gross fixed capital formation, and employment. In general, it is stressed that these activities have, by far, been unable to generate significant linkages with the rest of the economy, a process that has not been favored by competitiveness policies and activities that are neutral or horizontal. Since the nineties, foreign direct investment (FDI) has shown remarkable dynamism, internationally, in Latin America, and in Mexico. Mexico was the most important recipient of FDI in Latin America during 1990-1995 and was later displaced by Brazil; FDI increased from levels below 1% of GDP during the eighties to around 3% of GDP during the nineties. Its impact has been significant not only from a macroeconomic perspective; it has also allowed the generation of a new industrial production structure and organization, in parallel with a significant modernization of a segment of the economy. The case of Mexico is also particularly interesting and relevant because the destination of FDI until 1993 had focused on the purchase of state assets during the privatization process. This paper aims to present FDI flows in Mexico and analyze the different business strategies that have been touched and encouraged by FDI. On the one hand, it briefly discusses regulatory issues and the source and recipient sectors of FDI. Furthermore, the paper presents in more detail the impacts and changes that FDI generated in the Mexican economy, and it examines the macroeconomic context and later legislative changes that resulted in the current regulations around FDI in Mexico, including aspects of the North American Free Trade Agreement (NAFTA). It is worth noting that foreign investment cannot be considered only from the perspective of the receiving economic units. Instead, these flows also reflect the strategic interests of transnational corporations (TNCs) and other companies seeking access to markets and increased competitiveness of their production and global distribution networks, among other reasons. Similarly, it is important to note that foreign investment in its various forms is critically dependent on historical and temporal aspects. Thus, the same functionality can vary significantly depending on the specific characteristics of both the recipient units and the sources of FDI, including macroeconomic, institutional, industrial organization, and social aspects, among others.

Keywords: foreign direct investment (FDI), competitiveness, neoliberal regime, globalization, gross domestic product (GDP), NAFTA, macroeconomic

Procedia PDF Downloads 450
1514 A Comparative Study of Global Power Grids and Global Fossil Energy Pipelines Using GIS Technology

Authors: Wenhao Wang, Xinzhi Xu, Limin Feng, Wei Cong

Abstract:

This paper comprehensively investigates the current development status of global power grids and fossil energy pipelines (oil and natural gas) and proposes a standard visual platform for global power and fossil energy based on Geographic Information System (GIS) technology. In this visual platform, a series of systematic visual models is proposed using global spatial data and systematic energy and power parameters. Under this visual platform, the current Global Power Grids Map and Global Fossil Energy Pipelines Map are plotted, covering more than 140 countries and regions across the world. Using multi-scale fusion data processing and modeling methods, a basic database for the global fossil energy pipelines and power grids information system is established, which provides important data supporting global fossil energy and electricity research. Finally, through the systematic and comparative study of global fossil energy pipelines and global power grids, the general status of global fossil energy and electricity development is reviewed, and the energy transition in key areas is evaluated and analyzed. Through the comparative analysis of fossil energy and clean energy, the direction of relevant research is pointed out for clean development and the energy transition.

Keywords: energy transition, geographic information system, fossil energy, power systems

Procedia PDF Downloads 151
1513 Alloy Design of Single Crystal Ni-base Superalloys by Combined Method of Neural Network and CALPHAD

Authors: Mehdi Montakhabrazlighi, Ercan Balikci

Abstract:

The neural network (NN) method is applied to the alloy development of single crystal Ni-base superalloys with low density and improved mechanical strength. A dataset of 1200 entries, which includes the chemical composition of the alloys, applied stress, and temperature as inputs and density and time to rupture as outputs, is used for training and testing the network. Thermodynamic phase diagram modeling of the screened alloys is performed with the Thermocalc software to model the equilibrium phases and the microsegregation in solidification processing. The model is first trained with 80% of the data, and the remaining 20% is used to test it. Comparing the predicted values with the experimental ones showed that a well-trained network is capable of accurately predicting the density and time-to-rupture strength of Ni-base superalloys. The modeling results are used to determine the effect of alloying elements, stress, temperature, and gamma-prime phase volume fraction on the rupture strength of Ni-base superalloys. This approach is in line with the Materials Genome Initiative and the integrated computational materials engineering approaches promoted recently with the aim of reducing the cost and time for the development of new alloys for critical aerospace components. This work has been funded by TUBITAK under grant number 112M783.
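
A hedged sketch of the regression step described above follows, assuming a scikit-learn multi-output network; the input/output dimensions and placeholder data are illustrative only, not the trained model from the study.

    import numpy as np
    from sklearn.model_selection import train_test_split
    from sklearn.neural_network import MLPRegressor
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import StandardScaler

    rng = np.random.default_rng(42)
    X = rng.random((1200, 12))   # e.g., alloying-element contents plus stress and temperature
    y = rng.random((1200, 2))    # density and time to rupture (placeholder values)

    X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=42)
    model = make_pipeline(StandardScaler(),
                          MLPRegressor(hidden_layer_sizes=(32, 16), max_iter=2000, random_state=42))
    model.fit(X_train, y_train)
    print("R^2 on the 20% hold-out set:", model.score(X_test, y_test))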

Keywords: neural network, rupture strength, superalloy, thermocalc

Procedia PDF Downloads 315
1512 Genome Sequencing of the Yeast Saccharomyces cerevisiae Strain 202-3

Authors: Yina A. Cifuentes Triana, Andrés M. Pinzón Velásco, Marío E. Velásquez Lozano

Abstract:

In this work, the sequencing and genome characterization of a natural isolate of the yeast Saccharomyces cerevisiae (strain 202-3), identified as having potential for the production of second-generation ethanol from sugarcane bagasse hydrolysates, is presented. This strain was selected because of its capability to consume xylose during the fermentation of sugarcane bagasse hydrolysates, taking into account that many strains of S. cerevisiae are incapable of processing this sugar. This advantage, and other prominent positive aspects of the fermentation profiles evaluated in bagasse hydrolysates, made strain 202-3 a candidate strain for improving the production of second-generation ethanol, and studying the strain at the genomic level was proposed as a first step. The molecular characterization was carried out by paired-end genome sequencing on the Illumina HiSeq 2000 platform; the assembly was performed with different programs, finally choosing the ABySS assembler with a k-mer of 89. Gene prediction was performed with Augustus, which follows a hidden Markov model approach. The identified genes were scored based on similarity with public nucleotide and protein databases. Records were organized by ontological functions at different hierarchical levels, which identified central metabolic functions and roles of the S. cerevisiae strain 202-3, highlighting the presence of four possible new proteins, two of them probably associated with the positive consumption of xylose.

Keywords: cellulosic ethanol, Saccharomyces cerevisiae, genome sequencing, xylose consumption

Procedia PDF Downloads 320
1511 Cladding Technology for Metal-Hybrid Composites with Network-Structure

Authors: Ha-Guk Jeong, Jong-Beom Lee

Abstract:

The cladding process is a typical technology for manufacturing composite materials by hydrostatic extrusion. Because there is no friction between the metal and the container, a uniform flow can easily be obtained during deformation. The general solid-state manufacturing process for a metal-matrix composite involves mixing metal powders and ceramic powders in a suitable volume ratio before compressing or extruding them in a can under cold or hot conditions. Because it requires a number of unit processing steps to physically mix and disperse materials with very different characteristics, the process is complicated and leads to non-uniform dispersion of the ceramics. It is difficult to reach uniform, ideal properties because of coherence problems at the interface between the metal and the ceramic reinforcements. The metal hybrid composites presented in this report are manufactured through traditional plastic deformation processes such as hydrostatic extrusion, caliber rolling, and drawing. With these processes, the realization of uniform macro- and microstructures is certainly possible. In this study, aluminum, copper, and titanium were used as constituent materials; by adjusting the component ratio, it was possible to produce metal hybrid composites that maximize the excellent characteristics of each material. MgB₂ superconductor wire was also fabricated via the same process. Their unique electronic and thermal characteristics will be introduced.

Keywords: cladding process, metal-hybrid composites, hydrostatic extrusion, electronic/thermal characteristics

Procedia PDF Downloads 180
1510 Application of Supervised Deep Learning-based Machine Learning to Manage Smart Homes

Authors: Ahmed Al-Adaileh

Abstract:

Renewable energy sources, domestic storage systems, controllable loads, and machine learning technologies will be key components of future smart home management systems. An energy management scheme that uses a Deep Learning (DL) approach to support smart home management systems, which consist of a standalone photovoltaic system, a storage unit, a heating, ventilation, and air-conditioning system, and a set of conventional and smart appliances, is presented. The objective of the proposed scheme is to apply DL-based machine learning to predict various running parameters within a smart home's environment in order to achieve maximum comfort levels for occupants, reduced electricity bills, and less dependency on the public grid. The problem is formulated using reinforcement learning, where decisions are taken by applying a continuous-time Markov decision process. The main contribution of this research is the proposed framework, which applies DL to enhance the system's supervised dataset and thereby effectively support smart home systems. A case study involving a set of conventional and smart appliances with dedicated processing units in an inhabited building demonstrates the validity of the proposed framework. A visualization graph shows the 'before' and 'after' results.
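
The following toy sketch illustrates the decision loop only, using tabular Q-learning on a discretized appliance-scheduling problem; the paper formulates a continuous-time Markov decision process, so the state space, reward, and parameters below are purely hypothetical.

    import numpy as np

    n_hours, n_actions = 24, 2          # state: hour of day; action: 0 = defer load, 1 = run load
    Q = np.zeros((n_hours, n_actions))
    alpha, gamma, eps = 0.1, 0.95, 0.1
    rng = np.random.default_rng(0)

    def reward(hour, action):
        # Hypothetical reward: running a load during solar hours (10-16) is cheap, otherwise costly.
        if action == 0:
            return 0.0
        return 1.0 if 10 <= hour <= 16 else -0.5

    for episode in range(2000):
        for hour in range(n_hours):
            a = rng.integers(n_actions) if rng.random() < eps else int(Q[hour].argmax())
            nxt = (hour + 1) % n_hours
            Q[hour, a] += alpha * (reward(hour, a) + gamma * Q[nxt].max() - Q[hour, a])

    print("Preferred action per hour:", Q.argmax(axis=1))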

Keywords: smart homes systems, machine learning, deep learning, Markov Decision Process

Procedia PDF Downloads 202
1509 Roughness Discrimination Using Bioinspired Tactile Sensors

Authors: Zhengkun Yi

Abstract:

Surface texture discrimination using artificial tactile sensors has attracted increasing attention in the past decade, as it can endow technical and robot systems with a key missing ability. However, as a major component of texture, roughness has rarely been explored. This paper presents an approach for tactile surface roughness discrimination, which includes two parts: (1) design and fabrication of a bioinspired artificial fingertip, and (2) tactile signal processing for tactile surface roughness discrimination. The bioinspired fingertip is comprised of two polydimethylsiloxane (PDMS) layers, a polymethyl methacrylate (PMMA) bar, and two perpendicular polyvinylidene difluoride (PVDF) film sensors. This artificial fingertip mimics human fingertips in three aspects: (1) the elastic properties of the epidermis and dermis in human skin are replicated by the two PDMS layers with different stiffness, (2) the PMMA bar serves a role analogous to that of a bone, and (3) the PVDF film sensors emulate Meissner’s corpuscles in terms of both location and response to vibratory stimuli. Various extracted features and classification algorithms, including support vector machines (SVM) and k-nearest neighbors (kNN), are examined for tactile surface roughness discrimination. Eight standard rough surfaces with roughness values (Ra) of 50 μm, 25 μm, 12.5 μm, 6.3 μm, 3.2 μm, 1.6 μm, 0.8 μm, and 0.4 μm are explored. The highest classification accuracy of (82.6 ± 10.8)% is achieved using solely one PVDF film sensor with the kNN (k = 9) classifier and the standard deviation feature.
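
A minimal sketch of the reported best-performing configuration (a single PVDF channel, the standard deviation feature, and kNN with k = 9) is given below; the signals are synthetic stand-ins, not the recorded tactile data.

    import numpy as np
    from sklearn.neighbors import KNeighborsClassifier
    from sklearn.model_selection import cross_val_score

    rng = np.random.default_rng(1)
    n_classes, n_trials = 8, 30                          # eight Ra values, 30 trials each
    signals = [rng.normal(0.0, 0.2 + 0.1 * c, size=(n_trials, 500)) for c in range(n_classes)]

    X = np.vstack([s.std(axis=1, keepdims=True) for s in signals])   # one std feature per trial
    y = np.repeat(np.arange(n_classes), n_trials)

    scores = cross_val_score(KNeighborsClassifier(n_neighbors=9), X, y, cv=5)
    print(f"accuracy: {scores.mean():.2f} +/- {scores.std():.2f}")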

Keywords: bioinspired fingertip, classifier, feature extraction, roughness discrimination

Procedia PDF Downloads 312
1508 Bakla Po Ako (I Am Gay): A Case Study on the Communication Styles of Selected Filipino Gays in Disclosing Their Sexual Orientation to Their Parents

Authors: Bryan Christian Baybay, M. Francesca Ronario

Abstract:

This study is intended to answer the question “What are the communication styles of selected Filipino gays in breaking their silence on their sexual orientation to their parents?” In this regard, six cases of Filipino gay disclosures were examined through in-depth interviews. The participants were selected through purposive sampling and the snowball technique. The theories of Rhetorical Sensitivity by Roderick Hart and Communicator Style by Robert Norton were used to analyze the gathered data and to support the communication attitudes, message processing, message rendering, and communication styles exhibited in each disclosure. As secondary data and for validation, parents and experts in the fields of communication, sociology, and psychology were also interviewed and consulted. The study found that Filipino gays vary in the communication styles they use during the disclosure to their parents. All communication styles (impression-leaving, contentious, open, dramatic, dominant, precise, relaxed, friendly, animated, and communicator image) were observed among the gays, depending on their motivation, relationship, and the thoughts contemplated. These results lend ideas for future researchers to look into the communication patterns and/or styles of lesbians, bisexuals, transgenders, and queers, to expand research on the same subject, and to utilize Social Judgment and Relational Dialectics theories in determining and analyzing LGBTQ communication.

Keywords: communication attitudes, communication styles, Filipino gays, self-disclosure, sexual orientation

Procedia PDF Downloads 523
1507 Identifying Biomarker Response Patterns to Vitamin D Supplementation in Type 2 Diabetes Using K-means Clustering: A Meta-Analytic Approach to Glycemic and Lipid Profile Modulation

Authors: Oluwafunmibi Omotayo Fasanya, Augustine Kena Adjei

Abstract:

Background and Aims: This meta-analysis aimed to evaluate the effect of vitamin D supplementation on key metabolic and cardiovascular parameters, such as glycated hemoglobin (HbA1C), fasting blood sugar (FBS), low-density lipoprotein (LDL), high-density lipoprotein (HDL), systolic blood pressure (SBP), and total vitamin D levels in patients with Type 2 diabetes mellitus (T2DM). Methods: A systematic search was performed across databases, including PubMed, Scopus, Embase, Web of Science, Cochrane Library, and ClinicalTrials.gov, from January 1990 to January 2024. A total of 4,177 relevant studies were initially identified. Using an unsupervised K-means clustering algorithm, publications were grouped based on common text features. Maximum entropy classification was then applied to filter studies that matched a pre-identified training set of 139 potentially relevant articles. These selected studies were manually screened for relevance. A parallel manual selection of all initially searched studies was conducted for validation. The final inclusion of studies was based on full-text evaluation, quality assessment, and meta-regression models using random effects. Sensitivity analysis and publication bias assessments were also performed to ensure robustness. Results: The unsupervised K-means clustering algorithm grouped the patients based on their responses to vitamin D supplementation, using key biomarkers such as HbA1C, FBS, LDL, HDL, SBP, and total vitamin D levels. Two primary clusters emerged: one representing patients who experienced significant improvements in these markers and another showing minimal or no change. Patients in the cluster associated with significant improvement exhibited lower HbA1C, FBS, and LDL levels after vitamin D supplementation, while HDL and total vitamin D levels increased. The analysis showed that vitamin D supplementation was particularly effective in reducing HbA1C, FBS, and LDL within this cluster. Furthermore, BMI, weight gain, and disease duration were identified as factors that influenced cluster assignment, with patients having lower BMI and shorter disease duration being more likely to belong to the improvement cluster. Conclusion: The findings of this machine learning-assisted meta-analysis confirm that vitamin D supplementation can significantly improve glycemic control and reduce the risk of cardiovascular complications in T2DM patients. The use of automated screening techniques streamlined the process, ensuring the comprehensive evaluation of a large body of evidence while maintaining the validity of traditional manual review processes.
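
As an illustration of the clustering step, the sketch below applies K-means with two clusters to standardized changes in the six biomarkers; the synthetic data and effect sizes are invented for demonstration and do not reproduce the meta-analysis.

    import numpy as np
    from sklearn.cluster import KMeans
    from sklearn.preprocessing import StandardScaler

    rng = np.random.default_rng(7)
    # Columns: change in HbA1C, FBS, LDL, HDL, SBP, and total vitamin D after supplementation.
    improved = rng.normal([-0.8, -15.0, -12.0, 4.0, -5.0, 10.0], 1.0, size=(60, 6))
    unchanged = rng.normal([0.0, 0.0, 0.0, 0.0, 0.0, 1.0], 1.0, size=(60, 6))
    X = np.vstack([improved, unchanged])

    labels = KMeans(n_clusters=2, n_init=10, random_state=7).fit_predict(StandardScaler().fit_transform(X))
    print("patients per cluster:", np.bincount(labels))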

Keywords: HbA1C, T2DM, SBP, FBS

Procedia PDF Downloads 12
1506 The European Research and Development Project Improved Nuclear Site Characterization for Waste Minimization in Decommissioning under Constrained Environment: Focus on Performance Analysis and Overall Uncertainty

Authors: M. Crozet, D. Roudil, T. Branger, S. Boden, P. Peerani, B. Russell, M. Herranz, L. Aldave de la Heras

Abstract:

The EURATOM work program project INSIDER (Improved Nuclear Site Characterization for Waste minimization in Decommissioning under Constrained Environment) was launched in June 2017. This 4-year project has 18 partners and aims at improving the management of contaminated materials arising from decommissioning and dismantling (D&D) operations by proposing an integrated characterization methodology. This methodology is based on advanced statistical processing and modelling, coupled with adapted and innovative analytical and measurement methods, with respect to sustainability and economic objectives. In order to achieve these objectives, the approaches will then be applied to common case studies in the form of inter-laboratory comparisons on matrix-representative reference samples and benchmarking. Work Package 6 (WP6), ‘Performance analysis and overall uncertainty’, is in charge of the analysis of the benchmarking on real samples, the organisation of inter-laboratory comparisons on synthetic certified reference materials, and the establishment of the overall uncertainty budget. Assessment of the outcome will be used to provide recommendations and guidance resulting in pre-standardization tests.

Keywords: decommissioning, sampling strategy, research and development, characterization, European project

Procedia PDF Downloads 364
1505 Recommendations Using Online Water Quality Sensors for Chlorinated Drinking Water Monitoring at Drinking Water Distribution Systems Exposed to Glyphosate

Authors: Angela Maria Fasnacht

Abstract:

Detection of anomalies due to the presence of contaminants, also known as early detection in water treatment plants, has become a critical point that deserves in-depth study for its improvement and adaptation to current requirements. The design of these systems requires detailed analysis and processing of the data in real time, so it is necessary to apply various statistical methods appropriate to the data generated, such as Spearman’s correlation, factor analysis, cross-correlation, and k-fold cross-validation. Statistical analysis and methods allow the evaluation of large data sets to model the behavior of variables; in this sense, statistical treatment or analysis can be considered a vital step in developing advanced machine learning models that allow optimized data management in real time, applied to early detection systems in water treatment processes. These techniques also facilitate the development of new technologies used in advanced sensors. In this work, these methods were applied to identify possible correlations between the measured parameters and the presence of the glyphosate contaminant in a single-pass system. The effect of the initial glyphosate concentration and the location of the sensors on the readings of the reported parameters was studied.
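
The sketch below illustrates one of the named methods, Spearman's correlation, applied to a hypothetical sensor table; the column names and values are assumptions for demonstration only.

    import numpy as np
    import pandas as pd
    from scipy.stats import spearmanr

    rng = np.random.default_rng(3)
    df = pd.DataFrame({
        "glyphosate_mg_L": rng.uniform(0.0, 5.0, 200),
        "free_chlorine": rng.uniform(0.2, 1.2, 200),
        "conductivity": rng.uniform(200.0, 600.0, 200),
        "pH": rng.uniform(6.5, 8.5, 200),
    })

    for col in ["free_chlorine", "conductivity", "pH"]:
        rho, p = spearmanr(df["glyphosate_mg_L"], df[col])
        print(f"{col}: rho = {rho:.2f}, p = {p:.3f}")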

Keywords: glyphosate, emergent contaminants, machine learning, probes, sensors, predictive

Procedia PDF Downloads 122
1504 Improving Perceptual Reasoning in School Children through Chess Training

Authors: Ebenezer Joseph, Veena Easvaradoss, S. Sundar Manoharan, David Chandran, Sumathi Chandrasekaran, T. R. Uma

Abstract:

Perceptual reasoning is the ability that incorporates fluid reasoning, spatial processing, and visual motor integration. Several theories of cognitive functioning emphasize the importance of fluid reasoning. The ability to manipulate abstractions and rules and to generalize is required for reasoning tasks. This study, funded by the Cognitive Science Research Initiative, Department of Science and Technology, Government of India, analyzed the effect of 1-year chess training on the perceptual reasoning of children. A pretest–posttest with control group design was used, with 43 (28 boys, 15 girls) children in the experimental group and 42 (26 boys, 16 girls) children in the control group. The sample was selected from children studying in two private schools from South India (grades 3 to 9), which included both the genders. The experimental group underwent weekly 1-hour chess training for 1 year. Perceptual reasoning was measured by three subtests of WISC-IV INDIA. Pre-equivalence of means was established. Further statistical analyses revealed that the experimental group had shown statistically significant improvement in perceptual reasoning compared to the control group. The present study clearly establishes a correlation between chess learning and perceptual reasoning. If perceptual reasoning can be enhanced in children, it could possibly result in the improvement of executive functions as well as the scholastic performance of the child.

Keywords: chess, cognition, intelligence, perceptual reasoning

Procedia PDF Downloads 357
1503 Photoemission Momentum Microscopy of Graphene on Ir (111)

Authors: Anna V. Zaporozhchenko, Dmytro Kutnyakhov, Katherina Medjanik, Christian Tusche, Hans-Joachim Elmers, Olena Fedchenko, Sergey Chernov, Martin Ellguth, Sergej A. Nepijko, Gerd Schoenhense

Abstract:

Graphene reveals a unique electronic structure that predetermines many intriguing properties such as massless charge carriers, optical transparency, and a high velocity of fermions at the Fermi level, opening a wide horizon of future applications. Hence, a detailed investigation of the electronic structure of graphene is crucial. The method of choice is angle-resolved photoelectron spectroscopy (ARPES). Here we present experiments using time-of-flight (ToF) momentum microscopy, an alternative way of performing ARPES using full-field imaging of the whole Brillouin zone (BZ) and simultaneous acquisition of up to several hundred energy slices. Unlike conventional ARPES, k-microscopy is not limited in simultaneous k-space access. We have recorded the whole first BZ of graphene on Ir(111), including all six Dirac cones. As the excitation source we used synchrotron radiation from BESSY II (Berlin) at the U125-2 NIM, providing linearly polarized (both p- and s-polarization) VUV radiation. The instrument uses a delay-line detector for single-particle detection up to the 5 Mcps range and parallel energy detection via ToF recording. In this way, we gather a 3D data stack I(E,kx,ky) of the full valence electronic structure in approximately 20 minutes. Band dispersion stacks were measured in the energy range of 14 eV up to 23 eV in steps of 1 eV. The linearly dispersing graphene bands for all six K and K’ points were simultaneously recorded. We find clear features of hybridization with the substrate, in particular in the linear dichroism in the angular distribution (LDAD). Recording the whole Brillouin zone of graphene/Ir(111) revealed new features. First, the intensity differences (i.e., the LDAD) are very sensitive to the interaction of the graphene bands with substrate bands. Second, the dark corridors are investigated in detail for both p- and s-polarized radiation. They appear as local distortions of the photoelectron current distribution and are induced by quantum mechanical interference of the graphene sublattices. The dark corridors are located in different areas of the six Dirac cones and show chiral behaviour with a mirror plane along the vertical axis. Moreover, two out of six have an oval shape while the rest are more circular, which clearly indicates an orientation dependence with respect to the E vector of the incident light. Third, a pattern of faint but very sharp lines, strongly reminiscent of Kikuchi lines in diffraction, is visible at energies around 22 eV. In conclusion, the simultaneous study of all six Dirac cones is crucial for a complete understanding of the dichroism phenomena and the dark corridor.

Keywords: band structure, graphene, momentum microscopy, LDAD

Procedia PDF Downloads 340
1502 A Life Cycle Assessment (LCA) of Aluminum Production Process

Authors: Alaa Al Hawari, Mohammad Khader, Wael El Hasan, Mahmoud Alijla, Ammar Manawi, Abdelbaki Benamour

Abstract:

The production of aluminium alloys and ingots, starting from the processing of alumina to aluminium and ending with the final cast product, was studied using a Life Cycle Assessment (LCA) approach. The studied aluminium supply chain consisted of a carbon plant, a reduction plant, a casting plant, and a power plant. In the LCA model, the environmental loads of the different plants for the production of 1 ton of aluminium metal were investigated. The impact of the aluminium production was assessed in eight impact categories. The results showed that the power plant had the highest impact in all of the impact categories, except that for Human Toxicity Potential (HTP) the reduction plant had the highest impact and for Marine Aquatic Eco-Toxicity Potential (MAETP) the carbon plant had the highest impact. Furthermore, the combined impact of the carbon plant and the reduction plant was almost the same as the impact of the power plant in the case of the Acidification Potential (AP). The carbon plant had a positive impact on the environment when it comes to the Eutrophication Potential (EP) due to the production of clean water in the process. The natural gas based power plant used in the case study had 8.4 times less negative impact on the environment when compared to the heavy fuel based power plant and 10.7 times less negative impact when compared to the hard coal based power plant.

Keywords: life cycle assessment, aluminium production, supply chain, ecological impacts

Procedia PDF Downloads 532
1501 Analyzing the Mission Drift of Social Business: Case Study of Restaurant Providing Professional Training to At-Risk Youth

Authors: G. Yanay-Ventura, H. Desivilya Syna, K. Michael

Abstract:

Social businesses are based on the idea that an enterprise can be established for the sake of profit and, at the same time, with the aim of fulfilling social goals. Yet, the question of how these goals can be integrated in practice to derive parallel benefit in both realms still needs to be examined. Particularly notable in this context is the ‘governance challenge’ of social businesses, meaning the danger that the mission drifts from the social goal in the pursuit of good business. This study is based on an evaluation study of a social business that operates as a restaurant providing professional training to at-risk youth. The evaluation was based on the collection of a variety of data through interviews with stakeholders in the enterprise (directors and managers, business partners, social partners, and position holders in the restaurant and the social enterprise), a focus group consisting of the youth receiving the professional training, observations of the restaurant’s operation, and analysis of the social enterprise’s primary documents. The evaluation highlighted significant strengths of the social enterprise, including reaching business sustainability relatively fast, effective management of the restaurant, stable employment of the restaurant staff, and effective management of the social project. The social enterprise and the business management have both enjoyed positive evaluations from a variety of stakeholders. Clearly, the restaurant was deemed by all a promising young business. However, the social project suffered from a 90% dropout rate among the youth entering its ranks, extreme monthly fluctuation in the number of youths participating, and the fact that only a distinct minority of the youth succeeded in completing their training period. Possible explanations of the high dropout rate included the small number of cooks, which impeded the effectiveness of the training process and the provision of advanced cooking skills; a lack of clarity regarding the essence and the elements of the training; and the lack of a meaningful peer group for the youth engaged in the program. Paradoxically, despite the stakeholders’ great appreciation for the social enterprise, the challenge of governability was also formidable, revealing a tangible risk of mission drift in the reduction of the social enterprise’s target population and a breach of the commitment made to the youth with regard to practical training. The risk of mission drift emerged as a hidden and evasive issue for the stakeholders, who revealed a deep appreciation for the management and the outcomes of the social enterprise. The challenge of integration, therefore, requires an in-depth examination of how to maintain a successful business without hindering the achievement of the social goal. The study concludes that clear conceptualization of the training process and its aims, increased cooks’ participation in the social project, and novel conceptions with regard to the evaluation of success could serve to benefit the youth and impede mission drift.

Keywords: evaluation study, management, mission drift, social business

Procedia PDF Downloads 112
1498 Reliable and Error-Free Transmission through Multimode Polymer Optical Fibers in In-House Networks

Authors: Tariq Ahamad, Mohammed S. Al-Kahtani, Taisir Eldos

Abstract:

Optical communications technology has made enormous and steady progress for several decades, providing the key resource in our increasingly information-driven society and economy. Much of this progress has been in finding innovative ways to increase the data-carrying capacity of a single optical fiber. In this research article, we explore basic issues of security and reliability for information transfer through the fiber infrastructure. Conspicuously, one potentially enormous source of improvement has, however, been left untapped in these systems: fibers can easily support hundreds of spatial modes, but today’s commercial systems (single-mode or multi-mode) make no attempt to use these as parallel channels for independent signals. Bandwidth, performance, reliability, cost efficiency, resiliency, redundancy, and security are some of the demands placed on telecommunications today. Since its initial development, fiber optics has had the advantage in most of these requirements over copper-based and wireless telecommunications solutions. The largest obstacle preventing most businesses from implementing fiber optic systems was cost. With the recent advancements in fiber optic technology and the ever-growing demand for more bandwidth, the cost of installing and maintaining fiber optic systems has been reduced dramatically. With so many advantages, including cost efficiency, fiber optic systems will continue to increasingly replace copper-based communications. This will also lead to an increase in the expertise and the technology needed for intruders to tap into fiber optic networks. As ever before, all technologies have been subject to hacking and criminal manipulation, and fiber optics is no exception. Research into fiber optic security vulnerabilities suggests that not everyone who is responsible for their network’s security is aware of the different methods that intruders use to hack, virtually undetected, into fiber optic cables. With millions of miles of fiber optic cables stretching across the globe and carrying information including, but certainly not limited to, government, military, and personal information such as medical records, banking information, driving records, and credit card information, being aware of fiber optic security vulnerabilities is essential and critical. Many articles and studies still suggest that fiber optics is expensive, impractical, and hard to tap. Others argue that tapping is not only easily done but also inexpensive. This paper briefly discusses the history of fiber optics, explains the basics of fiber optic technologies, and then discusses the vulnerabilities in fiber optic systems and how they can be better protected. Knowing the security risks and the options available may save a company a lot of embarrassment, time, and, most importantly, money.

Keywords: in-house networks, fiber optics, security risk, money

Procedia PDF Downloads 420
1499 Urban Open Source: Synthesis of a Citizen-Centric Framework to Design Densifying Cities

Authors: Shaurya Chauhan, Sagar Gupta

Abstract:

Prominent urbanizing centres across the globe, like Delhi, Dhaka, or Manila, have exhibited that development often faces a challenge in bridging the gap between the top-down collective requirements of the city and the bottom-up individual aspirations of the ever-diversifying population. When this exclusion is intertwined with rapid urbanization and a diversifying urban demography, unplanned sprawl, poor planning, and low-density development emerge as automated responses. In parallel, new ideas and methods of densification and public participation are being widely adopted as sustainable alternatives for the future of urban development. This research advocates a collaborative design method for future development: one that allows rapid application with its prototypical nature and an inclusive approach with mediation between the 'user' and the 'urban', purely with the use of empirical tools. Building upon the concepts and principles of 'open-sourcing' in design, the research establishes a design framework that serves current user requirements while allowing for future citizen-driven modifications. This is synthesized as a 3-tiered model: user needs, design ideology, and adaptive details. The research culminates in a context-responsive 'open source project development framework' (hereinafter referred to as OSPDF) that can be used for on-ground field applications. To bring forward specifics, the research looks at a 300-acre redevelopment in the core of a rapidly urbanizing city as a case encompassing extreme physical, demographic, and economic diversity. The suggestive measures also integrate the region’s cultural identity and social character with the diverse citizen aspirations, using architecture and urban design tools and references from recognized literature. This framework, based on a vision-feedback-execution loop, is used for hypothetical development at the five prevalent scales in design: master planning, urban design, architecture, tectonics, and modularity, in a chronological manner. At each of these scales, the possible approaches and avenues for open-sourcing are identified and validated through hit-and-trial, and subsequently recorded. The research attempts to re-calibrate the architectural design process and make it more responsive and people-centric. Analytical tools such as Space, Event, and Movement by Bernard Tschumi and the Five-Point Mental Map by Kevin Lynch, among others, are deep-rooted in the research process. Over the five-part OSPDF, a two-part subsidiary process is also suggested after each cycle of application, for continued appraisal and refinement of the framework and the urban fabric over time. The research is an exploration of the possibilities for an architect to adopt the new role of a 'mediator' in the development of contemporary urbanity.

Keywords: open source, public participation, urbanization, urban development

Procedia PDF Downloads 149
1498 Random Forest Classification for Population Segmentation

Authors: Regina Chua

Abstract:

To reduce the costs of re-fielding a large survey, a Random Forest classifier was applied to measure the accuracy of classifying individuals into their assigned segments with the fewest possible questions. Given a long survey, one needed to determine the most predictive ten or fewer questions that would accurately assign new individuals to custom segments. Furthermore, the solution needed to be quick in its classification and usable in non-Python environments. In this paper, a supervised Random Forest classifier was modeled on a dataset with 7,000 individuals, 60 questions, and 254 features. The Random Forest consisted of a collection of individual decision trees whose combined prediction yields robust precision and recall scores compared to a single tree. A random 70-30 stratified split was used for training the algorithm, and accuracy trade-offs at different depths for each segment were identified. Ultimately, the Random Forest classifier performed at 87% accuracy at a depth of 10 with 20 instead of 254 features and 10 instead of 60 questions. With an acceptable accuracy in prioritizing feature selection, new tools were developed for non-Python environments: a worksheet with a formulaic version of the algorithm and an embedded function to predict the segment of an individual in real time. Random Forest was determined to be an optimal classification model by its feature selection, performance, processing speed, and flexible application in other environments.
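
A minimal sketch of the described workflow (stratified 70-30 split, depth-limited Random Forest, feature-importance ranking) follows; the synthetic data are placeholders, so the printed accuracies will not match the reported 87%.

    import numpy as np
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.metrics import accuracy_score
    from sklearn.model_selection import train_test_split

    rng = np.random.default_rng(5)
    X = rng.random((7000, 254))          # 254 features derived from the survey
    y = rng.integers(0, 5, 7000)         # five custom segments (placeholder labels)

    X_tr, X_te, y_tr, y_te = train_test_split(X, y, train_size=0.7, stratify=y, random_state=5)
    rf = RandomForestClassifier(n_estimators=200, max_depth=10, random_state=5).fit(X_tr, y_tr)

    top20 = np.argsort(rf.feature_importances_)[::-1][:20]          # keep the 20 most predictive features
    rf_small = RandomForestClassifier(n_estimators=200, max_depth=10, random_state=5).fit(X_tr[:, top20], y_tr)
    print("all features:", accuracy_score(y_te, rf.predict(X_te)),
          "top 20:", accuracy_score(y_te, rf_small.predict(X_te[:, top20])))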

Keywords: machine learning, supervised learning, data science, random forest, classification, prediction, predictive modeling

Procedia PDF Downloads 94
1497 Multi-Label Approach to Facilitate Test Automation Based on Historical Data

Authors: Warda Khan, Remo Lachmann, Adarsh S. Garakahally

Abstract:

The increasing complexity of software and its applicability in a wide range of industries, e.g., automotive, call for enhanced quality assurance techniques. Test automation is one option to tackle the prevailing challenges by supporting test engineers with fast, parallel, and repetitive test executions. A high degree of test automation allows for a shift from mundane (manual) testing tasks to a more analytical assessment of the software under test. However, a high initial investment of test resources is required to establish test automation, which is, in most cases, a limitation to the time constraints provided for quality assurance of complex software systems. Hence, a computer-aided creation of automated test cases is crucial to increase the benefit of test automation. This paper proposes the application of machine learning for the generation of automated test cases. It is based on supervised learning to analyze test specifications and existing test implementations. The analysis facilitates the identification of patterns between test steps and their implementation with test automation components. For the test case generation, this approach exploits historical data of test automation projects. The identified patterns are the foundation to predict the implementation of unknown test case specifications. Based on this support, a test engineer solely has to review and parameterize the test automation components instead of writing them manually, resulting in a significant time reduction for establishing test automation. Compared to other generation approaches, this ML-based solution can handle different writing styles, authors, application domains, and even languages. Furthermore, test automation tools require expert knowledge by means of programming skills, whereas this approach only requires historical data to generate test cases. The proposed solution is evaluated using various multi-label evaluation criteria (EC) and two small-sized real-world systems. The most prominent EC is ‘Subset Accuracy’. The promising results show an accuracy of at least 86% for test cases where a 1:1 relationship (multi-class) between test step specification and test automation component exists. For complex multi-label problems, i.e., one test step can be implemented by several components, the prediction accuracy is still at 60%, which is better than the current state-of-the-art results. The prediction quality is expected to increase for larger systems with corresponding historical data. Consequently, this technique facilitates the time reduction for establishing test automation and is thereby independent of the application domain and project. As work in progress, the next steps are to investigate incremental and active learning as additions to increase the usability of this approach, e.g., in case labelled historical data is scarce.
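
The sketch below illustrates the multi-label setup and the 'Subset Accuracy' metric on a tiny invented corpus; the text-feature and classifier choices are assumptions, not the authors' pipeline.

    from sklearn.feature_extraction.text import TfidfVectorizer
    from sklearn.linear_model import LogisticRegression
    from sklearn.metrics import accuracy_score          # on binarized labels this is subset accuracy
    from sklearn.multiclass import OneVsRestClassifier
    from sklearn.preprocessing import MultiLabelBinarizer

    steps = ["switch ignition on", "set vehicle speed to 50 kph",
             "switch ignition off", "set vehicle speed to 80 kph and check warning lamp"]
    components = [["IgnitionCtrl"], ["SpeedCtrl"], ["IgnitionCtrl"], ["SpeedCtrl", "LampCheck"]]

    mlb = MultiLabelBinarizer()
    Y = mlb.fit_transform(components)                   # multi-label indicator matrix
    X = TfidfVectorizer().fit_transform(steps)

    clf = OneVsRestClassifier(LogisticRegression(max_iter=1000)).fit(X, Y)
    print("subset accuracy:", accuracy_score(Y, clf.predict(X)))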

Keywords: machine learning, multi-class, multi-label, supervised learning, test automation

Procedia PDF Downloads 132
1496 Production and Characterization of Ce3+: Si2N2O Phosphors for White Light-Emitting Diodes

Authors: Alparslan A. Balta, Hilmi Yurdakul, Orkun Tunckan, Servet Turan, Arife Yurdakul

Abstract:

Si2N2O (sinoite) is an inorganic oxynitride material that is a promising phosphor candidate for white light-emitting diodes (WLEDs). However, there is currently limited knowledge explaining the synthesis of Si2N2O for this purpose. Here, to the best of the authors’ knowledge, we report for the first time the production of Si2N2O-based phosphors from CeO2, SiO2, and Si3N4 as the main starting powders, with a Li2O sintering additive, through the spark plasma sintering (SPS) route. The processing parameters, e.g., pressure, temperature, and sintering time, were optimized to reach monophase Si2N2O-containing samples. The lattice parameter, crystallite size, and amounts of the phases formed were characterized in detail by X-ray diffraction (XRD). Grain morphology, particle size, and distribution were analyzed by scanning and transmission electron microscopy (SEM and TEM). Cathodoluminescence (CL) in the SEM and photoluminescence (PL) analyses were conducted on the samples to determine the excitation and emission characteristics of Ce3+-activated Si2N2O. Results showed that the Si2N2O phase was obtained in a ratio of up to 90% by sintering for 15 minutes at 1650 °C under 30 MPa pressure. Based on the SEM-CL and PL measurements, the Ce3+: Si2N2O phosphor shows a broad emission band between 400-700 nm that corresponds to white light. The present research was supported by TUBITAK under project number 217M667.

Keywords: cerium, oxynitride, phosphors, sinoite, Si₂N₂O

Procedia PDF Downloads 107
1495 Synthesis and Characterization of pH-Sensitive Graphene Quantum Dot-Loaded Metal-Organic Frameworks for Targeted Drug Delivery and Fluorescent Imaging

Authors: Sayed Maeen Badshah, Kuen-Song Lin, Abrar Hussain, Jamshid Hussain

Abstract:

Liver cancer is a significant global health issue, ranking fifth in incidence and second in mortality. Effective therapeutic strategies are urgently needed to combat this disease, particularly in regions with high prevalence. This study focuses on developing and characterizing fluorescent organometallic frameworks as distinct drug delivery carriers with potential applications in both the treatment and biological imaging of liver cancer. This work introduces two distinct organometallic frameworks: the cake-shaped GQD@NH₂-MIL-125 and the cross-shaped M8U6/FM8U6. The GQD@NH₂-MIL-125 framework is particularly noteworthy for its high fluorescence, making it an effective tool for biological imaging. X-ray diffraction (XRD) analysis revealed specific diffraction peaks at 6.81° (011), 9.76° (002), and 11.69° (121), with an additional significant peak at 26° (2θ) corresponding to the carbon material. Morphological analysis using Field Emission Scanning Electron Microscopy (FE-SEM) and Transmission Electron Microscopy (TEM) demonstrated that the framework has a front particle size of 680 nm and a side particle size of 55±5 nm. High-resolution TEM (HR-TEM) images confirmed the successful attachment of graphene quantum dots (GQDs) onto the NH₂-MIL-125 framework. Fourier-Transform Infrared (FT-IR) spectroscopy identified crucial functional groups within the GQD@NH₂-MIL-125 structure, including O-Ti-O metal bonds within the 500 to 700 cm⁻¹ range, and N-H and C-N bonds at 1,646 cm⁻¹ and 1,164 cm⁻¹, respectively. BET isotherm analysis further revealed a specific surface area of 338.1 m²/g and an average pore size of 46.86 nm. This framework also demonstrated UV-active properties, as identified by UV-visible light spectra, and its photoluminescence (PL) spectra showed an emission peak around 430 nm when excited at 350 nm, indicating its potential as a fluorescent drug delivery carrier. In parallel, the cross-shaped M8U6/FM8U6 frameworks were synthesized and characterized using X-ray diffraction, which identified distinct peaks at 2θ = 7.4 (111), 8.5 (200), 9.2 (002), 10.8 (002), 12.1 (220), 16.7 (103), and 17.1 (400). FE-SEM, HR-TEM, and TEM analyses revealed particle sizes of 350±50 nm for M8U6 and 200±50 nm for FM8U6. These frameworks, synthesized from terephthalic acid (H₂BDC), displayed notable vibrational bonds, such as C=O at 1,650 cm⁻¹, Fe-O in MIL-88 at 520 cm⁻¹, and Zr-O in UIO-66 at 482 cm⁻¹. BET analysis showed specific surface areas of 740.1 m²/g with a pore size of 22.92 nm for M8U6 and 493.9 m²/g with a pore size of 35.44 nm for FM8U6. Extended X-ray Absorption Fine Structure (EXAFS) spectra confirmed the stability of Ti-O bonds in the frameworks, with bond lengths of 2.026 Å for MIL-125, 1.962 Å for NH₂-MIL-125, and 1.817 Å for GQD@NH₂-MIL-125. These findings highlight the potential of these organometallic frameworks for enhanced liver cancer therapy through precise drug delivery and imaging, representing a significant advancement in nanomaterial applications in biomedical science.

Keywords: liver cancer cells, metal organic frameworks, Doxorubicin (DOX), drug release

Procedia PDF Downloads 11
1494 Segmentation of Liver Using Random Forest Classifier

Authors: Gajendra Kumar Mourya, Dinesh Bhatia, Akash Handique, Sunita Warjri, Syed Achaab Amir

Abstract:

Nowadays, medical imaging has become an integral part of modern healthcare. Abdominal CT images are an invaluable means for abdominal organ investigation and have been widely studied in recent years. Diagnosis of liver pathologies is one of the major areas of current interest in the field of medical image processing and is still an open problem. To study and diagnose the liver in depth, segmentation of the liver is done to identify which part of the liver is mostly affected. Manual segmentation of the liver in CT images is time-consuming and suffers from inter- and intra-observer differences. However, automatic or semi-automatic computer-aided segmentation of the liver is a challenging task due to inter-patient liver shape and size variability. In this paper, we present a technique for automatically segmenting the liver from CT images using a Random Forest classifier. Random forests, or random decision forests, are an ensemble learning method for classification that operates by constructing a multitude of decision trees at training time and outputting the class that is the mode of the classes of the individual trees. After comparing with various other techniques, it was found that the Random Forest classifier provides better segmentation results with respect to accuracy and speed. We have validated our results using various techniques, and the method shows above 89% accuracy in all cases.
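
As a mechanical illustration only, the sketch below classifies the pixels of a synthetic image with a Random Forest using two simple per-pixel features; a real CT pipeline would use richer 3D features and separate training data.

    import numpy as np
    from scipy.ndimage import uniform_filter
    from sklearn.ensemble import RandomForestClassifier

    rng = np.random.default_rng(2)
    image = rng.normal(0.0, 1.0, (128, 128))
    image[32:96, 32:96] += 2.0                          # bright square standing in for the liver
    truth = np.zeros((128, 128), dtype=int)
    truth[32:96, 32:96] = 1

    # Two per-pixel features: raw intensity and a 5x5 local mean.
    features = np.stack([image, uniform_filter(image, size=5)], axis=-1).reshape(-1, 2)
    labels = truth.reshape(-1)

    rf = RandomForestClassifier(n_estimators=50, random_state=2).fit(features, labels)
    pred = rf.predict(features).reshape(128, 128)
    print("pixel accuracy:", (pred == truth).mean())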

Keywords: CT images, image validation, random forest, segmentation

Procedia PDF Downloads 313
1493 Bioavailability of Iron in Some Selected Fiji Foods using In vitro Technique

Authors: Poonam Singh, Surendra Prasad, William Aalbersberg

Abstract:

Iron is the most essential trace element in human nutrition. Its deficiency has serious health consequences and is a major public health threat worldwide. The deficiencies commonly reported in the Fiji population are of Fe, Ca, and Zn. It has also been reported that 40% of women in Fiji are iron deficient. Therefore, we have been studying the bioavailability of iron in commonly consumed Fiji foods. To study the bioavailability, it is essential to assess the iron content of the raw foods. This paper reports the iron contents and bioavailability in foods commonly consumed by the multicultural population of Fiji. The food samples (rice, breads, wheat flour, and breakfast cereals) were analyzed by atomic absorption spectrophotometry for total iron and its bioavailability. The white rice had the lowest total iron, 0.10±0.03 mg/100 g, but had a high bioavailability of 160.60±0.03%. The brown rice had a total iron content of 0.20±0.03 mg/100 g but was 85.00±0.03% bioavailable. The white and brown breads showed the highest iron bioavailability, at 428.30±0.11 and 269.35±0.02%, respectively. The Weetabix and the rolled oats had iron contents of 2.89±0.27 and 1.24±0.03 mg/100 g with bioavailability of 14.19±0.04 and 12.10±0.03%, respectively. The most commonly consumed normal wheat flour had 0.65±0.00 mg/100 g iron, while the whole meal and the Roti flours had 2.35±0.20 and 0.62±0.17 mg/100 g iron, showing bioavailability of 55.38±0.05, 16.67±0.08, and 12.90±0.00%, respectively. The low bioavailability of iron in certain foods may be due to the presence of phytates/oxalates, processing/storage conditions, cooking method, or interactions with other minerals present in the food samples.

Keywords: iron, bioavailability, Fiji foods, in vitro technique, human nutrition

Procedia PDF Downloads 529
1492 Functional Gene Expression in Human Cells Using Linear Vectors Derived from Bacteriophage N15 Processing

Authors: Kumaran Narayanan, Pei-Sheng Liew

Abstract:

This paper adapts the bacteriophage N15 protelomerase enzyme to assemble linear chromosomes as vectors for gene expression in human cells. Phage N15 has the unique ability to replicate as a linear plasmid with telomeres in E. coli during the prophage stage of its life cycle. The virus-encoded protelomerase enzyme cuts its circular genome and caps its ends to form hairpin telomeres, resulting in a linear, human-chromosome-like structure in E. coli. In mammalian cells, however, no enzyme with TelN-like activities has been found. In this work, we show for the first time the transfer of the protelomerase from phage into human and mouse cells and demonstrate recapitulation of its activity in these hosts. The function of this enzyme is assayed by demonstrating cleavage of its target DNA, followed by detection of telomere formation based on its resistance to recBCD enzyme digestion. We show that protelomerase expression persists for at least 60 days, which indicates limited silencing of its expression. Next, we show that an intact human β-globin gene delivered on this linear chromosome accurately retains its expression in the human cellular environment for at least 60 hours, demonstrating its stability and potential as a vector. These results demonstrate that the N15 protelomerase is able to function in mammalian cells to cut and heal DNA to create telomeres, which provides a new tool for creating novel structures by DNA resolution in these hosts.

Keywords: chromosome, beta-globin, DNA, gene expression, linear vector

Procedia PDF Downloads 192
1491 Influence of κ-Casein Genotype on Milk Productivity of Latvia Local Dairy Breeds

Authors: S. Petrovska, D. Jonkus, D. Smiltiņa

Abstract:

κ-casein is one of the milk proteins that are very important for milk processing. Genotypes of κ-casein affect milk yield, fat, and protein content. The main factors that affect the milk yield and composition of local Latvian dairy breeds are analyzed in this research. Data were collected from 88 Latvian Brown and 82 Latvian Blue cows in 2015. The frequency of the AA genotype was 0.557 in the Latvian Brown and 0.232 in the Latvian Blue breed. The frequency of the BB genotype was 0.034 in the Latvian Brown and 0.207 in the Latvian Blue breed. The highest milk yield was observed in the Latvian Brown (5131.2 ± 172.01 kg), and significantly higher fat content and fat yield were also observed in the Latvian Brown (p < 0.05). Significant differences between κ-casein genotypes were not found in the Latvian Brown, but the highest milk yield (5057 ± 130.23 kg), protein content (3.42 ± 0.03%), and protein yield (171.9 ± 4.34 kg) were observed with the AB genotype. Significantly higher fat content was observed in the Latvian Blue breed with the BB genotype (4.29 ± 0.17%) compared with the AA genotype (3.42 ± 0.19%). A similar tendency was found in protein content: 3.27 ± 0.16% with the BB genotype and 2.59 ± 0.16% with the AA genotype (p < 0.05). Milk yield increased with increasing parity. We did not observe a major tendency of change in milk fat and protein content according to parity.

Keywords: dairy cows, κ-casein, milk productivity, polymorphism

Procedia PDF Downloads 270