Search results for: data source
26881 Improving the Utilization of Telfairia occidentalis Leaf Meal with Cellulase-Glucanase-Xylanase Combination and Selected Probiotic in Broiler Diets
Authors: Ayodeji Fasuyi
Abstract:
Telfairia occidentalis is a leafy vegetable commonly grown in the tropics for its nutritional benefits. The use of enzymes and probiotics is becoming prominent due to the ban on antibiotics as growth promoters in many parts of the world. It is envisaged that, with enzyme and probiotic additives, fibrous leafy vegetables can be incorporated into poultry feeds as a protein source. However, certain antinutrients were also found in the leaves of Telfairia occidentalis. Four broiler starter and finisher diets were formulated for the two phases of the broiler experiments. A mixture of fiber-degrading enzymes, Roxazyme G2 (a combination of cellulase, glucanase and xylanase), and a probiotic growth promoter (Turbotox) was used in the broiler diets at a 1:1 ratio. The Roxazyme G2/Turbotox mixture was used in diets containing four levels of Telfairia occidentalis leaf meal (TOLM): 0, 10, 20 and 30%. Diet 1 was a standard broiler diet without TOLM, Roxazyme G2 or Turbotox; diets 2, 3 and 4 contained the enzyme and probiotic additives. Mineral elements such as Ca, P, K, Na, Mg, Fe, Mn, Cu and Zn were found in notable quantities, viz. 2.6 g/100 g, 1.2 g/100 g, 6.2 g/100 g, 5.1 g/100 g, 4.7 g/100 g, 5875 ppm, 182 ppm, 136 ppm and 1036 ppm, respectively. Phytin, phytin-P, oxalate, tannin and HCN were also found in ample quantities, viz. 189.2 mg/100 g, 120.1 mg/100 g, 80.7 mg/100 g, 43.1 mg/100 g and 61.2 mg/100 g, respectively. The average weight gain was highest at 46.3 g/bird/day for birds on the 10% TOLM diet, although similar (P > 0.05) to the 46.2 g/bird/day for birds on 20% TOLM. The feed conversion ratio (FCR) of 2.27 was the lowest, and optimum, for birds on 10% TOLM, although similar (P > 0.05) to the 2.29 obtained for birds on 20% TOLM; the FCR was highest at 2.61 for birds on the 30% TOLM diet.
Most carcass characteristics and organ weights were similar (P > 0.05) across the experimental diets, except for kidney, gizzard and intestinal length, whose values were significantly higher (P < 0.05) for birds on the TOLM diets. Nitrogen retention was highest at 72.37 ± 0.10% for birds on the 10% TOLM diet, although similar (P > 0.05) to the 71.54 ± 1.89% obtained for birds on the control diet without TOLM and the enzyme/probiotic mixture. There was evidence of better utilization of TOLM as a plant protein source. The carcass characteristics and organ weights showed evidence of uniform tissue buildup and muscle development, particularly on the diet containing 10% TOLM. There was also better nitrogen utilization in birds on the 10% TOLM diet. Considering the cheap cost of TOLM, it is envisaged that its introduction into poultry feeds as a plant protein source will ultimately reduce the cost of poultry feeds.
Keywords: Telfairia occidentalis leaf meal, enzymes, probiotics, additives
Procedia PDF Downloads 136
26880 Hydrological Challenges and Solutions in the Nashik Region: A Multi Tracer and Geochemistry Approach to Groundwater Management
Authors: Gokul Prasad, Pennan Chinnasamy
Abstract:
The degradation of groundwater resources, attributed to factors such as excessive abstraction and contamination, has emerged as a global concern. This study examines the stable isotopes of water (δ18O and δ2H) in a hard-rock aquifer situated in the Upper Godavari watershed, an agriculturally rich region in India underlain by basalt. The high groundwater draft (> 90%) poses significant risks; comprehending groundwater sources, flow patterns, and their environmental impacts is pivotal for researchers and water managers. The region has faced five droughts in the past 20 years, four of which are categorized as medium. Recharge rates are variable and contribute very little to groundwater. The rainfall pattern shows vast variability, with the region receiving seasonal monsoon rainfall for just four months and minimal rainfall for the rest of the year. This research closely monitored monsoon precipitation inputs and examined spatial and temporal fluctuations in δ18O and δ2H in both groundwater and precipitation. By discerning individual recharge events during monsoons, it became possible to identify periods when evaporation led to groundwater quality deterioration, characterized by elevated salinity and stable isotope values in the return flow. The locally derived meteoric water line (LMWL) (δ2H = 6.72 × δ18O + 1.53, r² = 0.6) provided valuable insights into the groundwater system. The leftward shift of the Nashik LMWL relative to the global meteoric water line (GMWL) indicated groundwater evaporation (-33 ‰), supported by spatial variations in electrical conductivity (EC) data. Groundwater in the eastern and northern watershed areas exhibited higher salinity (> 3,000 µS/cm), spanning > 40% of the area, compared to the western and southern regions, owing to geological disparities (alluvium vs. basalt). The findings emphasize meteoric precipitation as the primary groundwater source in the watershed.
However, spatial variations in isotope values and chemical constituents indicate other contributing factors, including evaporation, groundwater source type, and natural or anthropogenic (specifically agricultural and industrial) contaminants. Therefore, the study recommends focused hydrogeochemistry and isotope analysis in areas with strong agricultural and industrial influence for the development of holistic groundwater management plans that protect both the quantity and quality of the groundwater aquifers.
Keywords: groundwater quality, stable isotopes, salinity, groundwater management, hard-rock aquifer
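The local meteoric water line quoted above (δ2H = 6.72 × δ18O + 1.53) is an ordinary least-squares fit of δ2H against δ18O from paired samples. A minimal sketch of how such a line is derived; the isotope values below are illustrative, not the study's Nashik data:

```python
import numpy as np

# Hypothetical paired isotope measurements (per mil, VSMOW);
# illustrative values only, not the study's data set.
d18O = np.array([-6.5, -5.8, -5.1, -4.4, -3.6, -2.9, -2.1])
d2H = np.array([-42.0, -37.5, -32.0, -28.5, -22.0, -18.5, -12.0])

# Ordinary least-squares fit of the local meteoric water line:
# d2H = slope * d18O + intercept
slope, intercept = np.polyfit(d18O, d2H, 1)
r2 = np.corrcoef(d18O, d2H)[0, 1] ** 2

# A slope well below the GMWL slope of 8 signals evaporative
# enrichment of the water before or during recharge.
print(f"LMWL: d2H = {slope:.2f} * d18O + {intercept:.2f} (r2 = {r2:.2f})")
```

Comparing the fitted slope against the GMWL slope of 8 is the standard diagnostic for the evaporation signal the abstract describes.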
Procedia PDF Downloads 47
26879 Analysis of Expression Data Using Unsupervised Techniques
Authors: M. A. I Perera, C. R. Wijesinghe, A. R. Weerasinghe
Abstract:
This study was conducted to review and identify the unsupervised techniques that can be employed to analyze gene expression data in order to identify better subtypes of tumors. Identifying subtypes of cancer helps improve the efficacy and reduce the toxicity of treatments by providing clues for finding targeted therapeutics. The process of gene expression data analysis is described in three steps: preprocessing, clustering, and cluster validation. Feature selection is important since genomic data are high-dimensional, with a large number of features compared to samples. Hierarchical clustering and K-Means are often used in the analysis of gene expression data. Several cluster validation techniques are used in validating the clusters. Heatmaps are an effective external validation method that allows comparing the identified classes with clinical variables and visually analyzing the classes.
Keywords: cancer subtypes, gene expression data analysis, clustering, cluster validation
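The clustering step can be sketched with a toy K-Means on a synthetic samples × genes matrix; the data, the two-subtype structure, and the simple seeding below are all illustrative assumptions, not the study's pipeline:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy expression matrix (20 samples x 50 genes): two simulated tumor
# subtypes separated by a mean shift in the first 10 genes.
group_a = rng.normal(0.0, 1.0, size=(10, 50))
group_b = rng.normal(0.0, 1.0, size=(10, 50))
group_b[:, :10] += 4.0  # subtype-specific expression signature
X = np.vstack([group_a, group_b])

def kmeans2(X, iters=20):
    """Toy 2-cluster k-means; seeds one center from each end of X."""
    centers = np.stack([X[0], X[-1]])
    for _ in range(iters):
        # assign each sample to its nearest center (squared distance)
        d = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(axis=2)
        labels = d.argmin(axis=1)
        # recompute each center as the mean of its cluster
        centers = np.stack([X[labels == j].mean(axis=0) for j in (0, 1)])
    return labels

labels = kmeans2(X)
print(labels)  # the two simulated subtypes separate into two clusters
```

In practice one would use a library implementation (e.g. hierarchical clustering or K-Means with multiple restarts) and then validate the clusters against clinical variables, as the abstract describes.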
Procedia PDF Downloads 149
26878 Ecotourism Development as an Alternative Livelihood for Guassa Community, Ethiopia
Authors: Abraham Kidane
Abstract:
The study aims at assessing the prospects and challenges of community-based ecotourism development in and around the Guassa Community Conservation Area (GCCA) for the establishment of alternative sources of livelihood for local people and the conservation of natural resources. The Guassa area and its surroundings are endowed with natural, cultural, and religious tourism resources. The study is descriptive in design and uses both qualitative and quantitative research methods. Interviews and questionnaires were used as instruments for data gathering. Interviews were undertaken with government officials, NGO officials, experts, and three local community representatives. Three kebeles of Guassa were chosen using purposive sampling because they are immediate neighbors of the GCCA; 150 questionnaires were administered in proportion to the number of households in each kebele. The perspectives of the MoCT, EWCA, and some tour operation agencies were gathered through questionnaires; five questionnaires were administered to each, and all the returns were used in the analysis. Frequency, percentage, mean, one-way ANOVA, and independent t-tests were used to analyze the quantitative data. The findings revealed that food insecurity is commonplace in the study area. The local people's reliance on the conservation area's resources has been increasing, and the area is also dwindling in size over time. On the other hand, the local people's level of awareness about Community-Based Ecotourism (CBET) is low. In addition, local capacity in relation to conservation and CBET development is low, and the training offered by the government and NGOs is inadequate. In general, tourism is not yet considered an alternative source of income and a means of conserving natural resources.
In addition to the low level of awareness about CBET and low capacity, poor infrastructure and poor tourism facilities were also identified as challenges for CBET development in the study area.
Keywords: ecotourism, CBET, alternative livelihood, conservation
Procedia PDF Downloads 99
26877 Learning Analytics in a HiFlex Learning Environment
Authors: Matthew Montebello
Abstract:
Student engagement within a virtual learning environment generates masses of data points that can significantly contribute to the learning analytics that lead to decision support. Ideally, similar data is collected during student interaction with a physical learning space, and as a consequence, data is present at a large scale, even in relatively small classes. In this paper, we report on such an occurrence during classes held in a HiFlex modality and investigate the advantages of adopting such a methodology. We plan to take full advantage of the learner-generated data in an attempt to further enhance the effectiveness of the adopted learning environment. This could shed crucial light on the operating modalities that higher education institutions around the world will switch to in a post-COVID era.
Keywords: HiFlex, big data in higher education, learning analytics, virtual learning environment
Procedia PDF Downloads 201
26876 Li-Fi Technology: Data Transmission through Visible Light
Authors: Shahzad Hassan, Kamran Saeed
Abstract:
People are always in search of Wi-Fi hotspots because the Internet is in major demand nowadays. But like all other technologies, there is still room for improvement in Wi-Fi technology with regard to the speed and quality of connectivity. In order to address these aspects, Harald Haas, a professor at the University of Edinburgh, proposed what we know as Li-Fi (Light Fidelity). Li-Fi is a new technology in the field of wireless communication that provides connectivity within a network environment. It is a two-way mode of wireless communication using light. Basically, the data is transmitted through light-emitting diodes, which can vary the intensity of light very fast, even faster than the blink of an eye. From the research and experiments conducted so far, it can be said that Li-Fi can increase the speed and reliability of data transfer. This paper pays particular attention to the assessment of the performance of this technology. In other words, it is a 5G technology which uses LEDs as the medium of data transfer. For coverage within buildings, Wi-Fi is good, but Li-Fi can be considered favorable in situations where large amounts of data are to be transferred in areas with electromagnetic interference. It brings many data-related qualities, such as efficiency and security as well as large throughputs, to the table of wireless communication. All in all, it can be said that Li-Fi is going to be a future phenomenon where the presence of light will mean access to the Internet as well as speedy data transfer.
Keywords: communication, LED, Li-Fi, Wi-Fi
Procedia PDF Downloads 347
26875 VIAN-DH: Computational Multimodal Conversation Analysis Software and Infrastructure
Authors: Teodora Vukovic, Christoph Hottiger, Noah Bubenhofer
Abstract:
The development of VIAN-DH aims at bridging two linguistic approaches: conversation analysis/interactional linguistics (IL), so far a dominantly qualitative field, and computational/corpus linguistics with its quantitative and automated methods. Contemporary IL investigates the systematic organization of conversations and interactions composed of speech, gaze, gestures, and body positioning, among others. This highly integrated multimodal behaviour is analysed on the basis of video data, with the aim of uncovering so-called "multimodal gestalts": patterns of linguistic and embodied conduct that recur in specific sequential positions and are employed for specific purposes. Multimodal analyses (and other disciplines using video) have so far depended on time- and resource-intensive manual transcription of each component of the video material. Automating these tasks requires advanced programming skills, which are often not within the scope of IL. Moreover, the use of different tools makes the integration and analysis of different formats challenging. Consequently, IL research often deals with relatively small samples of annotated data, which are suitable for qualitative analysis but not sufficient for making generalized empirical claims derived quantitatively. VIAN-DH aims to create a workspace where the many annotation layers required for the multimodal analysis of videos can be created, processed, and correlated in one platform. VIAN-DH will provide a graphical interface that operates state-of-the-art tools for automating parts of the data processing. The integration of tools that already exist in computational linguistics and computer vision facilitates data processing for researchers lacking programming skills, speeds up the overall research process, and enables the processing of large amounts of data.
The main features to be introduced are automatic speech recognition for the transcription of language, automatic image recognition for the extraction of gestures and other visual cues, and grammatical annotation for adding morphological and syntactic information to the verbal content. In the ongoing instance of VIAN-DH, we focus on gesture extraction (pointing gestures, in particular), making use of existing models created for sign language and adapting them for this specific purpose. In order to view and search the data, VIAN-DH will provide a unified format and enable the import of the main existing formats of annotated video data and the export to other formats used in the field, while integrating different data source formats so that they can be combined in research. VIAN-DH will adapt querying methods from corpus linguistics to enable parallel search across many annotation levels, combining token-level and chronological search for various types of data. VIAN-DH strives to bring crucial and potentially revolutionary innovation to the field of IL (and can also extend to other fields using video materials). It will allow the automatic processing of large amounts of data and the implementation of quantitative analyses, combining them with the qualitative approach. It will facilitate the investigation of correlations between linguistic patterns (lexical or grammatical) and conversational aspects (turn-taking or gestures). Users will be able to automatically transcribe and annotate visual, spoken, and grammatical information from videos, correlate those different levels, and perform queries and analyses.
Keywords: multimodal analysis, corpus linguistics, computational linguistics, image recognition, speech recognition
Procedia PDF Downloads 108
26874 An Analysis of Humanitarian Data Management of Polish Non-Governmental Organizations in Ukraine Since February 2022 and Its Relevance for the Ukrainian Humanitarian Data Ecosystem
Authors: Renata Kurpiewska-Korbut
Abstract:
On the assumption that the use and sharing of data generated in humanitarian action constitute a core function of humanitarian organizations, the paper analyzes the position of the largest Polish humanitarian non-governmental organizations in the humanitarian data ecosystem in Ukraine and their approach to non-personal and personal data management since February 2022. Expert interviews and document analysis of non-profit organizations providing a direct response in the Ukrainian crisis context, i.e., Polish Humanitarian Action, Caritas, the Polish Medical Mission, the Polish Red Cross, and the Polish Center for International Aid, together with the theoretical perspective of contingency theory – whose central point is that the context, or a specific set of conditions, determines the mode of behavior and the choice of methods of action – help to examine the significance of data complexity and of an adaptive approach to data management by relief organizations in the humanitarian supply chain network. The purpose of this study is to determine how well-established and accurate internal procedures and good practices for using and sharing data (including safeguards for sensitive data) are implemented by the surveyed organizations, which have comparable human and technological capabilities, and adjusted to Ukrainian humanitarian settings and data infrastructure. The study also poses the fundamental question of whether this crisis experience will have a determining effect on their future performance.
The obtained findings indicate that Polish humanitarian organizations in Ukraine, which have their own unique codes of conduct and effective managerial data practices determined by contingencies, have limited influence on improving the situational awareness of other assistance providers in the data ecosystem, despite their attempts to undertake interagency work in the area of data sharing.
Keywords: humanitarian data ecosystem, humanitarian data management, Polish NGOs, Ukraine
Procedia PDF Downloads 92
26873 An Approach for Estimation in Hierarchical Clustered Data Applicable to Rare Diseases
Authors: Daniel C. Bonzo
Abstract:
Practical considerations lead to the use of units of analysis within subjects, e.g., bleeding episodes or treatment-related adverse events, in rare disease settings. This is coupled with data augmentation techniques such as extrapolation to enlarge the subject base. In general, one can think of extrapolation of data as extending information and conclusions from one estimand to another. This approach induces hierarchical clustered data with varying cluster sizes. Extrapolation of clinical trial data is increasingly being accepted by regulatory agencies as a means of generating data in diverse situations during the drug development process. Under certain circumstances, data can be extrapolated to a different population, a different but related indication, or a different but similar product. We consider here the problem of estimation (point and interval) using a mixed-models approach under extrapolation. It is proposed that estimators (point and interval) be constructed using weighting schemes for the clusters, e.g., equally weighted and with weights proportional to cluster size. Simulated data generated under varying scenarios are then used to evaluate the performance of this approach. In conclusion, the evaluation showed that the approach is a useful means of improving statistical inference in rare disease settings and thus aids not only signal detection but risk-benefit evaluation as well.
Keywords: clustered data, estimand, extrapolation, mixed model
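The two weighting schemes mentioned above can be contrasted with a small simulation; the cluster sizes and values below are illustrative, not the paper's scenarios:

```python
import numpy as np

rng = np.random.default_rng(42)

# Simulated hierarchical data: subjects (clusters) with varying numbers
# of within-subject events, e.g. bleeding episodes; values illustrative.
cluster_sizes = [2, 3, 5, 8, 12]
clusters = [rng.normal(10.0, 2.0, size=n) for n in cluster_sizes]

cluster_means = np.array([c.mean() for c in clusters])
sizes = np.array(cluster_sizes)

# Equal weights: every subject contributes the same regardless of size.
est_equal = cluster_means.mean()

# Size-proportional weights: larger clusters dominate; this is
# algebraically the same as pooling all observations.
est_prop = np.average(cluster_means, weights=sizes)
pooled = np.concatenate(clusters).mean()

print(f"equal-weight estimate:       {est_equal:.3f}")
print(f"size-weighted estimate:      {est_prop:.3f}")
print(f"pooled-observation estimate: {pooled:.3f}")
```

The gap between the two point estimates grows with the imbalance in cluster sizes, which is exactly why the choice of weights matters when cluster sizes vary.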
Procedia PDF Downloads 136
26872 Knowledge Spillovers from Patent Citations: Evidence from Swiss Manufacturing Industry
Authors: Racha Khairallah, Lamia Ben Hamida
Abstract:
Our paper examines how Swiss manufacturing firms manage to learn from patent citations to improve their innovation performance. We argue that assessing these effects requires a detailed analysis of spillovers according to the source of knowledge, with respect to formal and informal patent citations made in European and international searches, the horizontal and vertical mechanisms by which knowledge spillovers take place, and the technological characteristics that enable innovative firms to absorb external knowledge and integrate it into their existing innovation processes. We use OECD data and find evidence that knowledge spillovers occur only from horizontal and backward linkages. The importance of these effects depends on the type of citation: references to non-patent literature (informal citations made in European and international searches) have a greater impact. In addition, only firms with high technological capacities benefit from knowledge spillovers from formal and informal citations. Low-technology firms fail to catch up and to efficiently learn external knowledge from patent citations.
Keywords: innovation performance, patent citation, absorptive capacity, knowledge spillover mechanisms
Procedia PDF Downloads 109
26871 Design of a Service-Enabled Dependable Integration Environment
Authors: Fuyang Peng, Donghong Li
Abstract:
The aim of information systems integration is to bring all data sources, applications, and business flows into the new environment so that unwanted redundancies are reduced and bottlenecks and mismatches are eliminated. Two issues have to be dealt with to meet such requirements: the software architecture that supports resource integration, and the adaptor development tool that helps with the integration and migration of legacy applications. In this paper, a service-enabled dependable integration environment (SDIE) is presented, which has two key components: a dependable service integration platform and a legacy application integration tool. For the dependable service integration platform, the service integration bus, the service management framework, the dependable engine for service composition, and the service registry and discovery components are described. For the legacy application integration tool, its basic organization, functionalities, and the dependability measures taken are presented. Owing to its service-oriented integration model, light-weight extensible container, service-component-combination-oriented p-lattice structure, and other features, SDIE has advantages in openness, flexibility, performance-price ratio, and feature support over commercial products, and is better than most open-source integration software in functionality, performance, and dependability support.
Keywords: application integration, dependability, legacy, SOA
Procedia PDF Downloads 360
26870 Subsurface Structures Delineation and Tectonic History Investigation Using Gravity, Magnetic and Well Data, in the Cyrenaica Platform, NE Libya
Authors: Mohamed Abdalla saleem
Abstract:
Around one hundred wells were drilled in the Cyrenaica platform, north-east Libya, and almost all of them were dry, although the drilled samples reveal good oil shows and good source rock maturity. Most of the Upper Cretaceous and younger depositional successions crop out in different places, so the structures related to the Cretaceous and the overlying Cenozoic are well understood and mapped, but the subsurface beneath these outcrops still needs more investigation and delineation. This study aims to answer some questions about the tectonic history and the types of structures distributed in the area using gravity, magnetic, and well data. According to information obtained from groups of wells drilled in concessions 31, 35, and 37, the depositional sections become thicker and deeper southward. The topographic map of the study area shows that the area is highly elevated in the north, about 300 m above sea level, while the minimum elevation (16-18 m) occurs near the middle (lat. 30°); south of this latitude, the area rises again to more than 100 m. The third-order residual gravity map, constructed from the Bouguer gravity map, reveals that the area is dominated by a large negative anomaly acting as a sub-basin (245 km x 220 km), which implies a very thick depositional section and a very deep basement. This depocenter is surrounded by four gravity highs (12-37 mGal), indicating a shallow basement and a relatively thinner succession of sediments; the highest gravity values are located along the coastline. The total horizontal gradient (THG) map reveals several structural systems: the first is oriented NE-SW and is crosscut by a second regime extending NW-SE.
This second system is distributed throughout the area, but it is very strong and shallow near the coastline and in the south, while it is relatively deep in the middle depocenter area.
Keywords: Cyrenaica platform, gravity, structures, basement, tectonic history
Procedia PDF Downloads 1
26869 Authorization of Commercial Communication Satellite Grounds for Promoting Turkish Data Relay System
Authors: Celal Dudak, Aslı Utku, Burak Yağlioğlu
Abstract:
Uninterrupted and continuous satellite communication throughout the whole orbit period is becoming more indispensable every day. Data relay systems are developed and built for various high/low data rate information exchanges, like the TDRSS of the USA and the EDRS of Europe. In these missions, a couple of task-dedicated communication satellites exist. In this regard, a data relay system is proposed for Turkey for exchanging low data rate information (i.e., TTC) with Earth-observing LEO satellites, appointing commercial GEO communication satellites all over the world. First, justification of this attempt is given, demonstrating duration enhancements in the link. A discussion of the preference for RF communication instead of laser communication is also given. Then, the preferred communication GEOs – including TURKSAT 4A, already belonging to Turkey – are given, together with the coverage enhancements through STK simulations and the corresponding link budget. A block diagram of the communication system on the LEO satellite is also given.
Keywords: communication, GEO satellite, data relay system, coverage
Procedia PDF Downloads 442
26868 The Development of Encrypted Near Field Communication Data Exchange Format Transmission in an NFC Passive Tag for Checking the Genuine Product
Authors: Tanawat Hongthai, Dusit Thanapatay
Abstract:
This paper presents the development of encrypted near field communication (NFC) data exchange format transmission in an NFC passive tag to assess the feasibility of implementing genuine product authentication. We organize the research on encryption and genuine-product checking into four major categories: concept, infrastructure, development, and applications. The results show that a passive NFC Forum Type 2 tag can be configured to be compatible with the NFC Data Exchange Format (NDEF) and can automatically have its data partially updated when an NFC field is present.
Keywords: near field communication, NFC data exchange format, checking the genuine product, encrypted NFC
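For orientation, NDEF is a small binary framing defined by the NFC Forum. The sketch below encodes a single short Text record; the `GENUINE-0001` identifier is a hypothetical example, and in the scheme described above the text would be encrypted before framing:

```python
def ndef_text_record(text: str, lang: str = "en") -> bytes:
    """Encode one short NDEF Text record (NFC Forum well-known type 'T')."""
    lang_b = lang.encode("ascii")
    text_b = text.encode("utf-8")
    # Payload: status byte (UTF-8 flag clear, language-code length),
    # then the language code, then the text itself.
    payload = bytes([len(lang_b)]) + lang_b + text_b
    assert len(payload) < 256, "short-record (SR) form only"
    header = 0xD1  # MB=1, ME=1, SR=1, TNF=0x01 (NFC Forum well-known type)
    return bytes([header, 0x01, len(payload)]) + b"T" + payload

# Hypothetical product identifier written to the Type 2 tag.
rec = ndef_text_record("GENUINE-0001")
print(rec.hex())
```

A reader that understands NDEF parses the header flags and lengths to recover the payload; the authentication logic then operates on the (decrypted) text.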
Procedia PDF Downloads 280
26867 Examining Whether Reflection Activities Help and Encourage Students' Writing and Critical Thinking Skills within the Law Faculty, 3rd-Year Students
Authors: Motlatjo Ntatamala, Natasha Ravyse, Michael Laubsher
Abstract:
As much as students are assessed through reflective activities, it is important to examine whether those very activities really help shape both their writing and critical thinking skills. The skills students acquire from the reflective activities will not only be beneficial for the immediate moment; they will also carry over to their 4th year, when they write a mini-dissertation, and, in future, to postgraduate studies for those who wish to pursue them. Thus, the only way to source reliable and raw feedback on whether students think the reflective activities help their writing and critical thinking skills is to get a direct student perspective by analysing their submitted reflective activities. Previous studies suggest that writing a research proposal develops critical thinking as a skill in a holistic manner. However, no research has been conducted to investigate the impact of critical thinking on legal writing skills in the South African setting. This study seeks to examine the effectiveness of the reflective activities in 3rd-year students' writing and critical thinking. The proposed paper aims to examine the effectiveness of the reflection activities as encouragement and motivation for both their writing and thinking skills. The paper will use students' activities as a means of data collection, and the activities will then be analysed.
Keywords: reflection activities, writing skills, critical thinking skills, reflective thinking
Procedia PDF Downloads 81
26866 Data Hiding by Vector Quantization in Color Image
Authors: Yung Gi Wu
Abstract:
With the growth of computers and networks, digital data can be spread anywhere in the world quickly. In addition, digital data can be copied or tampered with easily, so security has become an important topic in the protection of digital data. A digital watermark is a method to protect the ownership of digital data, although embedding the watermark inevitably affects image quality. In this paper, vector quantization (VQ) is used to embed the watermark into the image to achieve data hiding. This kind of watermarking is invisible, meaning that users will not notice the embedded watermark even though the embedded image differs slightly from the original. Meanwhile, VQ carries a heavy computational burden, so we adopt a fast VQ encoding scheme using partial distortion search (PDS) and a mean approximation scheme to speed up the data hiding process. The watermarks we hide in the image can be gray-level, bi-level, or color images; text can also be embedded as a watermark. In order to test the robustness of the system, we used Photoshop to sharpen, crop, and alter the image and checked whether the extracted watermark was still recognizable. Experimental results demonstrate that the proposed system can resist the above three kinds of tampering in general cases.
Keywords: data hiding, vector quantization, watermark, color image
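The partial distortion search speed-up mentioned above rejects a codeword as soon as its accumulated squared error exceeds the best distortion found so far. A toy sketch of PDS-based VQ encoding; the codebook and input vector are illustrative, not the paper's trained codebook:

```python
import numpy as np

def vq_encode_pds(vec, codebook):
    """Return the index of the nearest codeword, with early rejection."""
    best_idx, best_dist = 0, float("inf")
    for i, cw in enumerate(codebook):
        dist = 0.0
        for x, c in zip(vec, cw):
            dist += (x - c) ** 2
            if dist >= best_dist:  # partial sum already too large: reject
                break
        else:  # loop finished: this codeword is the new best
            best_idx, best_dist = i, dist
    return best_idx

# Toy codebook of four 4-dimensional codewords (illustrative values).
codebook = np.array([[0.0, 0.0, 0.0, 0.0],
                     [4.0, 4.0, 4.0, 4.0],
                     [8.0, 8.0, 8.0, 8.0],
                     [12.0, 12.0, 12.0, 12.0]])

idx = vq_encode_pds(np.array([7.0, 8.5, 7.5, 8.0]), codebook)
print(idx)  # index of the codeword nearest the input block
```

Because most candidate codewords are rejected after only a few dimensions, PDS gives the same encoding result as a full search at a fraction of the cost, which is the property the data hiding scheme exploits.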
Procedia PDF Downloads 364
26865 Digital Preservation in Nigerian University Libraries: A Comparison between the University of Nigeria, Nsukka and Ahmadu Bello University, Zaria
Authors: Suleiman Musa, Shuaibu Sidi Safiyanu
Abstract:
This study examined digital preservation in Nigerian university libraries, comparing the University of Nigeria, Nsukka (UNN) and Ahmadu Bello University, Zaria (ABU, Zaria). The study utilized primary data obtained from the librarians of the two selected institutions. Findings revealed varying results in terms of skills acquired by librarians before and after digitization at the two institutions. The study reports that journal publications, textbooks, CD-ROMs, conference papers and proceedings, theses, dissertations, and seminar papers are among the information resources available for digitization. The study further documents that copyright issues, power failure, and the unavailability of needed materials are among the challenges facing the digitization of the institutions' libraries. On the basis of the findings, the study concluded that digitization of the library enhances efficiency in the organization and retrieval of information services. The study therefore recommended that software be upgraded with backups, that librarians be trained in the digital process, that antivirus software be installed, and that technical collaboration between the library and MIS be enhanced.
Keywords: digitalization, preservation, libraries, comparison
Procedia PDF Downloads 339
26864 Optimization of Extraction Conditions and Characteristics of Scale Collagen from Sardine: Sardina pilchardus
Authors: F. Bellali, M. Kharroubi, M. Loutfi, N.Bourhim
Abstract:
In Morocco, the fish processing industry is an important source of income and generates a large volume of byproducts, including skins, bones, heads, guts and scales. These underutilized resources, particularly scales, contain large amounts of protein and calcium. Scales from Sardina pilchardus resulting from processing operations have the potential to be used as raw material for collagen production. Given this strong expectation of the regional fish industry, upgrading sardine scales is well justified. In addition, political and societal demands for sustainability and environment-friendly industrial production systems, coupled with the depletion of fish resources, drive this trend forward. Fish scales used as a source of collagen therefore have a wide range of applications in the food, cosmetic and biomedical industries. The main aim of this study is to isolate and characterize acid-solubilized collagen from sardine fish scales, Sardina pilchardus. Experimental design methodology was adopted to optimize collagen extraction. The first stage of this work investigates the optimal conditions for sardine scale deproteinization using response surface methodology (RSM). The second part focuses on demineralization with HCl solution or EDTA, and the last establishes the optimum conditions for isolating collagen from fish scales by solvent extraction. The basic principle of RSM is to determine model equations that describe the interrelations between the independent and dependent variables.
Keywords: Sardina pilchardus, scales, valorization, collagen extraction, response surface methodology
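The RSM fitting step can be illustrated with a minimal least-squares sketch. A standard two-factor second-order response surface is assumed here purely for illustration (the study's actual factors, levels and design are not given in the abstract).

```python
import numpy as np

def fit_quadratic_surface(x1, x2, y):
    """Fit the second-order RSM model
        y = b0 + b1*x1 + b2*x2 + b11*x1^2 + b22*x2^2 + b12*x1*x2
    by ordinary least squares and return the coefficient vector."""
    X = np.column_stack([np.ones_like(x1), x1, x2, x1**2, x2**2, x1 * x2])
    coef, *_ = np.linalg.lstsq(X, y, rcond=None)
    return coef
```

With the coefficients in hand, the stationary point of the fitted surface gives the predicted optimum of the response, which is the usual end goal of an RSM study.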
Procedia PDF Downloads 417
26863 Comparison of Agree Method and Shortest Path Method for Determining the Flow Direction in Basin Morphometric Analysis: Case Study of Lower Tapi Basin, Western India
Authors: Jaypalsinh Parmar, Pintu Nakrani, Bhaumik Shah
Abstract:
A Digital Elevation Model (DEM) is elevation data on a virtual grid over the ground. DEMs are used in GIS applications such as hydrological modelling, flood forecasting, morphometric analysis and surveying. For morphometric analysis, the stream flow network plays a very important role, but a DEM may lack accuracy and fail to match field data closely enough for accurate morphometric results. The present study compares the Agree method with the conventional Shortest Path method for deriving morphometric parameters in the flat region of the Lower Tapi Basin, located in western India. Open-source SRTM data (Shuttle Radar Topography Mission, 1 arc-second resolution) and toposheets issued by the Survey of India (SOI) were used to determine linear aspects such as stream order, number of streams, stream length, bifurcation ratio, mean stream length, mean bifurcation ratio, stream length ratio, length of overland flow and constant of channel maintenance; areal aspects such as drainage density, stream frequency, drainage texture, form factor, circularity ratio, elongation ratio and shape factor; and relief aspects such as relief ratio, gradient ratio and basin relief for 53 catchments of the Lower Tapi Basin. The stream network was digitized from the available toposheets, and the Agree DEM was created from the SRTM data and the digitized stream network. The results obtained demonstrate a comparison between the two methods in flat areas.
Keywords: agree method, morphometric analysis, lower Tapi basin, shortest path method
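Several of the linear and areal aspects listed above follow directly from Horton's laws. A minimal sketch, with hypothetical helper names, of two of the simplest: the mean bifurcation ratio (the ratio of stream counts between successive Strahler orders) and drainage density (total stream length per unit basin area).

```python
def mean_bifurcation_ratio(stream_counts):
    """stream_counts[u] = number of streams of Strahler order u+1.
    The bifurcation ratio between successive orders is N_u / N_{u+1};
    return the mean over all order pairs."""
    ratios = [stream_counts[i] / stream_counts[i + 1]
              for i in range(len(stream_counts) - 1)]
    return sum(ratios) / len(ratios)

def drainage_density(total_stream_length_km, basin_area_km2):
    """Horton's drainage density: total stream length / basin area (km/km^2)."""
    return total_stream_length_km / basin_area_km2
```

For example, a catchment with 40 first-order, 10 second-order, 2 third-order and 1 fourth-order streams has bifurcation ratios of 4, 5 and 2 between successive orders.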
Procedia PDF Downloads 239
26862 Anomaly Detection in a Data Center with a Reconstruction Method Using a Multi-Autoencoders Model
Authors: Victor Breux, Jérôme Boutet, Alain Goret, Viviane Cattin
Abstract:
Early detection of anomalies in data centers is important to reduce downtimes and the costs of periodic maintenance. However, there is little research on this topic and even less on the fusion of sensor data for the detection of abnormal events. The goal of this paper is to propose a method for anomaly detection in data centers by combining sensor data (temperature, humidity, power) and deep learning models. The model described in the paper uses one autoencoder per sensor to reconstruct the inputs. The autoencoders contain Long Short-Term Memory (LSTM) layers and are trained using the normal samples of the relevant sensors selected by correlation analysis. The difference signal between the input and its reconstruction is then used to classify the samples using feature extraction and a random forest classifier. The data measured by the sensors of a data center between January 2019 and May 2020 are used to train the model, while the data between June 2020 and May 2021 are used to assess it. The model's performance is assessed a posteriori through the F1-score by comparing detected anomalies with the data center's history. The proposed model outperforms the state-of-the-art reconstruction method, which uses a single autoencoder taking multivariate sequences and detects an anomaly with a threshold on the reconstruction error, with an F1-score of 83.60% compared to 24.16%.
Keywords: anomaly detection, autoencoder, data centers, deep learning
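The reconstruction-based detection principle can be sketched compactly. The stand-in below replaces the paper's LSTM autoencoders with a linear (PCA) autoencoder so the example stays self-contained; the helper names, the choice of squared error, and the single-sensor framing are assumptions, not the paper's implementation.

```python
import numpy as np

def fit_linear_autoencoder(X_normal, k=1):
    """Fit a linear 'autoencoder' (PCA with k components) on normal samples.
    Encoding projects onto the top-k principal directions; decoding projects
    back. This is a simplified stand-in for an LSTM autoencoder."""
    mu = X_normal.mean(axis=0)
    _, _, Vt = np.linalg.svd(X_normal - mu, full_matrices=False)
    W = Vt[:k]                                # k x d projection weights
    return mu, W

def reconstruction_error(X, mu, W):
    """Per-sample squared reconstruction error: the anomaly signal that a
    downstream classifier (or a simple threshold) would consume."""
    X_hat = mu + (X - mu) @ W.T @ W           # encode then decode
    return ((X - X_hat) ** 2).sum(axis=1)
```

Samples resembling the normal training data reconstruct almost perfectly, while samples off the learned subspace produce large residuals, which is exactly the signal the paper feeds to its random forest classifier.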
Procedia PDF Downloads 194
26861 The Jordanian Traditional Dress of Women as a Form of Cultural Heritage
Authors: Sarah Alkhateeb
Abstract:
This research explores the Jordanian traditional dress of women as a form of cultural heritage. The dress of a Jordanian woman expresses her social and cultural functions and reflects the local environment in its social and cultural frameworks, the natural determinants of climate and terrain, and the wearer's social status and position on the social ladder. The traditional dress of Jordanian women is therefore distinguished by its abundance and diversity. Few studies have been conducted on it, and this lack needs highlighting; the characteristics of the dress should be featured and documented as part of the cultural heritage. The main aim of this research is to contribute to, or develop, a conservation strategy to save this part of the cultural heritage from loss. The research uses a qualitative, ethnographic approach. Data will be gathered from a primary source, a single focus group discussion with the TIRAZ museum team; the Jordanian traditional dress will be explored across three regions of Jordan (the North, Middle and South), investigating regional differences and focusing on the details of individual garments.
Keywords: Jordanian traditional dress, cultural heritage, tiraz museum, ethnographic method
Procedia PDF Downloads 166
26860 Transferring Cultural Meanings: A Case of Translation Classroom
Authors: Ramune Kasperaviciene, Jurgita Motiejuniene, Dalia Venckiene
Abstract:
Familiarising students with strategies for transferring cultural meanings (intertextual units, culture-specific idioms, culture-specific items, etc.) should be part of a comprehensive translator training programme. The present paper focuses on strategies for transferring such meanings into other languages and explores possibilities for introducing these methods and practices to translation students. The authors (university translation teachers) analyse the means of transferring cultural meanings from English into Lithuanian in a specific travel book, attribute these means to theoretically grounded strategies, and calculate the frequency with which specific strategies are adopted; translation students are familiarised with concepts and methods related to transferring cultural meanings and asked to put their theoretical knowledge into practice, i.e. to interpret and translate certain culture-specific items from the same source text and to ground their decisions in theory; finally, the strategies employed by the professional translator of the source text (as identified by the authors of this study) are compared with those employed by the students. As a result, both students and teachers gain valuable experience, and new practices of conducting translation classes for a specific purpose evolve. The conclusions highlight the differences and similarities between non-professional and professional choices, summarise the possibilities for introducing methods of transferring cultural meanings to students, and close with specific considerations of the impact of theoretical knowledge and degree of experience on decisions made in the translation process.
Keywords: cultural meanings, culture-specific items, strategies for transferring cultural meanings, translator training
Procedia PDF Downloads 350
26859 Comparative Study of Iran and Turkey Advantages to Attract Foreign Investors
Authors: Alireza Saviz, Sedigheh Zarei
Abstract:
Foreign Direct Investment (FDI) is an integral part of an open and effective international economic system and a major catalyst of development. Developing countries, emerging economies and countries in transition have increasingly come to see FDI as a source of economic modernization, income growth and employment. FDI is an important vehicle for the transfer of technology, contributing relatively more to growth than domestic investment. Exploratory research is conducted here, with data collected from secondary sources such as research papers, journals, websites and reports. The aim of this paper is to generate knowledge about Iran's situation after the lifting of sanctions, assessed against these factors in comparison with Turkey. Although the factors that most influence foreign investors' decisions vary by country, sector, year and investor objective, governments today should pay more attention to human resource education, marketing, infrastructure and administrative processes in order to attract foreign investors. A proper understanding of these findings will help governments create appropriate policies to encourage more foreign investors.
Keywords: foreign direct investment, host country, competitive advantage, FDI
Procedia PDF Downloads 487
26858 The Use of Fractional Brownian Motion in the Generation of Bed Topography for Bodies of Water Coupled with the Lattice Boltzmann Method
Authors: Elysia Barker, Jian Guo Zhou, Ling Qian, Steve Decent
Abstract:
A method of modelling the topography used in riverbed simulation is proposed in this paper, which removes the need for data points and measurements of the physical terrain. While complex scans of the contours of a surface can be achieved with other methods, doing so requires specialised tools; the proposed method overcomes this by using fractional Brownian motion (FBM) as a basis to estimate the real surface within a 15% margin of error while optimising algorithmic efficiency. This removes the need for complex, expensive equipment and reduces the resources spent modelling bed topography. The method also accounts for the change in topography over time due to erosion, sediment transport and other external factors by updating its parameters and generating a new bed. The lattice Boltzmann method (LBM) is used to simulate both stationary and steady flow cases in a side-by-side comparison over the bed topography generated by the proposed method and over a test case taken from an external source. If successful, the method will be incorporated into the LBM program used in the testing phase, allowing automatic generation of topography for a given situation in future research and removing the need for bed data to be specified.
Keywords: bed topography, FBM, LBM, shallow water, simulations
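One common way to generate an FBM-like profile, which may differ from the authors' exact algorithm, is random midpoint displacement: each refinement level halves the segments and perturbs the midpoints with Gaussian noise whose scale shrinks by 2^(-H) for Hurst exponent H, which controls the roughness of the bed. A hedged 1-D sketch, with illustrative parameter values:

```python
import random

def fbm_profile(n_levels, hurst=0.7, init_scale=1.0, seed=42):
    """Generate a 1-D FBM-like bed profile by random midpoint displacement.
    Returns 2**n_levels + 1 elevations; a higher Hurst exponent gives a
    smoother bed, a lower one a rougher bed."""
    rng = random.Random(seed)          # seeded for reproducible beds
    pts = [0.0, 0.0]                   # flat endpoints to start
    scale = init_scale
    for _ in range(n_levels):
        refined = []
        for a, b in zip(pts, pts[1:]):
            refined.append(a)
            refined.append((a + b) / 2 + rng.gauss(0.0, scale))
        refined.append(pts[-1])
        pts = refined
        scale *= 2 ** (-hurst)         # displacement shrinks each level
    return pts
```

Regenerating with updated parameters (e.g. a lower Hurst exponent after an erosion event) yields a new bed, matching the paper's idea of evolving the topography over time.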
Procedia PDF Downloads 98
26857 Transforming Data into Knowledge: Mathematical and Statistical Innovations in Data Analytics
Authors: Zahid Ullah, Atlas Khan
Abstract:
The rapid growth of data in various domains has created a pressing need for effective methods to transform this data into meaningful knowledge. In this era of big data, mathematical and statistical innovations play a crucial role in unlocking insights and facilitating informed decision-making in data analytics. This abstract aims to explore the transformative potential of these innovations and their impact on converting raw data into actionable knowledge. Drawing upon a comprehensive review of existing literature, this research investigates the cutting-edge mathematical and statistical techniques that enable the conversion of data into knowledge. By evaluating their underlying principles, strengths, and limitations, we aim to identify the most promising innovations in data analytics. To demonstrate the practical applications of these innovations, real-world datasets will be utilized through case studies or simulations. This empirical approach will showcase how mathematical and statistical innovations can extract patterns, trends, and insights from complex data, enabling evidence-based decision-making across diverse domains. Furthermore, a comparative analysis will be conducted to assess the performance, scalability, interpretability, and adaptability of different innovations. By benchmarking against established techniques, we aim to validate the effectiveness and superiority of the proposed mathematical and statistical innovations in data analytics. Ethical considerations surrounding data analytics, such as privacy, security, bias, and fairness, will be addressed throughout the research. Guidelines and best practices will be developed to ensure the responsible and ethical use of mathematical and statistical innovations in data analytics. 
The expected contributions of this research include advancements in mathematical and statistical sciences, improved data analysis techniques, enhanced decision-making processes, and practical implications for industries and policymakers. The outcomes will guide the adoption and implementation of mathematical and statistical innovations, empowering stakeholders to transform data into actionable knowledge and drive meaningful outcomes.
Keywords: data analytics, mathematical innovations, knowledge extraction, decision-making
Procedia PDF Downloads 75
26856 FCNN-MR: A Parallel Instance Selection Method Based on Fast Condensed Nearest Neighbor Rule
Authors: Lu Si, Jie Yu, Shasha Li, Jun Ma, Lei Luo, Qingbo Wu, Yongqi Ma, Zhengji Liu
Abstract:
Instance selection (IS) techniques reduce data size to improve the performance of data mining methods. Recently, to process very large data sets, several proposed methods divide the training set into disjoint subsets and apply IS algorithms independently to each subset. In this paper, we analyze the limitations of these methods and give our viewpoint on how to divide and conquer in the IS procedure. Then, based on the fast condensed nearest neighbor (FCNN) rule, we propose an instance selection method for large data sets built on the MapReduce framework. Besides ensuring prediction accuracy and reduction rate, it has two desirable properties: first, it reduces the workload on the aggregation node; second, and most important, it produces the same result as the sequential version, which other parallel methods cannot achieve. We evaluate the performance of FCNN-MR on one small data set and two large data sets. The experimental results show that it is effective and practical.
Keywords: instance selection, data reduction, MapReduce, kNN
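The condensed nearest neighbor idea underlying FCNN can be sketched with Hart's classic sequential rule: a sample joins the condensed set only if the set built so far misclassifies it under 1-NN, and passes repeat until nothing is added. This is a simplified illustration of the family of methods, not the FCNN-MR MapReduce algorithm itself.

```python
def nn_label(store, p):
    """1-NN label of point p against the condensed set `store`,
    a list of (point, label) pairs, using squared Euclidean distance."""
    nearest = min(store, key=lambda s: sum((a - b) ** 2 for a, b in zip(s[0], p)))
    return nearest[1]

def condensed_nn(X, y):
    """Hart's condensed nearest neighbour rule: keep only the samples the
    current condensed set gets wrong, iterating until a full pass adds none.
    The result classifies the whole training set consistently under 1-NN."""
    store = [(X[0], y[0])]
    changed = True
    while changed:
        changed = False
        for xi, yi in zip(X, y):
            if nn_label(store, xi) != yi:
                store.append((xi, yi))
                changed = True
    return store
```

On well-separated clusters the condensed set shrinks to roughly one prototype per cluster, which is the reduction-rate property the abstract refers to.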
Procedia PDF Downloads 253
26855 A Design Framework for an Open Market Platform of Enriched Card-Based Transactional Data for Big Data Analytics and Open Banking
Authors: Trevor Toy, Josef Langerman
Abstract:
Around a quarter of the world’s data is generated by the financial industry, with global non-cash transactions reaching an estimated 708.5 billion. With Open Banking still a rapidly developing concept within the financial industry, there is an opportunity to create a secure mechanism for connecting its stakeholders to openly, legitimately and consensually share the data required to enable it. Integration and sharing of anonymised transactional data are still operated in silos, centralised among the large corporate entities in the ecosystem that have the resources to do so; smaller fintechs generating data, and businesses looking to consume data, are largely excluded from the process. There is therefore a growing demand for accessible transactional data, both for analytical purposes and to support the rapid global adoption of Open Banking. The following research provides a solution framework that aims to deliver a secure decentralised marketplace for 1.) data providers to list their transactional data, 2.) data consumers to find and access that data, and 3.) data subjects (the individuals making the transactions that generate the data) to manage and sell the data that relates to themselves. The platform also integrates downstream transaction-related data from merchants, enriching the data product available to build a comprehensive view of a data subject’s spending habits. A robust and sustainable data market can be developed by providing a more accessible mechanism for data producers to monetise their data investments and by encouraging data subjects, through the same financial incentives, to share their data. At the centre of the platform is the market mechanism that connects the data providers and their data subjects to the data consumers.
This core component of the platform is developed as a decentralised blockchain contract with a market layer that manages the transaction, user, pricing, payment, tagging, contract, control and lineage features pertaining to user interactions on the platform. One of the platform’s key features is enabling individuals to participate in, and manage, the personal data being generated about them. The framework was developed into a proof of concept on the Ethereum blockchain, where an individual can securely manage access to their own personal data and to their identifiable relationship to the card-based transaction data provided by financial institutions. This gives data consumers access to a complete view of transactional spending behaviour correlated with key demographic information. This platform solution can ultimately support the growth, prosperity and development of economies, businesses, communities and individuals by providing accessible and relevant transactional data for big data analytics and open banking.
Keywords: big data markets, open banking, blockchain, personal data management
Procedia PDF Downloads 73
26854 Experimental Evaluation of Succinct Ternary Tree
Authors: Dmitriy Kuptsov
Abstract:
Tree data structures, such as binary or, more generally, k-ary trees, are essential in computer science, with applications ranging from data search and retrieval to sorting and ranking algorithms. Naive implementations of these data structures can consume prohibitively large amounts of random access memory, limiting their applicability in certain solutions; in such cases, a more advanced representation is essential. In this paper we present the design of a compact version of the ternary tree data structure and report the results of an experimental evaluation using the static dictionary problem, comparing them with the results for binary and regular ternary trees. The evaluation shows that our design, in the best case, consumes up to 12 times less memory (for the dictionary used in our experimental evaluation) than a regular ternary tree and, in certain configurations, shows performance comparable to regular ternary trees. We evaluated the performance of the algorithms on both 32- and 64-bit operating systems.
Keywords: algorithms, data structures, succinct ternary tree, performance evaluation
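A pointer-based ternary search tree is the baseline representation that succinct encodings compress: each node stores one character plus three child pointers, and it is that per-node pointer overhead the paper's compact design eliminates. A sketch of the baseline for the static dictionary problem (this illustrates the structure being compacted, not the paper's succinct encoding):

```python
class TSTNode:
    """One node of a ternary search tree: a character, lower/equal/higher
    children, and a flag marking the end of a stored word."""
    __slots__ = ("ch", "lo", "eq", "hi", "is_word")

    def __init__(self, ch):
        self.ch = ch
        self.lo = self.eq = self.hi = None
        self.is_word = False

def tst_insert(root, word):
    if not word:
        return root
    if root is None:
        root = TSTNode(word[0])
    if word[0] < root.ch:
        root.lo = tst_insert(root.lo, word)
    elif word[0] > root.ch:
        root.hi = tst_insert(root.hi, word)
    elif len(word) > 1:
        root.eq = tst_insert(root.eq, word[1:])
    else:
        root.is_word = True
    return root

def tst_contains(root, word):
    if root is None or not word:
        return False
    if word[0] < root.ch:
        return tst_contains(root.lo, word)
    if word[0] > root.ch:
        return tst_contains(root.hi, word)
    if len(word) == 1:
        return root.is_word
    return tst_contains(root.eq, word[1:])
```

With three pointers per node, memory grows quickly with dictionary size, which is exactly what motivates replacing the pointers with a succinct bit-level encoding of the tree shape.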
Procedia PDF Downloads 160
26853 Kinetics Analysis of Lignocellulose Hydrolysis and Glucose Consumption Using Aspergillus niger in Solid State
Authors: Akida Mulyaningtyas, Wahyudi Budi Sediawan
Abstract:
One decisive stage in bioethanol production from plant biomass is the hydrolysis of lignocellulosic materials into simple sugars such as glucose; the glucose produced is then fermented into ethanol. This stage is commonly carried out biologically using cellulase produced by certain fungi. Glucose, however, is the main source of nutrition for most microorganisms, so cutting cellulose into glucose is in fact the microorganism's way of providing nutrition for itself. So far this phenomenon has received little attention, yet it is necessary to quantify the sugar consumed by the microorganism. In this study, we examined sugar consumption by a microorganism during lignocellulose hydrolysis. We used oil palm empty fruit bunch (OPEFB) as the source of lignocellulose and Aspergillus niger as the cellulase-producing fungus. In Indonesia, OPEFB is a plantation waste that decomposes slowly in nature and causes environmental problems. First, OPEFB was pretreated with 1% NaOH at 170 °C to destroy the lignin that hinders A. niger from accessing cellulose. Hydrolysis was performed by growing A. niger on the pretreated OPEFB in the solid state to minimize the possibility of contamination. The glucose produced was measured every 24 hours for 9 days. We analyzed the kinetics of both reactions, hydrolysis and glucose consumption, simultaneously, assuming that the rate constants of both follow the Monod equation. The results showed that the reaction constant for glucose consumption (μC) was higher than that for cellulose hydrolysis (μH): 11.8 g/L versus 0.62 g/L, respectively. In general, however, the hydrolysis rate exceeds the glucose consumption rate, since the cellulose concentration (the substrate for hydrolysis) is much higher than the glucose concentration (the substrate for consumption).
Keywords: Aspergillus niger, bioethanol, hydrolysis, kinetics
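The coupled kinetics can be illustrated with a simple Euler integration of two Monod-form rates: one producing glucose from cellulose, one consuming glucose. The half-saturation constants, initial concentrations, time units and step size below are illustrative assumptions, not fitted values; only the ordering μC > μH mirrors the reported constants.

```python
def simulate(s0=100.0, g0=0.0, mu_h=0.62, mu_c=11.8, k_h=50.0, k_c=50.0,
             dt=0.01, t_end=24.0):
    """Euler integration of a two-substrate Monod sketch: cellulose S is
    hydrolysed to glucose G, which is consumed in turn. Returns a list of
    (time, cellulose, glucose) tuples."""
    s, g, t = s0, g0, 0.0
    history = []
    while t < t_end:
        r_h = mu_h * s / (k_h + s)    # hydrolysis rate (Monod form)
        r_c = mu_c * g / (k_c + g)    # glucose consumption rate (Monod form)
        s += -r_h * dt                # cellulose consumed by hydrolysis
        g += (r_h - r_c) * dt         # glucose produced minus consumed
        t += dt
        history.append((t, s, g))
    return history
```

Even with μC much larger than μH, the simulated glucose pool stays small: the abundant cellulose drives hydrolysis near its maximum rate while the scarce glucose keeps the consumption term saturation-limited, matching the abstract's closing observation.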
Procedia PDF Downloads 169
26852 Geochemistry and Petrogenesis of Anorogenic Acid Plutonic Rocks of Khanak and Devsar of Southwestern Haryana
Authors: Naresh Kumar, Radhika Sharma, A. K. Singh
Abstract:
Acid plutonic rocks from the Khanak and Devsar areas of southwestern Haryana were investigated to understand their geochemical and petrogenetic characteristics and tectonic environments. Three dominant rock types (grey, greyish green and pink granites) characterize the Khanak and Devsar areas, reflecting the dependence of their composition on the varied geological environment during anorogenic magmatism. These rocks are enriched in SiO₂, Na₂O+K₂O, Fe/Mg, Rb, Zr, Y, Th, U and REE (Rare Earth Elements) and depleted in MgO, CaO, Sr, P, Ti, Ni, Cr, V and Eu, and they exhibit a clear affinity to within-plate granites emplaced in an extensional tectonic environment. Chondrite-normalized REE patterns show enriched LREE (Light Rare Earth Elements), moderate to strong negative Eu anomalies, and flat heavy REE; the grey and greyish green granites differ from the pink granite, which is enriched in Rb, Ga, Nb, Th, U, Y and HREE (Heavy Rare Earth Elements). The composition of the parental magma of both areas corresponds to a mafic source contaminated with crustal material. Petrogenetic modelling suggests that the acid plutonic rocks might have been generated from a basaltic source by 15-25% partial melting, leaving a residue of 35% plagioclase, 25% alkali feldspar, 25% quartz, 7% orthopyroxene, 5% biotite and 3% hornblende. Granites from the two areas might have formed from different sources, with different degrees of melting for the grey, greyish green and pink granites.
Keywords: A-type granite, anorogenic, Malani igneous suite, Khanak and Devsar
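Partial-melting estimates of this kind are commonly checked against Shaw's batch (equilibrium) melting equation, C_L/C_0 = 1/(D + F(1 − D)), where D is the bulk partition coefficient of an element in the residue and F the melt fraction. A minimal sketch; the D value in the usage below is an assumed generic value for a highly incompatible element, not one derived from the paper's modal residue:

```python
def batch_melting_ratio(D, F):
    """Shaw's batch melting equation: enrichment of a trace element in the
    melt relative to the source, C_L/C_0 = 1 / (D + F * (1 - D)), for bulk
    partition coefficient D and melt fraction F (0 < F <= 1)."""
    return 1.0 / (D + F * (1.0 - D))
```

For an incompatible element with D ≈ 0.01, melt fractions of 15-25% (the range quoted above) give roughly four- to six-fold enrichment in the liquid, which is the kind of constraint petrogenetic modelling uses to match observed trace-element concentrations.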
Procedia PDF Downloads 176