Search results for: food composition data
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 28931

24791 Accumulation of Heavy Metals in Safflower (Carthamus tinctorius L.)

Authors: Violina R. Angelova, Mariana N. Perifanova-Nemska, Galina P. Uzunova, Elitsa N. Kolentsova

Abstract:

Comparative research has been conducted to determine the accumulation of heavy metals (Pb, Zn and Cd) in the vegetative and reproductive organs of safflower, to identify the possibility of growing it on soils contaminated by heavy metals, and to assess its efficacy for phytoremediation. The experiment was performed on an agricultural field contaminated by the Non-Ferrous-Metal Works (NFMW) near Plovdiv, Bulgaria. The experimental plots were situated at different distances (0.1, 0.5, 2.0, and 15 km) from the source of pollution. The contents of heavy metals in the plant materials (roots, stems, leaves, seeds) were determined, as was the quality of the safflower oils (heavy metal content and fatty acid composition). The quantitative measurements were carried out with inductively coupled plasma (ICP) spectrometry. Safflower is a plant that is tolerant to heavy metals and can be classed among the hyperaccumulators of lead and cadmium and the accumulators of zinc. The plant can therefore be used successfully in the phytoremediation of heavy-metal-contaminated soils, and the processing of safflower seeds into oil, together with the use of the obtained oil, will greatly reduce the cost of phytoremediation.

Keywords: heavy metals, accumulation, safflower, polluted soils, phytoremediation

Procedia PDF Downloads 241
24790 Service Strategy and Innovation in the Food Service Industry: Basis for Designing a Competitive Advantage Model

Authors: Ma. Dina Datiles Jimenez

Abstract:

Service strategy and service innovation are closely tied to the success of a foodservice business, which has become more competitive and technology-driven. This study aimed to investigate the service innovation and strategies of the foodservice industry, and the challenges faced during the pandemic, to serve as the basis for a competitive advantage model. The study used mixed methods, combining descriptive quantitative and qualitative approaches. Metro Manila foodservice managers were the target population, consisting of an estimated 1,500 respondents from the selected cities. The assessment of service innovation along four dimensions (product-related, market-related, process-related, and organization-related), when respondents were classified according to profile, was rated as very large across age, gender, and educational attainment. Likewise, the service strategies (customer service, after-sales service, maintenance service, research-and-development-oriented service, and operational services) were all assessed as implemented to a very large extent. There was a significant difference in all four aspects of service innovation when classified by age. For gender, only the market and process dimensions showed significant differences, while the product and organization dimensions did not. The evidence was not sufficient to show that the four aspects of service innovation differed by educational attainment, but it was sufficient to show that the age groups differed on all aspects of service strategy.
While gender and educational attainment showed no significant difference in the assessment of service strategies, training on trends in the foodservice industry during the pandemic, evident technical maintenance, a company budget allotted for outsourced training, a quality control system, and online customer feedback emerged as the major indicators of service strategy. Fear of the virus, limited customers, a minimal workforce, and low revenues were identified as the challenges faced by the foodservice industry.

Keywords: foodservice industry, service innovation, service strategy, competitive advantage, sustainability, technology

Procedia PDF Downloads 58
24789 Design and Development of Data Visualization in 2D and 3D Space Using Front-End Technologies

Authors: Sourabh Yaduvanshi, Varsha Namdeo, Namrata Yaduvanshi

Abstract:

This study examines the design and development of detailed 2D bar charts with d3.js, recognizing that d3.js alone is limited to generating 2D visuals within the DOM. It then combines three.js with d3.js, enabling a smooth evolution from 2D charts to immersive 3D representations and illustrating how the two front-end libraries complement each other in data visualization. Beyond the technical integration itself, the paper documents the methodology step by step, so that practitioners can reproduce the transition from 2D constraints to interactive three-dimensional visualizations in their own front-end work.

Keywords: design, development, front-end technologies, visualization

Procedia PDF Downloads 59
24788 The Classification Performance of Parametric and Nonparametric Discriminant Analysis for Class-Unbalanced Data of Diabetes Risk Groups

Authors: Lily Ingsrisawang, Tasanee Nacharoen

Abstract:

Introduction: Problems with unbalanced data sets commonly appear in real-world applications. Owing to unequal class distributions, many studies have found that the performance of existing classifiers tends to be biased towards the majority class. The k-nearest neighbors nonparametric discriminant analysis is one method that has been proposed for classifying unbalanced classes with good performance. Hence, the methods of discriminant analysis are of interest in investigating misclassification error rates for class-imbalanced data of three diabetes risk groups. Objective: The purpose of this study was to compare the classification performance of parametric and nonparametric discriminant analysis in a three-class application to class-imbalanced data of diabetes risk groups. Methods: Data from a health project for 599 staff members in a government hospital in Bangkok were obtained for the classification problem. The staff were diagnosed as belonging to one of three diabetes risk groups: non-risk (90%), risk (5%), and diabetic (5%). The original data, with the variables diabetes risk group, age, gender, cholesterol, and BMI, were analyzed and bootstrapped to 50 and 100 samples of 599 observations each for additional estimation of the misclassification error rate. Each data set was examined for departure from multivariate normality and for equality of the covariance matrices of the three risk groups. Both the original data and the bootstrap samples showed non-normality and unequal covariance matrices. The parametric linear discriminant function, the quadratic discriminant function, and the nonparametric k-nearest neighbors discriminant function were fitted over the 50 and 100 bootstrap samples and applied to the original data.
In finding the optimal classification rule, the prior probabilities were set either to equal proportions (0.33:0.33:0.33) or to one of three unequal choices: (0.90:0.05:0.05), (0.80:0.10:0.10), or (0.70:0.15:0.15). Results: The results from the 50 and 100 bootstrap samples indicated that the k-nearest neighbors approach with k = 3 or k = 4 and prior probabilities for {non-risk:risk:diabetic} of {0.90:0.05:0.05} or {0.80:0.10:0.10} gave the smallest misclassification error rate. Conclusion: The k-nearest neighbors approach is suggested for classifying three-class-imbalanced data of diabetes risk groups.
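As an illustrative sketch of a k-nearest-neighbors rule with adjustable priors, the following uses synthetic data standing in for the 90%/5%/5% diabetes risk groups; the class centers, spread, and seed are assumptions for illustration, not the study's data:

```python
import math
import random

random.seed(0)

# Synthetic stand-in for the 90% / 5% / 5% risk groups (not the study's data).
CENTERS = {"non-risk": (0.0, 0.0), "risk": (3.0, 3.0), "diabetic": (-3.0, 3.0)}
PROPS = {"non-risk": 0.90, "risk": 0.05, "diabetic": 0.05}

def make_sample(n=599):
    data = []
    for label, p in PROPS.items():
        cx, cy = CENTERS[label]
        for _ in range(round(n * p)):
            data.append(((cx + random.gauss(0, 1), cy + random.gauss(0, 1)), label))
    return data

def knn_predict(train, x, k=3, priors=None):
    """Classify x by its k nearest neighbours; optionally reweight the
    votes by prior probability relative to the training-class frequency."""
    neighbours = sorted(train, key=lambda t: math.dist(t[0], x))[:k]
    votes = {}
    for _, label in neighbours:
        votes[label] = votes.get(label, 0) + 1
    if priors is not None:
        freq = {}
        for _, label in train:
            freq[label] = freq.get(label, 0) + 1
        n = len(train)
        votes = {lab: v * priors[lab] / (freq[lab] / n) for lab, v in votes.items()}
    return max(votes, key=votes.get)

def error_rate(train, test, k, priors):
    wrong = sum(1 for x, label in test if knn_predict(train, x, k, priors) != label)
    return wrong / len(test)

train, test = make_sample(), make_sample()
err = error_rate(train, test, k=3,
                 priors={"non-risk": 0.90, "risk": 0.05, "diabetic": 0.05})
```

In practice the priors would be chosen as in the abstract, comparing equal and unequal settings over bootstrap resamples and keeping the combination with the smallest error rate.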

Keywords: error rate, bootstrap, diabetes risk groups, k-nearest neighbors

Procedia PDF Downloads 422
24787 BFDD-S: Big Data Framework to Detect and Mitigate DDoS Attack in SDN Network

Authors: Amirreza Fazely Hamedani, Muzzamil Aziz, Philipp Wieder, Ramin Yahyapour

Abstract:

Software-defined networking has in recent years caught the attention of many network designers as a successor to traditional networking. Unlike traditional networks, where the control and data planes reside together within a single device in the network infrastructure, such as a switch or router, the two planes are kept separate in software-defined networks (SDNs). All critical decisions about packet routing are made on the network controller, and the data-plane devices forward packets based on these decisions. This type of network is vulnerable to DDoS attacks, which degrade the overall functioning and performance of the network by continuously injecting fake flows into it. This places a substantial burden on the controller side and ultimately leads to the inaccessibility of the controller and a lack of network service for legitimate users. Protecting this novel network architecture against denial-of-service attacks is therefore essential. In the world of cybersecurity, attacks and new threats emerge every day, so it is essential to have tools capable of managing and analyzing all this new information to detect possible attacks in real time. Such tools should provide a comprehensive solution to automatically detect, predict, and prevent abnormalities in the network. Big data encompasses a wide range of studies, but it mainly refers to the massive amounts of structured and unstructured data that organizations deal with on a regular basis; it concerns not only the volume of the data but also how data-driven information can be used to enhance decision-making, security, and the overall efficiency of a business. This paper presents an intelligent big data framework as a solution for handling the illegitimate traffic burden placed on the SDN network by numerous DDoS attacks. The framework entails an efficient defence and monitoring mechanism against DDoS attacks, employing state-of-the-art machine learning techniques.
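The detection side can be illustrated in a much-simplified form by flagging flow-count bursts against a moving baseline; the paper's actual framework uses Spark/Kafka and machine learning, so this z-score toy (with invented traffic numbers) is only an assumed stand-in:

```python
import statistics

def detect_bursts(flow_counts, window=10, z_thresh=3.0):
    """Flag time bins whose new-flow count deviates sharply from the
    moving baseline of the preceding `window` bins (z-score test)."""
    alerts = []
    for i in range(window, len(flow_counts)):
        history = flow_counts[i - window:i]
        mu = statistics.fmean(history)
        sd = statistics.pstdev(history) or 1.0  # avoid division by zero
        if (flow_counts[i] - mu) / sd > z_thresh:
            alerts.append(i)
    return alerts

# A flood of new flows stands out against steady background traffic.
traffic = [100, 98, 103, 101, 99, 102, 97, 100, 104, 96, 101, 10000]
bursts = detect_bursts(traffic)  # only the final bin is flagged
```

A real SDN deployment would feed per-switch flow statistics from the controller into such a detector continuously, rather than a fixed list.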

Keywords: apache spark, apache kafka, big data, DDoS attack, machine learning, SDN network

Procedia PDF Downloads 156
24786 The Reconstruction of the Paleoenvironment of the Aptian Sediments of the Serdj Massif, North Central Tunisia

Authors: H. Khaled, F. Chaabani, F. Boulvain

Abstract:

This paper focuses on the study of the Aptian series that crops out at Jebel Serdj in north central Tunisia. The studied series is about 590 meters thick and consists of limestones and marly limestones associated with some levels of siltstones and marls. Two sections were studied in detail with regard to lithology, microfacies, magnetic susceptibility, and mineralogical composition, to provide new insights into the paleoenvironmental evolution and its paleoclimatological implications during this period. The following facies associations, representing different ramp palaeoenvironments, were identified: a mudstone-wackestone outer-ramp facies; a skeletal grainstone-packstone mid-ramp facies; a packstone-grainstone inner-ramp facies, which includes a variety of organisms such as rudists as well as ooids; and a mudstone-wackestone coastal facies rich in miliolids and orbitolinids. The magnetic susceptibility (Xᵢₙ) of all samples was compared with the lithological and microfacies variations, showing that high values of magnetic susceptibility correlate with the distal facies.

Keywords: Aptian, Serdj Formation, geochemical, mineralogy

Procedia PDF Downloads 130
24785 Microbial Pathogens Associated with Banded Sugar Ants (Camponotus consobrinus) in Calabar, Nigeria

Authors: Ofonime Ogba, Augustine Akpan

Abstract:

Objectives and Goals: The study aimed to determine the pathogenic microbial carriage on the external body parts of Camponotus consobrinus, known as the banded sugar ant because of its liking for sugar and sweet food. The level of pathogenic microbial carriage of Camponotus consobrinus in relation to the environments from which the ants are collected is not known. Methods: The ants were purposively collected from four types of location: kitchens, bedrooms of various homes, food shops, and bakeries. Sample collection took place between 6:30 pm and 11:00 pm. The ants were trapped in transparent plastic containers, with sugar, pineapple peels, sugar cane, and soft drinks used as bait. The ants were removed with a sterile spatula and placed in 10 ml of peptone water in sterile universal bottles. The containers were shaken vigorously to wash the external surface of the ants, left overnight, and then transported to the Microbiology Laboratory of the University of Calabar Teaching Hospital for analysis. The overnight peptone broths were inoculated on chocolate agar, blood agar, cystine lactose electrolyte-deficient (CLED) agar, and Sabouraud dextrose agar. Incubation was done aerobically and in a carbon dioxide jar for 24 to 48 hours at 37°C. Isolates were identified on the basis of colonial characteristics, Gram staining, and biochemical tests. Results: Of the 250 Camponotus consobrinus caught for the study, 90 (36.0%) were caught in kitchens, 75 (30.0%) in bedrooms, 40 (16.0%) in bakeries, and 45 (18.0%) in shops. An overall prevalence of 82.0% of different microbial isolates was associated with the ants. Kitchens yielded the highest number of isolates, 75 (36.6%), followed by bedrooms with 55 (26.8%), while bakeries recorded the lowest number, 35 (17.1%).
The profile of micro-organisms associated with Camponotus consobrinus comprised Escherichia coli 73 (30.0%), Morganella morganii 45 (18.0%), Candida species 25 (10.0%), Serratia marcescens 10 (4.0%), and Citrobacter freundii 10 (4.0%). Conclusion: Most of the Camponotus consobrinus examined in the four locations harboured potential pathogens. The presence of ants in homes and shops can facilitate the propagation and spread of pathogenic microorganisms; the development of basic preventive measures and the control of ants must therefore be taken seriously.

Keywords: Camponotus consobrinus, potential pathogens, microbial isolates, spread

Procedia PDF Downloads 147
24784 On the Estimation of Crime Rate in the Southwest of Nigeria: Principal Component Analysis Approach

Authors: Kayode Balogun, Femi Ayoola

Abstract:

Crime is at an alarming rate in this part of the world, and many factors contribute to this anti-societal behaviour among both the young and the old. In this work, principal component analysis (PCA) was used to reduce the dimensionality of the data and to identify the crime-prone variables in the study region, while retaining as much of the information as possible. Data were collected on twenty-eight crime variables from the National Bureau of Statistics (NBS) databank for a period of fifteen years. We use PCA in this study to determine the number of major variables and contributors to crime in Southwest Nigeria. The results of our analysis revealed that eight principal components were retained, using the scree plot and loading plot, implying that an eight-component solution is appropriate for the data. The eight components explained 93.81% of the total variation in the data set. We also found that the most frequently committed crimes in Southwestern Nigeria were assault, grievous harm and wounding, theft/stealing, burglary, house breaking, false pretence, unlawful arms possession, and breach of public peace.
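A minimal numpy sketch of the PCA procedure described above; the 28-variable by 15-year shape follows the abstract, but the values are random stand-ins for the NBS crime data:

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(15, 28))   # 15 yearly records x 28 crime variables

Xc = X - X.mean(axis=0)                     # centre each variable
cov = np.cov(Xc, rowvar=False)              # 28 x 28 covariance matrix
eigvals = np.clip(np.linalg.eigvalsh(cov)[::-1], 0, None)  # descending, >= 0

explained = eigvals / eigvals.sum()         # variance share per component
cum = np.cumsum(explained)
k = int(np.searchsorted(cum, 0.9381)) + 1   # components for ~93.81% of variance
```

On real data, `k` would be cross-checked against the scree plot (the elbow in `eigvals`) and the loading plot before settling on the retained components.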

Keywords: crime rates, data, Southwest Nigeria, principal component analysis, variables

Procedia PDF Downloads 424
24783 On-Line Data-Driven Multivariate Statistical Prediction Approach to Production Monitoring

Authors: Hyun-Woo Cho

Abstract:

Detection of incipient abnormal events in production processes is important for improving the safety and reliability of manufacturing operations and reducing losses caused by failures. The construction of calibration models for predicting faulty conditions is essential in deciding when to perform preventive maintenance. This paper presents a multivariate calibration monitoring approach based on the statistical analysis of process measurement data, in which the calibration model is used to predict faulty conditions from historical reference data. The approach utilizes variable selection techniques, and the predictive performance of several prediction methods is evaluated using real data. The results show that the calibration model based on a supervised probabilistic model yielded the best performance in this work. By adopting a proper variable selection scheme, the prediction performance of calibration models can be improved by excluding non-informative variables from the model-building step.
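The benefit of excluding non-informative variables can be sketched as follows; the synthetic data, the correlation-based filter, and the least-squares calibration model are all illustrative assumptions, not the paper's actual methods:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 200

informative = rng.normal(size=(n, 3))   # variables that drive the response
noise = rng.normal(size=(n, 5))         # non-informative process variables
y = informative @ np.array([1.5, -2.0, 0.7]) + 0.1 * rng.normal(size=n)
X = np.hstack([informative, noise])

# Rank variables by absolute correlation with the response, keep the top 3.
corr = np.array([abs(np.corrcoef(X[:, j], y)[0, 1]) for j in range(X.shape[1])])
selected = np.argsort(corr)[::-1][:3]

# Calibration model (ordinary least squares) on the selected variables only.
beta, *_ = np.linalg.lstsq(X[:, selected], y, rcond=None)
rmse = float(np.sqrt(np.mean((X[:, selected] @ beta - y) ** 2)))
```

The same fit on all eight columns would carry the noise variables' spurious coefficients into new data, which is the motivation for filtering before model building.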

Keywords: calibration model, monitoring, quality improvement, feature selection

Procedia PDF Downloads 341
24782 Investigation of Mechanical Properties and Wear Behavior of Hot Roller Grades

Authors: Majid Mokhtari, Masoud Bahrami Alamdarlo, Babak Nazari, Hossein Zakerinya, Mehdi Salehi

Abstract:

In this study, the microstructure and the macro- and microhardness of the phases in three grades of cast iron rolls with modified chemical compositions were investigated using optical microscopy (OM) and scanning electron microscopy (SEM). The grades were chosen from the products of Chodan Sazan Manufacturing Co. (CSROLL) for the finishing stands of hot strip mills. The percentage of residual austenite was determined with a ferritescope (a magnetic measurement device), and thermal susceptibility was also measured. The results show that the best oxidation resistance at high temperatures is exhibited by the graphitic high-chromium white cast iron alloy. To evaluate the final properties of these grades in rolling lines, pin-on-disk abrasion tests were carried out; the results showed the superior abrasive wear behavior of the graphitic high-chromium white cast iron grade, at the same hardness, compared with the conventional and enhanced alloy grades.

Keywords: hot roller, wear, behavior, microstructure

Procedia PDF Downloads 218
24781 Does Supervisory Board Composition Influence Sustainability Reporting Quality?

Authors: Patrick Velte

Abstract:

Sustainability reporting has become a central element of modern corporate governance practice. This paper is the first to examine supervisory board independence, sustainability expertise, and gender diversity in two European two-tier countries and their impact on sustainability reporting quality. For a sample of 188 German and Austrian companies listed in the Prime Standard of the Frankfurt and Vienna Stock Exchanges for the business years 2012-2013, descriptive findings show that CSR reporting quality is still low in both countries. Furthermore, multiple regressions indicate that independent and female members of the supervisory board have a positive impact on CSR reporting quality in Germany and Austria, whereas the presence of sustainability experts on the supervisory board shows a positive but insignificant impact in both countries. Our findings suggest that the current European corporate governance regulations can be a useful instrument for increasing the quality of modern CSR reporting for stakeholders.
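The kind of multiple regression involved can be sketched as follows; the board data are fully synthetic and the coefficients are simulation assumptions, not the study's estimates:

```python
import numpy as np

rng = np.random.default_rng(7)
n = 188  # matches the study's sample size; the data here are synthetic

indep_share = rng.uniform(0.2, 0.9, n)        # share of independent members
female_share = rng.uniform(0.0, 0.5, n)       # share of female members
expert = rng.integers(0, 2, n).astype(float)  # sustainability expert present?

# Assumed "true" effects for the simulation only: a strong independence and
# gender effect, and a near-zero expert effect (mirroring insignificance).
quality = (1.0 + 2.0 * indep_share + 1.5 * female_share
           + 0.05 * expert + rng.normal(0.0, 0.3, n))

X = np.column_stack([np.ones(n), indep_share, female_share, expert])
beta, *_ = np.linalg.lstsq(X, quality, rcond=None)  # OLS coefficient estimates
```

In the study itself, significance would then be judged from the coefficient standard errors, which is how the expert variable comes out positive but insignificant.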

Keywords: sustainability reporting, corporate governance, gender diversity, board independence

Procedia PDF Downloads 383
24780 Multilevel Gray Scale Image Encryption through 2D Cellular Automata

Authors: Rupali Bhardwaj

Abstract:

Cryptography is the science of using mathematics to encrypt and decrypt data: the data are converted into some other, unintelligible form, and the encrypted data are then transmitted. The primary purpose of this paper is to provide two levels of security through a two-step process. Rather than transmitting the message bits directly, the message is first encrypted using a 2D cellular automaton and then scrambled with the Arnold cat map transformation; this provides an additional layer of protection and reduces the chance of the transmitted message being detected. A comparative analysis of the effectiveness of the scrambling technique is provided using the scrambling-degree measurement parameters, i.e., the gray difference degree (GDD) and the correlation coefficient.
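A minimal sketch of the Arnold cat map scrambling step and the correlation-coefficient measure; the cellular-automaton encryption stage is omitted, and the toy 8x8 gradient "image" is an assumption for illustration:

```python
def arnold_cat(img):
    """One Arnold cat map iteration on an N x N image (list of lists):
    pixel (x, y) moves to ((x + y) mod N, (x + 2y) mod N)."""
    n = len(img)
    out = [[0] * n for _ in range(n)]
    for x in range(n):
        for y in range(n):
            out[(x + y) % n][(x + 2 * y) % n] = img[x][y]
    return out

def correlation(a, b):
    """Pearson correlation between two equally sized images."""
    fa = [p for row in a for p in row]
    fb = [p for row in b for p in row]
    n = len(fa)
    ma, mb = sum(fa) / n, sum(fb) / n
    cov = sum((x - ma) * (y - mb) for x, y in zip(fa, fb))
    va = sum((x - ma) ** 2 for x in fa)
    vb = sum((y - mb) ** 2 for y in fb)
    return cov / (va * vb) ** 0.5

N = 8
img = [[x * N + y for y in range(N)] for x in range(N)]  # toy gradient image
scrambled = arnold_cat(arnold_cat(img))                  # two scrambling rounds
```

Because the map has determinant 1 it only permutes pixels, so it is reversible (and periodic), and a low correlation between `img` and `scrambled` indicates a good scrambling degree.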

Keywords: scrambling, cellular automata, Arnold cat map, game of life, gray difference degree, correlation coefficient

Procedia PDF Downloads 358
24779 Survey Based Data Security Evaluation in Pakistan Financial Institutions against Malicious Attacks

Authors: Naveed Ghani, Samreen Javed

Abstract:

In today’s heterogeneous network environment, there is a growing demand for mutually distrusting clients to jointly operate a secure network that prevents malicious attacks, since the defining task of propagating malicious code is to locate new targets to attack. Residual risk is always present, no matter what solutions are implemented or whatever security methodology or standards are adopted. Security is a first and crucial concern in the field of computer science, and a main aim of computer security is the gathering of information over a secure network. There is no need to wonder what all that malware is trying to do: it is trying to steal money through data theft, bank transfers, stolen passwords, or swiped identities. With the help of our survey, we learn about the importance of whitelisting, anti-malware programs, security patches, log files, honeypots, and other measures used in banks for financial data protection. There is, however, also a need to implement IPv6 tunneling with cryptographic data transformation, in line with the requirements of new technology, to protect organizations from new malware attacks that craft their own messages and send them to a target. In this paper, the authors propose implementing IPv6 tunneling sessions for the transmission of private data from financial organizations whose secrecy needs to be safeguarded.

Keywords: network worms, malware infection propagating malicious code, virus, security, VPN

Procedia PDF Downloads 344
24778 Interactive IoT-Blockchain System for Big Data Processing

Authors: Abdallah Al-ZoubI, Mamoun Dmour

Abstract:

The spectrum of IoT devices is becoming widely diversified, entering almost all possible fields and finding applications in industry, health, finance, logistics, and education, to name a few. The IoT active endpoint sensors and devices exceeded the 12 billion mark in 2021 and are expected to reach 27 billion in 2025, with over $34 billion in total market value. This sheer rise in the number and use of IoT devices brings with it considerable concerns regarding data storage, analysis, manipulation, and protection. IoT blockchain-based systems have recently been proposed as a decentralized solution for large-scale data storage and protection. COVID-19 actually accelerated the desire to utilize IoT devices, as it impacted both demand and supply and significantly affected several regions for logistic reasons such as supply chain interruptions, shortages of shipping containers, and port congestion. An IoT-blockchain system is proposed to handle big data generated by a distributed network of sensors and controllers in an interactive manner. The system is designed on the Ethereum platform, using smart contracts programmed in Solidity to execute and manage the data generated by IoT sensors and devices, running on hardware such as the Raspberry Pi 4 with Raspbian and add-on hardware security modules. The proposed system runs a number of applications hosted by a local machine used to validate transactions. It then sends data to the rest of the network through the InterPlanetary File System (IPFS) and Ethereum Swarm, forming a closed IoT ecosystem run by blockchain in which a number of distributed IoT devices can communicate and interact, thus forming a closed, controlled environment. A prototype has been deployed with three IoT handling units distributed over a wide geographical area in order to examine its feasibility, performance, and costs. Initial results indicate that big IoT data retrieval and storage is feasible and that interactivity is possible, provided that certain conditions of cost, speed, and throughput are met.
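The tamper-evidence a blockchain ledger gives IoT data can be illustrated with a plain hash chain; this sketch is an assumption for illustration only, not the paper's Ethereum/IPFS implementation:

```python
import hashlib
import json

def make_block(reading, prev_hash):
    """Bind one sensor reading to its predecessor via a SHA-256 digest."""
    body = {"reading": reading, "prev": prev_hash}
    digest = hashlib.sha256(json.dumps(body, sort_keys=True).encode()).hexdigest()
    return {**body, "hash": digest}

def verify(chain, genesis="0" * 64):
    """Recompute every digest; any edited reading breaks the chain."""
    prev = genesis
    for block in chain:
        body = {"reading": block["reading"], "prev": block["prev"]}
        digest = hashlib.sha256(json.dumps(body, sort_keys=True).encode()).hexdigest()
        if block["prev"] != prev or block["hash"] != digest:
            return False
        prev = block["hash"]
    return True

chain = []
prev = "0" * 64
for reading in [21.5, 21.7, 35.2]:  # e.g. successive temperature samples
    block = make_block(reading, prev)
    chain.append(block)
    prev = block["hash"]
```

In the proposed system this linking is handled by the Ethereum chain itself, with the bulk data held off-chain in IPFS/Swarm and only content hashes recorded on-chain.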

Keywords: IoT devices, blockchain, Ethereum, big data

Procedia PDF Downloads 132
24777 The Golden Bridge for Better Farmers' Lives

Authors: Giga Rahmah An-Nafisah, Lailatus Syifa Kamilah

Abstract:

Agriculture in Indonesia has, on the whole, improved, especially since the election of the new president, whose work program prioritizes food self-sufficiency. Many measures have been planned carefully, all aimed at maximizing agricultural production for the future. But looking from another side, something is missing: improvement in the livelihood of the farmers themselves. It is useless to fix the entire agricultural processing system to maximize output if the lives of the heroes of agriculture do not change for the better. The broker, or middleman, system for agricultural produce is the real obstacle to farmers' welfare. However large the harvest, if farmers must sell it to middlemen at very low prices, there will be no progress in their welfare. This broker system should not persist in the current agricultural system: even while the condition of agriculture is worrying, middlemen still reap as much profit as possible, no matter how hard the farmers struggle to manage their farms, and the farmers now also face unavoidable competition from imports. This phenomenon is in plain sight of everyone, yet the farmers who fall victim to it can do nothing to change the system, because the middlemen are often the only buyers for their produce; the broker system is, in effect, the only bridge in the economic life of the farmers. The problem is that we should strive for the welfare of the heroes of our food supply. A golden bridge that could save them is the government, which, with its powers, can stop the broker system more easily than any other party. The government should become the bridge connecting the farmers with the consumers, that is, the people themselves.
The improved system would work like this: buy agricultural produce from the farmers at the highest price, and sell agricultural products to the consumers at the lowest price. What, then, of the fate of the middlemen? The broker system is, indirectly, like corruption: an activity that harms its victims without being noticed, continually enriching the perpetrator while the victims' lives remain miserable. The government could transfer the role of the middlemen into this new bridge, employing them as distributors of agricultural products, but under new government policies designed to keep improving the welfare of the farmers. This idea alone will not do everything to improve farmers' welfare, but at the very least it can rally many people to the farmers' cause through everyday conversation, spreading as quickly as celebrity gossip does.

Keywords: broker system, farmers' lives, government, agricultural economics

Procedia PDF Downloads 276
24776 Keynote Talk: The Role of Internet of Things in the Smart Cities Power System

Authors: Abdul-Rahman Al-Ali

Abstract:

As the number of mobile devices grows exponentially, about 50 billion devices are estimated to be connected to the Internet by the year 2020; by the end of this decade, an average of eight connected devices per person worldwide is expected. These 50 billion devices are not only mobile phones and data-browsing gadgets but also machine-to-machine and man-to-machine devices. With such growing numbers of devices, the Internet of Things (I.o.T) concept has recently become one of the emerging technologies. Within smart grid technologies, smart home appliances, intelligent electronic devices (IEDs), and distributed energy resources (DERs) are the major I.o.T objects, addressable using IPv6. These objects are called the smart grid Internet of Things (SG-I.o.T). The SG-I.o.T generates big data that requires a high-speed computing infrastructure, widespread computer networks, big data storage, software, and platform services. A utility company's control and data centers cannot handle such a large number of devices, such high-speed processing, and such massive data storage. Building a large data center infrastructure takes a long time and requires widespread communication networks and huge capital investment, while maintaining and upgrading control and data center infrastructure and communication networks, as well as updating and renewing software licenses, collectively requires additional cost. This can be overcome by utilizing emerging computing paradigms such as cloud computing, which can serve as a smart grid enabler to replace the legacy data centers of utilities. The talk will highlight the role of I.o.T and cloud computing services and their deployment models within smart grid technologies.

Keywords: intelligent electronic devices (IED), distributed energy resources (DER), internet, smart home appliances

Procedia PDF Downloads 308
24775 Evaluation of Agricultural Drought Impact on the Crop Productivity of East Gojjam Zone

Authors: Walelgn Dilnesa Cherie, Fasikaw Atanaw Zimale, Bekalu W. Asres

Abstract:

A drought event is the most catastrophic condition for agricultural production and one of the most damaging hydro-meteorological hazards. Reflecting the combined susceptibility of plants to meteorological and hydrological conditions, agricultural drought is defined by the magnitude, severity, and duration of a drought that affects crop production. Accurate and timely assessment of agricultural drought can support the development of risk management strategies, appropriate proactive mechanisms for the protection of farmers, and the improvement of food security. The evaluation of agricultural drought in the East Gojjam zone was the primary subject of this study. To identify agricultural drought, soil moisture anomalies, soil water deficit indices (SWDI), and the Normalized Difference Vegetation Index (NDVI) were used. Measured wilting point, field capacity, and soil moisture were utilized to validate the soil water deficit indices computed from satellite data. The soil moisture and soil water deficit indices in 2013 were at a minimum in all woredas, causing vegetation stress throughout. The soil moisture content also decreased in 2013, 2014, 2019, and 2021 in Dejen, and in 2014 and 2019 in Awobel woreda. The maximum and minimum NDVI values in 2013 were the lowest, predominantly indicating vegetation stress and an observed agricultural drought in all woredas. Validation of the satellite against in-situ soil moisture and soil water deficit indices shows good agreement, with R² = 0.87 and 0.56, respectively. Since the study area is a drought-affected region, government officials, policymakers, and environmentalists should pay attention to protection against drought effects.
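The NDVI used above is a simple band ratio, NDVI = (NIR - Red) / (NIR + Red); a sketch with made-up reflectance values (the pixel values and baseline are assumptions, not the study's data):

```python
def ndvi(nir, red):
    """NDVI = (NIR - Red) / (NIR + Red), computed per pixel."""
    return [
        [(n - r) / (n + r) if (n + r) != 0 else 0.0 for n, r in zip(nrow, rrow)]
        for nrow, rrow in zip(nir, red)
    ]

def ndvi_anomaly(current, long_term_mean):
    """Negative anomalies flag vegetation stress relative to the baseline."""
    return [
        [c - m for c, m in zip(crow, mrow)]
        for crow, mrow in zip(current, long_term_mean)
    ]

# Invented near-infrared and red reflectances for a 2x2 scene.
nir = [[0.50, 0.30], [0.45, 0.40]]
red = [[0.10, 0.30], [0.15, 0.05]]
v = ndvi(nir, red)
```

A drought year such as 2013 would show up as widespread negative values in `ndvi_anomaly` against the multi-year mean NDVI of the same season.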

Keywords: NDVI, agricultural drought, SWDI, soil moisture

Procedia PDF Downloads 68
24774 Statistical Analysis of Interferon-γ for the Effectiveness of an Anti-Tuberculous Treatment

Authors: Shishen Xie, Yingda L. Xie

Abstract:

Tuberculosis (TB) is a potentially serious infectious disease that remains a health concern. The interferon gamma release assay (IGRA) is a blood test to determine whether an individual is tuberculosis positive or negative. This study applies statistical analysis to the clinical interferon-gamma levels of seventy-three subjects diagnosed with pulmonary TB and undergoing anti-tuberculous treatment. Data analysis is performed to determine whether there is a significant decline in interferon-gamma levels for the subjects over a period of six months, and hence to infer whether the anti-tuberculous treatment is effective.
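A decline like this is commonly tested with a paired t statistic; the following is a minimal sketch with invented numbers (n = 4 here, whereas the study itself has 73 subjects), and is only an assumed stand-in for whatever tests the authors applied:

```python
import math

def paired_t(before, after):
    """Paired t statistic for a pre/post decline (positive when levels fall)."""
    diffs = [b - a for b, a in zip(before, after)]
    n = len(diffs)
    mean = sum(diffs) / n
    var = sum((d - mean) ** 2 for d in diffs) / (n - 1)  # sample variance
    return mean / math.sqrt(var / n)

# Invented interferon-gamma levels at baseline and after six months.
t = paired_t([10.0, 12.0, 11.0, 13.0], [8.0, 9.0, 9.0, 10.0])
```

The statistic is then compared with the one-sided critical value for df = n - 1 (about 1.67 for df = 72 at the 5% level); a larger t indicates a significant decline.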

Keywords: data analysis, interferon gamma release assay, statistical methods, tuberculosis infection

Procedia PDF Downloads 292
24773 Short Text Classification Using Part of Speech Feature to Analyze Students' Feedback of Assessment Components

Authors: Zainab Mutlaq Ibrahim, Mohamed Bader-El-Den, Mihaela Cocea

Abstract:

Students' textual feedback can hold unique patterns and useful information about the learning process: the advantages and disadvantages of teaching methods, assessment components, facilities, and other aspects of teaching. Analysing such feedback can give institutions' decision makers a key basis for advancing and updating their systems accordingly. This paper proposes a data mining framework for analysing end-of-unit general textual feedback using a part-of-speech (PoS) feature with four machine learning algorithms: support vector machines, decision trees, random forests, and naive Bayes. The proposed framework has two tasks. The first is to use the above algorithms to build an optimal model that automatically classifies the whole data set into two subsets: one tailored to assessment practices (assessment-related) and the other containing the non-assessment-related data. The second is to use the same algorithms to build an optimal model, for the whole data set and for the new subsets, that automatically detects their sentiment. The significance of this paper is the comparison of the performance of the four algorithms using the part-of-speech feature against their performance using n-gram features. The paper follows the Knowledge Discovery and Data Mining (KDDM) framework to construct the classification and sentiment analysis models: understanding the assessment domain, cleaning and pre-processing the data set, selecting and running the data mining algorithms, interpreting the mined patterns, and consolidating the discovered knowledge. The experiments show that models using either feature performed very well on the first task. On the second task, however, models using the part-of-speech feature underperformed compared with models using unigram and bigram features.
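The two feature schemes being compared can be sketched with scikit-learn. Here a crude suffix-based tagger stands in for a real PoS tagger, and the feedback sentences and labels are invented, so this is a toy illustration of the idea rather than the paper's pipeline:

```python
# Toy sketch of the two feature schemes: n-gram counts versus counts of
# (crudely approximated) part-of-speech tags, fed to naive Bayes.
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.naive_bayes import MultinomialNB

feedback = ["the exam was too hard", "great coursework assessment",
            "lectures were clear", "the quiz marking felt unfair",
            "enjoyed the seminars", "assignment deadlines were tight"]
labels = [1, 1, 0, 1, 0, 1]  # 1 = assessment-related, 0 = not

def crude_pos(token):
    # Stand-in tagger using suffix heuristics only (an assumption,
    # not a real PoS tagger such as NLTK's).
    if token.endswith("ing"): return "VBG"
    if token.endswith("ed"):  return "VBD"
    if token.endswith("ly"):  return "RB"
    if token.endswith("s"):   return "NNS"
    return "NN"

pos_docs = [" ".join(crude_pos(t) for t in doc.split()) for doc in feedback]

scores = {}
for name, docs in [("unigram+bigram", feedback), ("PoS", pos_docs)]:
    X = CountVectorizer(ngram_range=(1, 2)).fit_transform(docs)
    clf = MultinomialNB().fit(X, labels)
    scores[name] = clf.score(X, labels)
    print(name, "training accuracy:", scores[name])
```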

Keywords: assessment, part of speech, sentiment analysis, student feedback

Procedia PDF Downloads 126
24772 Fast Fourier Transform-Based Steganalysis of Covert Communications over Streaming Media

Authors: Jinghui Peng, Shanyu Tang, Jia Li

Abstract:

Steganalysis seeks to detect the presence of secret data embedded in cover objects, and there is an imminent demand to detect hidden messages in streaming media. This paper shows how a steganalysis algorithm based on the Fast Fourier Transform (FFT) can be used to detect the existence of secret data embedded in streaming media. The proposed algorithm uses machine parameter characteristics and a network sniffer to determine whether the Internet traffic contains streaming channels. The detected streaming data is then transferred from the time domain to the frequency domain through the FFT. The distributions of power spectra in the frequency domain of original VoIP streams and stego VoIP streams are compared in turn using a t-test, yielding a p-value of 7.5686E-176, far below the significance threshold. The results indicate that the proposed FFT-based steganalysis algorithm is effective in detecting secret data embedded in VoIP streaming media.
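The detection step can be sketched as follows. The "cover" and "stego" signals here are synthetic stand-ins for VoIP streams, with embedding modeled as additive noise, which is an assumption for illustration only, not the paper's embedding method:

```python
# Sketch of the detection step: FFT power spectra of a cover stream and a
# stego stream, compared with a t-test on their distributions.
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
n = 4096
cover = rng.normal(0.0, 1.0, n)           # stand-in for a clean VoIP frame
stego = cover + rng.normal(0.0, 0.5, n)   # stand-in for an embedded payload

def power_spectrum(x):
    """Per-bin power in the frequency domain."""
    return np.abs(np.fft.rfft(x)) ** 2

t_stat, p_value = stats.ttest_ind(power_spectrum(cover),
                                  power_spectrum(stego))
print(f"t = {t_stat:.3f}, p = {p_value:.3g}")
print("spectra differ" if p_value < 0.05 else "no evidence of embedding")
```

Embedding adds power across the spectrum, which is what the t-test picks up; in the paper the comparison is between real VoIP captures rather than simulated noise.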

Keywords: steganalysis, security, Fast Fourier Transform, streaming media

Procedia PDF Downloads 128
24771 The Effect of Contrast on Approach Distances of Carcharhinus perezi

Authors: Elizabeth Farquhar, Erich Ritter

Abstract:

Studying sharks' interactions with humans and their behavioral responses has broad implications for marine biology and oceanography. The health of shark populations directly affects the stability of human society, with a reported 3.5 billion people depending on the ocean for food and/or livelihood. Understanding how sharks behave and interact with people will inform future studies and the development of more effective ways to reduce negative shark/human interactions. This study investigates the effect of contrasting ponchos worn by divers on the approach distances of Carcharhinus perezi. Data were collected in mid-August over a two-week period at a test site off the shore of Eleuthera Island in the Bahamas, at a depth of approximately 55 feet. Sixty-minute dive trials were recorded on video from above, with 5-meter radius markers on the ocean floor surrounding the two divers, who knelt back-to-back. Five poncho colors were worn by the two divers (black, navy blue, dark green, yellow, and orange), with the color permutations rotated randomly to test the distance at which a shark would approach each color. Results indicate significantly closer approach patterns when divers were wearing orange ponchos, and the combinations of orange with black and with blue ponchos were also statistically significant. These results are relevant to understanding how sharks perceive contrast and dive equipment in the marine environment, which could help prevent negative shark/human interactions.
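A one-way ANOVA is a natural way to test such between-color differences. The approach distances below are invented for illustration and do not reproduce the study's measurements:

```python
# Hypothetical sketch: one-way ANOVA on shark approach distances (metres)
# across poncho colours. All measurements are invented.
from scipy import stats

approach_m = {
    "orange": [1.1, 1.4, 1.0, 1.3, 1.2],
    "black":  [2.6, 2.9, 2.4, 2.8, 2.7],
    "navy":   [2.5, 2.7, 2.3, 2.9, 2.6],
}
f_stat, p_value = stats.f_oneway(*approach_m.values())
print(f"F = {f_stat:.2f}, p = {p_value:.4f}")
if p_value < 0.05:
    print("approach distance differs between colours")
```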

Keywords: shark behavior, animal behavior, marine biology, conservation

Procedia PDF Downloads 126
24770 Privacy-Preserving Model for Social Network Sites to Prevent Unwanted Information Diffusion

Authors: Sanaz Kavianpour, Zuraini Ismail, Bharanidharan Shanmugam

Abstract:

Social Network Sites (SNSs) can serve as an invaluable platform for transferring information across large numbers of individuals. A substantial component of communicating and managing information is identifying which individuals will influence others in propagating information, and whether dissemination of information will occur in the absence of social signals about that information. Classifying the final audience of social data is difficult because the social contexts transferred among individuals cannot be completely controlled. Hence, undesirable diffusion of information to unauthorized individuals on SNSs can threaten individuals' privacy. This paper highlights information diffusion in SNSs and emphasizes the most significant privacy issues facing SNS users. The goal of this paper is to propose a privacy-preserving model that safeguards individuals' data by controlling data availability and improving privacy, providing access to the data for appropriate third parties without compromising the advantages of information sharing through SNSs.
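One building block such a privacy-preserving model could use before releasing data to third parties is a k-anonymity check over quasi-identifiers. This is a minimal sketch with invented records; the attribute names and the choice of k-anonymity are assumptions, not the paper's specific anonymization algorithm:

```python
# Minimal k-anonymity check: a release is k-anonymous if every combination
# of quasi-identifier values is shared by at least k records.
from collections import Counter

records = [
    {"age": "20-29", "city": "Oslo",   "likes": "hiking"},
    {"age": "20-29", "city": "Oslo",   "likes": "music"},
    {"age": "30-39", "city": "Bergen", "likes": "food"},
    {"age": "30-39", "city": "Bergen", "likes": "films"},
]
quasi_ids = ("age", "city")

def is_k_anonymous(rows, qids, k):
    """True if every quasi-identifier group contains at least k rows."""
    groups = Counter(tuple(r[q] for q in qids) for r in rows)
    return min(groups.values()) >= k

print(is_k_anonymous(records, quasi_ids, 2))
print(is_k_anonymous(records, quasi_ids, 3))
```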

Keywords: anonymization algorithm, classification algorithm, information diffusion, privacy, social network sites

Procedia PDF Downloads 308
24769 Analysis of Wall Deformation of the Arterial Plaque Models: Effects of Viscoelasticity

Authors: Eun Kyung Kim, Kyehan Rhee

Abstract:

Viscoelastic wall properties of arterial plaques change as the disease progresses, and estimation of wall viscoelasticity can provide a valuable assessment tool for predicting plaque rupture. The cross section of a stenotic coronary artery was modeled based on an IVUS image, and finite element analysis was performed to obtain the wall deformation under pulsatile pressure. The effects of the plaque's viscoelastic parameters on luminal diameter variations were explored. The results showed that decreasing the viscous effect reduced the phase angle between the pressure and displacement waveforms, and that the phase angle depends on the viscoelastic properties of the wall. Because the viscous effect of tissue components can be identified from the phase angle difference, wall deformation waveform analysis may be applied to predict changes in plaque wall composition and the progression of vascular wall disease.
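The phase-angle measure discussed above can be sketched numerically: take the phase difference between the fundamental Fourier components of the pressure and diameter waveforms. The waveforms below are synthetic sinusoids with an assumed 0.3 rad lag for checking, not the study's FEM results:

```python
# Phase lag between pressure and luminal-diameter waveforms via the phase
# of their fundamental Fourier components. Waveforms are synthetic.
import numpy as np

n, lag = 1024, 0.3
t = np.linspace(0.0, 2.0 * np.pi, n, endpoint=False)
pressure = np.cos(t)        # one cycle of driving pressure
diameter = np.cos(t - lag)  # viscoelastic wall: displacement lags pressure

def fundamental_phase(x):
    """Phase of the first harmonic of a sampled periodic signal."""
    return np.angle(np.fft.rfft(x)[1])

phase_lag = fundamental_phase(pressure) - fundamental_phase(diameter)
print(f"estimated phase lag = {phase_lag:.3f} rad")
```

A purely elastic wall would give a phase lag near zero; the viscous component is what produces the lag the study uses as a marker.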

Keywords: atherosclerotic plaque, diameter variation, finite element method, viscoelasticity

Procedia PDF Downloads 206
24768 Application Difference between Cox and Logistic Regression Models

Authors: Idrissa Kayijuka

Abstract:

The logistic regression and Cox regression (proportional hazards) models are currently employed in the analysis of prospective epidemiologic research into risk factors for chronic diseases, and the theoretical relationship between the two models has been studied. By definition, the Cox regression model, also called the Cox proportional hazards model, is a procedure for modeling time-to-event data in the presence of censored cases. The logistic regression model, in contrast, applies where the independent variables may be numerical or nominal and the outcome variable is binary (dichotomous). Many researchers have reviewed the Cox and logistic regression models and their applications in different areas. In this work, the analysis is performed on secondary data, sourced from the SPSS exercise data set on breast cancer, with a sample size of 1121 women; the main objective is to show the application difference between the Cox regression model and the logistic regression model based on factors that cause women to die of breast cancer. Some analysis (e.g., of lymph node status) was done manually, and SPSS software was used to analyze the rest of the data. The study found an application difference between the two models: the Cox regression model is used when one wishes to analyze data that include follow-up time, whereas the logistic regression model analyzes data without follow-up time. They also have different measures of association: the hazard ratio for the Cox model and the odds ratio for the logistic model. A similarity between the two models is that both predict the outcome of a categorical variable, i.e., a variable that can take only a restricted number of categories.
In conclusion, the Cox regression model differs from logistic regression by assessing a rate instead of a proportion. Both models are suitable for analyzing data in many other studies, but when follow-up time is available, the Cox regression model is the more recommended.
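As a hedged numerical illustration of the two association measures (the counts and person-years below are invented, not taken from the breast cancer data set; the hazard ratio is computed under a simple exponential-survival assumption rather than a fitted Cox model):

```python
# Odds ratio from a 2x2 table (logistic setting, no follow-up time) versus
# a hazard ratio from events per person-time (survival setting).
# All numbers are invented for illustration.

# Logistic view: deaths vs survivors by lymph-node status
deaths_pos, alive_pos = 40, 160    # node-positive group
deaths_neg, alive_neg = 10, 190    # node-negative group
odds_ratio = (deaths_pos / alive_pos) / (deaths_neg / alive_neg)

# Survival view: the same events, but person-years of follow-up matter
py_pos, py_neg = 800.0, 980.0      # hypothetical person-years at risk
hazard_ratio = (deaths_pos / py_pos) / (deaths_neg / py_neg)

print(f"odds ratio   = {odds_ratio:.2f}")    # proportion-based
print(f"hazard ratio = {hazard_ratio:.2f}")  # rate-based
```

This makes the closing point concrete: the odds ratio compares proportions of an outcome, while the hazard ratio compares event rates per unit of follow-up time.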

Keywords: logistic regression model, Cox regression model, survival analysis, hazard ratio

Procedia PDF Downloads 438
24767 Evaluating Aquaculture Farmers Responses to Climate Change and Sustainable Practices in Kenya

Authors: Olalekan Adekola, Margaret Gatonye, Paul Orina

Abstract:

The growing demand for farmed fish in underdeveloped and developing countries, as a means of contributing to the eradication of hunger, food insecurity, and malnutrition for their fast-growing populations, has implications for the environment. Likewise, climate change poses both an immediate and a future threat to local fish production, with capture fisheries already in global decline. This raises fundamental questions not only about how aquaculture practices affect the environment but also about how ready aquaculture farmers are to adapt to climate-related hazards. This paper assesses existing aquaculture practices and approaches to adapting to climate hazards in Kenya, where aquaculture has grown rapidly since 2009. The growth has seen a rise in aquaculture setups mainly along rivers and streams, importation of seed and feed, and intensification, with possible environmental implications. The aquaculture value chain is further investigated in the context of climate change and its implications for practice, and the strategies necessary for improved implementation of a resilient aquaculture system in Kenya are examined. Data for the study were collected from interviews, questionnaires, two workshops, and document analysis. Despite the acclaimed nutritional benefits of fish consumption in Kenya, poor management of effluents enriched with nitrogen, phosphorus, organic matter, and suspended solids has implications not just for ecosystem goods and services but is also a potential source of resource-use conflicts, especially with downstream communities and operators in the livestock, horticulture, and industrial sectors. The study concluded that future orientation, climate-resilient infrastructure, appropriate site selection, and investment in biosafety are the key sustainable strategies against climate hazards.

Keywords: aquaculture, resilience, environment, strategies, Kenya

Procedia PDF Downloads 149
24766 Text Mining of Twitter Data Using a Latent Dirichlet Allocation Topic Model and Sentiment Analysis

Authors: Sidi Yang, Haiyi Zhang

Abstract:

Twitter is a microblogging platform where millions of users share their attitudes, views, and opinions daily. Using a probabilistic Latent Dirichlet Allocation (LDA) topic model to discern the most popular topics in Twitter data is an effective way to analyze a large set of tweets and find a set of topics in a computationally efficient manner. Sentiment analysis provides an effective method for revealing the emotions and sentiments in each tweet and an efficient way to summarize the results in a clearly understandable manner. The primary goal of this paper is to explore text mining, extracting and analyzing useful information from unstructured English-language Twitter text using two approaches: LDA topic modelling and sentiment analysis. These two methods allow people to mine data more effectively and efficiently. The LDA topic model and sentiment analysis can also be applied to provide insights in business and scientific fields.

Keywords: text mining, Twitter, topic model, sentiment analysis

Procedia PDF Downloads 161
24765 Navigating Government Finance Statistics: Effortless Retrieval and Comparative Analysis through Data Science and Machine Learning

Authors: Kwaku Damoah

Abstract:

This paper presents a methodology and software application (App) designed to empower users in accessing, retrieving, and comparatively exploring data within the hierarchical network framework of the Government Finance Statistics (GFS) system. It examines the difficulty of navigating the GFS system and identifies the gaps filled by the new methodology and App. The GFS embodies a complex Hierarchical Network Classification (HNC) structure, encapsulating institutional units, revenues, expenses, assets, liabilities, and economic activities. Navigating this structure demands specialized knowledge, experience, and skill, posing a significant challenge for effective analytics and fiscal policy decision-making. Many professionals have difficulty deciphering these classifications, which hinders confident use of the system. This accessibility barrier keeps a vast number of professionals, students, policymakers, and the public from leveraging the abundant data and information within the GFS. Using the R programming language, data science analytics, and machine learning, an efficient methodology was developed that enables users to access, navigate, and conduct exploratory comparisons. The machine learning Fiscal Analytics App (FLOWZZ) democratizes access to advanced analytics through its user-friendly interface, breaking down expertise barriers.
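Although the paper's App is built in R, the drilldown idea over a hierarchical classification can be sketched in a few lines of Python. The codes and amounts below are simplified stand-ins, not official GFS data:

```python
# Drilldown over a GFS-style hierarchical classification: aggregate leaf
# amounts upward and print every level. Codes and amounts are invented.
gfs = {
    "1 Revenue": {
        "11 Taxes": {"111 Income taxes": 120.0, "114 Taxes on goods": 80.0},
        "13 Grants": {"131 From foreign governments": 15.0},
    },
    "2 Expense": {
        "21 Compensation of employees": {"211 Wages": 90.0},
    },
}

def total(node):
    """Sum all leaf amounts under a node of the hierarchy."""
    if isinstance(node, dict):
        return sum(total(child) for child in node.values())
    return node

def drill(node, depth=0):
    """Print every classification level with its aggregated total."""
    for name, child in node.items():
        print("  " * depth + f"{name}: {total(child):.1f}")
        if isinstance(child, dict):
            drill(child, depth + 1)

drill(gfs)
```

A user can then compare any two branches (e.g., total revenue against total expense) without knowing the classification codes by heart, which is the accessibility gap the App targets.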

Keywords: data science, data wrangling, drilldown analytics, government finance statistics, hierarchical network classification, machine learning, web application

Procedia PDF Downloads 49
24764 Revealing Insights into the Mechanisms of Biofilm Adhesion on Surfaces in Crude Oil Environments

Authors: Hadjer Didouh, Mohammed Hadj Meliani, Izzaddine Sameut Bouhaik

Abstract:

This study employs a multidisciplinary approach to investigate the intricate processes governing biofilm-surface interactions. Results indicate that surface properties significantly influence initial microbial attachment, with materials characterized by increased roughness and hydrophobicity promoting enhanced biofilm adhesion. Moreover, the chemical composition of materials plays a crucial role in impacting the development of biofilms. Environmental factors, such as temperature fluctuations and nutrient availability, were identified as key determinants affecting biofilm formation dynamics. Advanced imaging techniques revealed complex three-dimensional biofilm structures, emphasizing microbial communication and cooperation within these networks. These findings offer practical implications for industries operating in crude oil environments, guiding the selection and design of materials to mitigate biofilm-related challenges and enhance operational efficiency in such settings.

Keywords: biofilm adhesion, surface properties, crude oil environments, microbial interactions, multidisciplinary investigation

Procedia PDF Downloads 56
24763 Evaluation of Azo Dye Toxicity Using Some Haematological and Histopathological Alterations in Fish Catla Catla

Authors: Jagruti Barot

Abstract:

The textile industry plays a major role in the economy of India, yet it is also a major source of water pollution. Azo dyes form the largest dye class and are extensively used in many fields, such as the textile industry, leather tanning, paper production, food, colour photography, pharmaceuticals and medicine, cosmetics, hair colouring, wood staining, and agricultural, biological, and chemical research. In addition, they can have acute and/or chronic effects on organisms, depending on their concentration and length of exposure, when discharged as effluent into the environment. The aim of this study was to assess the genotoxic and histotoxic potential of environmentally relevant concentrations of RR 120 on Catla catla, an important edible freshwater fish, at the fingerling stage. Healthy Catla catla fingerlings were procured from the Government Fish Farm and acclimatized for 15 days in a continuously aerated 100 L glass aquarium in the laboratory. Physico-chemical parameters such as temperature, pH, dissolved oxygen, alkalinity, and total hardness were measured and maintained according to APHA guidelines. The water, along with excreta, was changed every 24 hrs. All fingerlings were fed artificial food pellets once a day in proportion to body weight. After 15 days, the fingerlings were divided into 5 groups (10 in each) and exposed to various concentrations of RR 120 (control, 10, 20, 30, and 40 mg/L), and samples (peripheral blood, gills, and kidney) were collected and analyzed at 96 hr intervals. All results were compared with the control. Micronuclei (MN), nuclear buds (NB), fragmented-apoptotic (FA), and bi-nucleated (BN) cells were observed in blood cells and in gill and kidney tissues. Prominent histopathological alterations were noticed in the gills, such as aneurysm, hyperplasia, degenerated central axis, lifting of the gill epithelium, and curved secondary gill lamellae. Similarly, the kidney showed detrimental changes such as shrunken glomeruli with increased periglomerular space and degenerated renal tubules. Both the haematological and histopathological changes clearly reveal the toxic potential of RR 120. This work concludes that water pollution assessment can be performed with these two biomarkers, providing a baseline for further chromosomal or molecular work.

Keywords: micronuclei, genotoxicity, RR 120, Catla catla

Procedia PDF Downloads 196
24762 Effect on Occupational Health Safety and Environment at Work from Metal Handicraft Using Rattanakosin Local Wisdom

Authors: Witthaya Mekhum, Waleerak Sittisom

Abstract:

This research investigated the effects on occupational health, safety, and the work environment of metal handicraft using Rattanakosin local wisdom, focusing on pollution, accidents, and injuries from work. The sample group included 48 metal handicraft workers in 5 communities; data were collected using questionnaires and interviews. The TISI 18001 evaluation form was used for job safety analysis (JSA). The results showed that work risk was reduced after applying the developed model. Banbu Community produces alloy bowls rubbed with stone; the high-risk processes are melting and hitting. Before the application, the work risk was 82.71%; after applying the developed model, it was reduced to 50.61%. Banbart Community produces monks' food bowls; the high-risk process is blow-pipe welding. Before the application, the work risk was 93.59%; afterwards, it was reduced to 48.14%. Bannoen Community produces circle gongs; the high-risk process is milling. Before the application, the work risk was 85.18%; afterwards, it was reduced to 46.91%. Teethong Community produces gold leaf; the high-risk processes are hitting and spreading. Before the application, the work risk was 86.42%; afterwards, it was reduced to 64.19%. Ban Changthong Community produces gold ornaments; the high-risk process is gold melting. Before the application, the work risk was 67.90%; afterwards, it was reduced to 37.03%. With the application of the developed model, the work risk of the 5 communities was reduced in 3 main areas: (1) work illness, reduced by 16.77%; (2) pollution from work, reduced by 10.31%; and (3) accidents and injuries from work, reduced by 15.62%.
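The before/after figures quoted above can be recomputed directly as percentage-point reductions per community; a minimal sketch:

```python
# Recomputing the per-community risk reductions from the before/after
# work-risk percentages reported in the abstract.
risk = {  # community: (risk before, risk after), in percent
    "Banbu":          (82.71, 50.61),
    "Banbart":        (93.59, 48.14),
    "Bannoen":        (85.18, 46.91),
    "Teethong":       (86.42, 64.19),
    "Ban Changthong": (67.90, 37.03),
}
reductions = {name: before - after for name, (before, after) in risk.items()}
for name, drop in reductions.items():
    print(f"{name}: {drop:.2f} percentage-point reduction")
```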

Keywords: occupational health, safety, local wisdom, Rattanakosin

Procedia PDF Downloads 430