Search results for: data analyses
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 27087

26157 Efficacy of Plant and Mushroom Based Bio-Products against the Red Poultry Mite, Dermanyssus gallinae (Mesostigmata: Dermanyssidae)

Authors: Muhammad Asif Qayyoum, Bilal Saeed Khan

Abstract:

Poultry red mites (Dermanyssus gallinae De Geer) are economically damaging parasites of hens in the poultry industry worldwide. Owing to the lack of proper control management and the poor application of commercial products, D. gallinae has developed resistance and causes severe infestations in poultry flocks. A laboratory experiment was designed to control D. gallinae using different mushroom and plant extracts. We used a control treatment (100 ml distilled water) and nine treatments (10 g Lentinula edodes, Ganoderma lucidum and Pleurotus eryngii each in 100 ml methanol; 1% and 2% Neemazal; 1.5% Gamma-T-ol; Echinacea leaf; 1.5% Fungatol with neem spray; and methanol alone), with five replications of five mites each. Data were collected every 12 and 24 hours until the mites in every treatment were found dead. Significant differences among mean values were compared with Duncan's multiple range test, and the efficacy (%) of each treatment was determined with Abbott's formula. All statistical analyses were conducted with SPSS version 12. Lentinula edodes (80%), Ganoderma lucidum (76%) and Fungatol + neem spray (1.5%) (80%) were significantly effective against D. gallinae within 3 days.
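Abbott's correction can be sketched as follows; the mortality counts below are hypothetical placeholders, not the study's data:

```python
def abbott_efficacy(treatment_mortality_pct, control_mortality_pct):
    """Abbott's formula: treatment efficacy corrected for natural (control) mortality."""
    return (treatment_mortality_pct - control_mortality_pct) / \
           (100.0 - control_mortality_pct) * 100.0

# Hypothetical counts: 5 mites per replicate x 5 replicates = 25 mites per treatment
control_dead, treated_dead, n = 2, 21, 25
efficacy = abbott_efficacy(treated_dead / n * 100, control_dead / n * 100)
print(f"Corrected efficacy: {efficacy:.1f}%")
```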

Keywords: mushroom extracts, plant extracts, D. gallinae, control

Procedia PDF Downloads 303
26156 Big Data Analytics and Data Security in the Cloud via Fully Homomorphic Encryption Scheme

Authors: Victor Onomza Waziri, John K. Alhassan, Idris Ismaila, Noel Dogonyara

Abstract:

This paper describes the problem of building secure computational services for encrypted information in the Cloud: computing without decrypting the encrypted data. This meets the aspiration for computational encryption models that enhance the security of big data with respect to privacy, confidentiality, availability and integrity of the data and of the user. The cryptographic model applied for computing on the encrypted data is the Fully Homomorphic Encryption Scheme. We contribute a theoretical presentation of high-level computational processes based on number theory, derivable from abstract algebra, which can easily be integrated and leveraged in the Cloud computing interface, together with detailed theoretic mathematical concepts for fully homomorphic encryption models. This contribution supports the full implementation of big data analytics based on a cryptographic security algorithm.
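To illustrate computing on ciphertexts without decryption, a toy additively homomorphic (Paillier-style) sketch is shown below; fully homomorphic schemes extend this idea to arbitrary computations. The tiny primes are for readability only:

```python
import random
from math import gcd

# Toy Paillier cryptosystem: additively homomorphic, a simpler relative of
# fully homomorphic schemes, shown only to illustrate computing on encrypted
# data. Real deployments use moduli of 2048 bits or more.
p, q = 17, 19
n, n2 = p * q, (p * q) ** 2
g = n + 1
lam = (p - 1) * (q - 1) // gcd(p - 1, q - 1)      # lcm(p-1, q-1)

def L(u):
    return (u - 1) // n

mu = pow(L(pow(g, lam, n2)), -1, n)               # modular inverse (Python 3.8+)

def encrypt(m):
    r = random.randrange(1, n)
    while gcd(r, n) != 1:
        r = random.randrange(1, n)
    return (pow(g, m, n2) * pow(r, n, n2)) % n2

def decrypt(c):
    return (L(pow(c, lam, n2)) * mu) % n

# Multiplying ciphertexts adds the plaintexts (mod n) -- no decryption needed
c1, c2 = encrypt(12), encrypt(30)
print(decrypt((c1 * c2) % n2))  # 42
```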

Keywords: big data analytics, security, privacy, bootstrapping, Fully Homomorphic Encryption Scheme

Procedia PDF Downloads 477
26155 Complex Learning Tasks and Their Impact on Cognitive Engagement for Undergraduate Engineering Students

Authors: Anastassis Kozanitis, Diane Leduc, Alain Stockless

Abstract:

This paper presents preliminary results from a two-year funded research program seeking to analyze and understand the relationship between high cognitive engagement, the higher order cognitive processes employed in complex learning tasks, and the use of active learning pedagogies in undergraduate engineering programs. A mixed method approach was used to gauge student engagement and students' cognitive processes when accomplishing complex tasks. Quantitative data collected from the self-report cognitive engagement scale show that a deep learning approach is positively correlated with high levels of complex learning tasks and with the level of student engagement in the context of classroom active learning pedagogies. Qualitative analyses of in-depth face-to-face interviews reveal insights into the mechanisms influencing students' cognitive processes when confronted with open-ended problem resolution. Findings also support evidence that students adjust their level of cognitive engagement according to the specific didactic environment.

Keywords: cognitive engagement, deep and shallow strategies, engineering programs, higher order cognitive processes

Procedia PDF Downloads 319
26154 The Impact of Board Structure on the Roles of the Board of Commissioners in Implementing Good Corporate Governance at Indonesian State-Owned Enterprises

Authors: Synthia Atas Sari, Engkos Achmad Kuncoro, Haryadi Sarjono

Abstract:

The purpose of this paper is to examine the impact of the reward system determined by the government on the work of boards of commissioners in implementing good corporate governance in Indonesian state-owned enterprises. To do so, this study analyses the adequacy of the remuneration, the attractiveness of the job, and board commitment and dedication under the remuneration system. A qualitative method was used to examine the significant features of, and challenges to, government policy on determining remuneration for boards of commissioners in relation to their roles. Data were gathered through semi-structured in-depth interviews with 21 participants across 10 Indonesian state-owned enterprises and through written documents. The findings indicate that government policies on the remuneration system are not effective in increasing the performance of boards of commissioners in implementing good corporate governance in Indonesian state-owned enterprises, owing to the unattractive remuneration amounts, the demotivation of active members, and conflicts of interest among members of the remuneration committee.

Keywords: reward system, board of commissioners, state-owned enterprises, good corporate governance

Procedia PDF Downloads 377
26153 Effect of PGPB Inoculation, Addition of Biochar and Mineral N Fertilization on Mycorrhizal Colonization

Authors: Irina Mikajlo, Jaroslav Záhora, Helena Dvořáčková, Jaroslav Hynšt, Jakub Elbl

Abstract:

Strong anthropogenic impact has uncontrolled consequences for the nature of the soil. Hence, up-to-date sustainable methods of improving soil condition are essential. Investigators have provided evidence that biochar can positively affect the physical, chemical and biological properties of soil and the abundance of mycorrhizal fungi, which are the focus of this study. The main aim of the present investigation is to demonstrate the effect of two types of plant growth promoting bacteria (PGPB) inoculum, along with beech wood biochar and mineral N additives, on mycorrhizal colonization. The experiment was set up under laboratory conditions in containers filled with arable soil from the protection zone of the main water source 'Brezova nad Svitavou'. Lactuca sativa (lettuce) was selected as the model plant. Based on the data obtained, it can be concluded that mycorrhizal colonization increased as the result of the combined influence of the biochar and PGPB inoculum amendments. In addition, correlation analyses showed that the numbers of the main groups of cultivated bacteria depended on the degree of mycorrhizal colonization.

Keywords: Arbuscular mycorrhiza, biochar, PGPB inoculum, soil microorganisms

Procedia PDF Downloads 248
26152 ACTN3 R577X Polymorphism in Romany Children from Eastern Slovakia

Authors: Jarmila Bernasovska, Pavel Ružbarský, Ivan Bernasovsky, Regina Lohajová Behulová

Abstract:

The paper presents the results of applying molecular genetics methods in sport research, with special emphasis on the most advanced methods and trends in diagnosing motoric predispositions for the purpose of identifying talented children. Genetic tests differ in principle from traditional motoric tests, because the DNA of an individual does not change during life. Genetics is important in determining the capacity of an individual and suitability for professional-level sport. Genetic information can be used to assess individual predispositions in early childhood. Phenotypes are influenced by a combination of genetic and environmental factors. The aim of the presented study was to examine physical condition, coordination skills and motoric docility, to determine the frequency of the ACTN3 (R577X) gene in Romany children from Eastern Slovakia, and to compare their motoric performance with that of non-Romany children. This paper does not look at performance alone, but also at its association with genetic predispositions in relation to the ACTN3 gene and its R577X polymorphism. Genotype data were obtained from 175 Romany children aged 6 to 15 years and 218 non-Romany children of the same age from Eastern Slovakia. The biological material for genetic analyses comprised buccal swab samples. Genotypes were determined using real-time high resolution melting PCR (Rotor-Gene 6000 Corbett and LightCycler 480 Roche). Romany children in the analyzed group lagged behind non-Romany children of the same age in all the compared tests. The percentage distribution of R and X alleles in the children differed from the controls. The frequency of the XX genotype was 11.45%, which is comparable to the frequency in an Indian population. Data were analysed with the ANOVA statistical programme and with parametric and nonparametric tests. This work was supported by grants APVV-0716-10, ITMS 26220120023 and ITMS 26220120041.
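Allele frequencies of the kind reported here can be computed from genotype counts as sketched below; the counts are hypothetical, chosen only to roughly reproduce the reported XX genotype frequency:

```python
# Allele frequencies from ACTN3 R577X genotype counts. The genotype counts
# below are hypothetical placeholders, not the study's data.
def allele_frequencies(n_RR, n_RX, n_XX):
    """Return (freq_R, freq_X) from genotype counts: each person carries 2 alleles."""
    total_alleles = 2 * (n_RR + n_RX + n_XX)
    freq_R = (2 * n_RR + n_RX) / total_alleles
    freq_X = (2 * n_XX + n_RX) / total_alleles
    return freq_R, freq_X

fR, fX = allele_frequencies(n_RR=60, n_RX=95, n_XX=20)   # 175 children total
xx_genotype_pct = 20 / 175 * 100                          # ~11.4% XX genotype
print(f"R = {fR:.3f}, X = {fX:.3f}, XX genotype = {xx_genotype_pct:.2f}%")
```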

Keywords: ACTN3 gene, R577X polymorphism, Romany children, sport performance, Slovakia

Procedia PDF Downloads 450
26151 An Approximation of Daily Rainfall by Using a Pixel Value Data Approach

Authors: Sarisa Pinkham, Kanyarat Bussaban

Abstract:

This research aims to approximate the amount of daily rainfall using a pixel value data approach. The daily rainfall maps from the Thailand Meteorological Department for the period January to December 2013 were the data used in this study. The results showed that this approach can approximate the amount of daily rainfall with an RMSE of 3.343.
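The RMSE criterion used above can be computed as in the following sketch; the rainfall values are illustrative, not the study's data:

```python
import math

def rmse(observed, predicted):
    """Root mean square error between observed and predicted values."""
    assert len(observed) == len(predicted)
    return math.sqrt(sum((o - p) ** 2 for o, p in zip(observed, predicted))
                     / len(observed))

# Hypothetical gauge readings vs. pixel-value estimates (mm of rainfall)
observed  = [0.0, 12.5, 3.2, 45.0, 7.8]
predicted = [1.1, 10.0, 5.0, 41.2, 6.5]
print(f"RMSE = {rmse(observed, predicted):.3f} mm")
```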

Keywords: daily rainfall, image processing, approximation, pixel value data

Procedia PDF Downloads 383
26150 A Next-Generation Blockchain-Based Data Platform: Leveraging Decentralized Storage and Layer 2 Scaling for Secure Data Management

Authors: Kenneth Harper

Abstract:

The rapid growth of data-driven decision-making across various industries necessitates advanced solutions to ensure data integrity, scalability, and security. This study introduces a decentralized data platform built on blockchain technology to improve data management processes in high-volume environments such as healthcare and financial services. The platform integrates blockchain networks using the Cosmos SDK and Polkadot Substrate alongside decentralized storage solutions like IPFS and Filecoin, coupled with decentralized computing infrastructure built on top of Avalanche. By leveraging advanced consensus mechanisms, we create a scalable, tamper-proof architecture that supports both structured and unstructured data. Key features include secure data ingestion, cryptographic hashing for robust data lineage, and Zero-Knowledge Proof mechanisms that enhance privacy while ensuring compliance with regulatory standards. Additionally, we implement performance optimizations through Layer 2 scaling solutions, including ZK-Rollups, which provide low-latency data access and trustless data verification across a distributed ledger. The findings demonstrate significant improvements in data accessibility, reduced operational costs, and enhanced data integrity when tested in real-world scenarios. This platform reference architecture offers a decentralized alternative to traditional centralized data storage models, providing scalability, security, and operational efficiency.
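Hash-based data lineage of the kind mentioned above can be sketched with a simple hash chain; the record schema is our assumption for illustration, not the platform's actual design:

```python
import hashlib
import json

# Each record embeds the hash of its predecessor, so tampering with any
# earlier record breaks every later link -- the core idea behind
# cryptographic data lineage on a ledger.
def add_record(chain, payload):
    prev_hash = chain[-1]["hash"] if chain else "0" * 64
    body = {"payload": payload, "prev_hash": prev_hash}
    digest = hashlib.sha256(json.dumps(body, sort_keys=True).encode()).hexdigest()
    chain.append({**body, "hash": digest})

def verify(chain):
    for i, rec in enumerate(chain):
        body = {"payload": rec["payload"], "prev_hash": rec["prev_hash"]}
        expected = hashlib.sha256(json.dumps(body, sort_keys=True).encode()).hexdigest()
        if rec["hash"] != expected:
            return False                          # record was altered
        if i > 0 and rec["prev_hash"] != chain[i - 1]["hash"]:
            return False                          # chain link broken
    return True

chain = []
add_record(chain, {"event": "ingest", "source": "sensor-42"})
add_record(chain, {"event": "transform", "op": "normalize"})
print(verify(chain))  # True; mutating any payload makes verify() return False
```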

Keywords: blockchain, cosmos SDK, decentralized data platform, IPFS, ZK-Rollups

Procedia PDF Downloads 12
26149 Gaze Behaviour of Individuals with and without Intellectual Disability for Nonaccidental and Metric Shape Properties

Authors: S. Haider, B. Bhushan

Abstract:

The eye gaze behaviour of individuals with and without intellectual disability was investigated in an eye tracking study in terms of sensitivity to nonaccidental (NAP) and metric (MP) shape properties. Total fixation time was used as an indirect measure of attention allocation. Previous studies have found mean reaction times for nonaccidental properties to be shorter than for metric properties when the MP and NAP differences were equalized. Methods: Twenty-five individuals with intellectual disability (mild and moderate levels of mental retardation) and twenty-seven normal individuals were compared on mean total fixation duration, accuracy and mean reaction time for mild NAP, extreme NAP and metric image properties. 2D images of cylinders were adapted and made into forced-choice match-to-sample tasks. A Tobii TX300 eye tracker was used to record total fixation duration, with data obtained from the Areas of Interest (AOI). Variable trial duration (each participant's total reaction time) and fixed trial duration (data taken at each second from one to fifteen seconds) were used for the analyses. The groups did not differ in fixation times (fixed or variable) across any of the three image manipulations, but did differ in reaction time and accuracy. Normal individuals had longer reaction times than individuals with intellectual disability across all image types. The groups also differed significantly in accuracy across all image types, with normal individuals performing better on all three. Mild NAP vs. metric differences: there was a significant difference between mild NAP and metric image properties in reaction times. Mild NAP images had significantly longer reaction times than metric images for normal individuals, but this difference was not found for individuals with intellectual disability. Mild NAP images yielded significantly better accuracy than metric images in both groups. In conclusion, the type of image manipulation did not result in differences in attention allocation for individuals with or without intellectual disability. Mild nonaccidental properties facilitate better accuracy than metric properties in both groups, but the advantage in mean reaction time is seen only in the normal group.

Keywords: eye gaze fixations, eye movements, intellectual disability, stimulus properties

Procedia PDF Downloads 551
26148 Fluoride-Induced Stress and Its Association with Bone Developmental Pathway in Osteosarcoma Cells

Authors: Deepa Gandhi, Pravin K. Naoghare, Amit Bafana, Krishnamurthi Kannan, Saravanadevi Sivanesana

Abstract:

Oxidative stress is known to impair the normal functioning of osteoblast cells. The present study reports oxidative/inflammatory signatures in fluoride-exposed human osteosarcoma (HOS) cells and their possible association with genes involved in the bone developmental pathway. Microarray analysis was performed to understand the possible molecular mechanisms of stress-mediated bone loss in HOS cells. Cells were chronically exposed to a sub-lethal concentration of fluoride. Global gene expression profiling revealed 34 up-regulated and 2598 down-regulated genes, which were associated with several biological processes including bone development, osteoblast differentiation, stress response, inflammatory response, apoptosis and regulation of cell proliferation. The microarray data were further validated through qRT-PCR and western blot analyses using key representative genes. Based on these findings, it can be proposed that chronic exposure to fluoride may impair bone development via oxidative and inflammatory stress. The present findings also provide important biological clues that will be helpful for developing therapeutic targets against bone-related diseases.

Keywords: bone, HOS cells, microarray, stress

Procedia PDF Downloads 372
26147 The Effect of Measurement Distribution on System Identification and Detection of Behavior of Nonlinearities of Data

Authors: Mohammad Javad Mollakazemi, Farhad Asadi, Aref Ghafouri

Abstract:

In this paper, we consider and apply parametric modeling to experimental data from a dynamical system. We investigated the different distributions of output measurements from several dynamical systems. By applying variance processing to the experimental data, we obtained the region of nonlinearity in the data, and identification of the output section was then applied under different situations and data distributions. Finally, the effect of the spread of the measurements, such as the variance, on identification, and the limitations of this approach, are explained.
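One plausible reading of the variance-processing step is a sliding-window variance scan, sketched below on synthetic data; the threshold rule is our assumption, not the authors' exact procedure:

```python
import random

# Sketch of variance processing: slide a window over the output measurements
# and flag windows whose variance exceeds a multiple of the median window
# variance as candidate nonlinearity regions.
def sliding_variance(y, window):
    out = []
    for i in range(len(y) - window + 1):
        seg = y[i:i + window]
        mean = sum(seg) / window
        out.append(sum((v - mean) ** 2 for v in seg) / window)
    return out

def nonlinear_regions(y, window=20, factor=3.0):
    var = sliding_variance(y, window)
    baseline = sorted(var)[len(var) // 2]   # median window variance
    return [i for i, v in enumerate(var) if v > factor * baseline]

# Synthetic output: a quiet, near-linear regime followed by a noisy regime
random.seed(0)
y = [0.05 * random.gauss(0, 1) for _ in range(150)] + \
    [random.gauss(0, 1) for _ in range(50)]
regions = nonlinear_regions(y)
print(f"{len(regions)} windows flagged, first at window {min(regions)}")
```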

Keywords: Gaussian process, nonlinearity distribution, particle filter, system identification

Procedia PDF Downloads 507
26146 Building a Scalable Telemetry Based Multiclass Predictive Maintenance Model in R

Authors: Jaya Mathew

Abstract:

Many organizations are faced with the challenge of how to analyze and build machine learning models using their sensitive telemetry data. In this paper, we discuss how users can leverage the power of R without having to move their big data around, as well as a cloud-based solution for organizations willing to host their data in the cloud. By using ScaleR technology to benefit from parallelization and remote computing, or R Services on premise or in the cloud, users can leverage the power of R at scale without having to move their data.

Keywords: predictive maintenance, machine learning, big data, cloud based, on premise solution, R

Procedia PDF Downloads 370
26145 Trusting the Big Data Analytics Process from the Perspective of Different Stakeholders

Authors: Sven Gehrke, Johannes Ruhland

Abstract:

Data is the oil of our time; without it, progress would come to a halt [1]. On the other hand, mistrust of data mining is increasing [2]. The paper at hand presents different aspects of the concept of trust and describes the information asymmetry between the typical stakeholders of a data mining project using the CRISP-DM phase model. Based on the identified factors influencing trust, problematic aspects of the current approach are examined through interviews with the stakeholders. The results of the interviews confirm the theoretically identified weak points of the phase model with regard to trust and reveal potential research areas.

Keywords: trust, data mining, CRISP DM, stakeholder management

Procedia PDF Downloads 90
26144 Wireless Transmission of Big Data Using Novel Secure Algorithm

Authors: K. Thiagarajan, K. Saranya, A. Veeraiah, B. Sudha

Abstract:

This paper presents a novel algorithm for the secure, reliable and flexible transmission of big data in two-hop wireless networks using a cooperative jamming scheme. Two-hop wireless networks consist of source, relay and destination nodes. Big data has to be transmitted from source to relay and from relay to destination, with security deployed at the physical layer. The cooperative jamming scheme makes the transmission of big data more secure by protecting it from eavesdroppers and malicious nodes of unknown location. The novel algorithm, which ensures secure and energy-balanced transmission of big data, includes selecting the data transmitting region, segmenting the selected region, determining the probability ratio for each node (capture, non-capture and eavesdropper) in every segment, and evaluating the probability using a binary-based evaluation. If the transmission is secure, the two-hop transmission of big data resumes; otherwise, the attackers are countered by the cooperative jamming scheme and the data is then transmitted in two hops.
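The decision flow described above can be sketched as follows; the node labels, threshold and probability-ratio rule are illustrative assumptions, not the authors' exact algorithm:

```python
# Illustrative sketch of a per-segment binary security decision: a segment is
# treated as 'secure' if legitimate (capture + non-capture) nodes sufficiently
# outweigh eavesdroppers; otherwise cooperative jamming is applied first.
def segment_secure(nodes, threshold=0.5):
    if not nodes:
        return False
    eavesdroppers = sum(1 for kind in nodes if kind == "eavesdropper")
    legit_ratio = (len(nodes) - eavesdroppers) / len(nodes)
    return legit_ratio >= threshold

def transmit(region_segments):
    for seg_id, nodes in enumerate(region_segments):
        if segment_secure(nodes):
            print(f"segment {seg_id}: secure -> resume two-hop transmission")
        else:
            print(f"segment {seg_id}: at risk -> cooperative jamming, then transmit")

transmit([
    ["capture", "non-capture", "capture"],                       # secure
    ["capture", "eavesdropper", "eavesdropper", "non-capture"],  # jam first
])
```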

Keywords: big data, two-hop transmission, physical layer wireless security, cooperative jamming, energy balance

Procedia PDF Downloads 483
26143 COVID-19 Impact on Passenger and Cargo Traffic: A Case Study

Authors: Maja Čović, Josipa Bojčić, Bruna Bacalja, Gorana Jelić Mrčelić

Abstract:

The appearance of the COVID-19 disease and its fast spreading brought a global pandemic and health crisis. In order to prevent the further spreading of the virus, governments implemented mobility restriction rules, which left a negative mark on the world's economy. Although there is numerous research on the impact of COVID-19 on marine traffic around the world, the objective of this paper is to consider the impact of COVID-19 on passenger and cargo traffic in the Port of Split in the Republic of Croatia. The methods used in the theoretical and research parts of the paper are the descriptive, comparative, compilation, inductive, deductive and statistical methods. The paper relies on data obtained from the Port of Split Authority and analyses trends in passenger and cargo traffic, including the year 2020, when the pandemic broke out. Significant reductions in income and disruptions in transportation, traffic and other maritime services are shown in the paper. The article also observes a significant decline in passenger and cruise traffic, and examines the dynamics of cargo traffic inside the Port of Split.

Keywords: COVID-19, pandemic, passenger traffic, ports, trends, cargo traffic

Procedia PDF Downloads 211
26142 Attitude towards the Consumption of Social Media: Analyzing Young Consumers’ Travel Behavior

Authors: Farzana Sharmin, Mohammad Tipu Sultan, Benqian Li

Abstract:

The advancement of new media technology and the consumption of social media have altered the way of communication in the tourism industry, mostly in consumers' travel planning, online purchasing, and experience-sharing activities. There is an accelerating trend among young consumers to utilize this new media technology. This paper aims to analyze the attitudes of young consumers toward social media use for travel purposes. A convenience random sampling method was used to collect data from an urban area of Shanghai (China); the sample consists of 225 young consumers. The survey identified behavioral determinants of social media consumption using the extended theory of planned behavior (TPB). The instrument was developed with support from previous research to test the hypotheses. The results of the structural analyses indicate that attitude toward the use of social media is affected by external factors such as the availability and accessibility of technology. In addition, subjective norm and perceived behavioral control partially influenced respondents' attitudes. The results of this study could help to improve social media travel marketing and promotional strategies for the respective groups.

Keywords: social media, theory of planned behavior, travel behavior, young consumer

Procedia PDF Downloads 188
26141 Islamic Banking: A New Trend towards the Development of Banking Law

Authors: Inese Tenberga

Abstract:

Undoubtedly, the focus of the present capitalist system of finance has shifted from the concept of the productivity of money to the 'cult of money', which is characterized by such notions as speculative activity, squander, self-profit and vested interest. The author is certain that a civilized society cannot follow this economic path any longer and therefore suggests that one solution would be to integrate the Islamic financial model into the banking sector of the EU to overcome its economic vulnerability and structurally transform its economies, or to build resilience against shocks and crises. The researcher analyses the Islamic financial model, which provides the basis for the concept of the non-productivity of money, and proposes to consider it a new paradigm of economic thinking. The author argues that it seeks to establish broad-based economic well-being with an optimum rate of economic growth, socio-economic justice, and an equitable distribution of income and wealth. Furthermore, the author analyses and proposes to use the experience of the member states of the Islamic Development Bank in forming a new interest-free banking sector in the EU. It is proposed to create within the EU banking system a credit sector and an investment sector, respectively. As part of the latter, it is recommended to separate investment banks specializing in speculative investments from non-speculative investment banks. Meanwhile, understanding the idea of Islamic banking exclusively from the perspective of its manner of yielding profit, which differs from credit banking, without considering the legal, social and ethical guidelines of Islam, impedes an objective appraisal of the advantages of this type of financial activity in non-Islamic jurisdictions. However, the author comes to the conclusion that the imperative of justice and virtue, which is inherent in all of us, exists regardless of religion. The author concludes that the global community should adopt the experience of the Muslim countries and focus on the Islamic banking model.

Keywords: credit sector, EU banking system, investment sector, Islamic banking

Procedia PDF Downloads 171
26140 One Step Further: Pull-Process-Push Data Processing

Authors: Romeo Botes, Imelda Smit

Abstract:

In today's modern age of technology, vast amounts of data need to be processed in real time to keep users satisfied. This data comes from various sources and in many formats, including electronic and mobile devices such as GPRS modems and GPS devices. These devices use different protocols, including TCP, UDP and HTTP/S, to communicate data to web servers and eventually to users. The data obtained from these devices may provide valuable information to users, but it is mostly in an unreadable format that needs to be processed to yield information and business intelligence. This data is not always current; it is mostly historical, and it is not subject to the consistency and redundancy measures that most other data usually is. Most important to users is that the data be pre-processed into a readable format when it is entered into the database. To accomplish this, programmers build processing programs and scripts to decode and process the information stored in databases. Programmers use various techniques in such programs, but sometimes neglect the effect some of these techniques may have on database performance. One technique generally used is to pull data from the database server, process it and push it back to the database server in one single step. Since processing the data usually takes some time, this keeps the database busy and locked for the duration of the processing. As a consequence, it decreases the overall performance of the database server and therefore of the system. This paper follows on from a paper discussing the performance increase that may be achieved by utilizing array lists along with a pull-process-push data processing technique split into three steps. The purpose of this paper is to expand the number of clients when comparing the two techniques, to establish the impact this may have on CPU, storage and processing-time performance.
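The three-step pull-process-push pattern can be sketched as follows, with sqlite3 standing in for the production database and a toy decoding step:

```python
import sqlite3

# Pull-process-push in three steps: the database is touched only during the
# short pull and push phases; the (slow) decoding runs on an in-memory list,
# so no locks are held while it executes.
def pull(conn):
    # Step 1: pull raw rows into an array list, then release the cursor
    return list(conn.execute("SELECT id, raw FROM telemetry WHERE decoded IS NULL"))

def process(rows):
    # Step 2: decode outside the database -- toy "decoding" for illustration
    return [(raw.decode("ascii").upper(), rid) for rid, raw in rows]

def push(conn, decoded):
    # Step 3: write results back in one short transaction
    with conn:
        conn.executemany("UPDATE telemetry SET decoded = ? WHERE id = ?", decoded)

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE telemetry (id INTEGER PRIMARY KEY, raw BLOB, decoded TEXT)")
conn.executemany("INSERT INTO telemetry (raw) VALUES (?)", [(b"gprs",), (b"gps",)])
push(conn, process(pull(conn)))
print(conn.execute("SELECT decoded FROM telemetry ORDER BY id").fetchall())
# [('GPRS',), ('GPS',)]
```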

Keywords: performance measures, algorithm techniques, data processing, push data, process data, array list

Procedia PDF Downloads 238
26139 Extreme Temperature Forecast in Mbonge, Cameroon Through Return Level Analysis of the Generalized Extreme Value (GEV) Distribution

Authors: Nkongho Ayuketang Arreyndip, Ebobenow Joseph

Abstract:

In this paper, temperature extremes are forecast by employing the block maxima method of the generalized extreme value (GEV) distribution to analyse temperature data from the Cameroon Development Corporation (CDC). Considering two data sets (raw and simulated) and two models of the GEV distribution (stationary and non-stationary), a return level analysis is carried out. It was found that in the stationary model the return values are constant over time for the raw data, while for the simulated data the return values show an increasing trend with an upper bound. In the non-stationary model, the return levels of both the raw and simulated data show an increasing trend with an upper bound. This clearly shows that although temperatures in the tropics show a sign of increasing in the future, there is a maximum temperature that is never exceeded. The results of this paper are very useful in agricultural and environmental research.
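A return level analysis of block maxima can be sketched with scipy's GEV implementation; the annual maxima below are synthetic stand-ins, not the CDC series used in the paper:

```python
import numpy as np
from scipy.stats import genextreme

# Block maxima method: fit a GEV distribution to annual maximum temperatures,
# then read the T-year return level off the fitted quantile function.
rng = np.random.default_rng(42)
annual_max_temp = 33 + rng.gumbel(0, 1.2, size=40)   # 40 synthetic block maxima

shape, loc, scale = genextreme.fit(annual_max_temp)
T = 50                                               # 50-year return period
return_level = genextreme.ppf(1 - 1 / T, shape, loc, scale)
print(f"{T}-year return level: {return_level:.1f} degrees C")
```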

Keywords: forecasting, generalized extreme value (GEV), meteorology, return level

Procedia PDF Downloads 475
26138 Impact of Stack Caches: Locality Awareness and Cost Effectiveness

Authors: Abdulrahman K. Alshegaifi, Chun-Hsi Huang

Abstract:

Treating data according to its location in memory has received much attention in recent years because of the different properties of such data, which offer important opportunities for cache utilization. Stack data and non-stack data may interfere with each other's locality in the data cache. An important characteristic of stack data is its high spatial and temporal locality. In this work, we simulate a non-unified cache design that splits the data cache into stack and non-stack caches in order to keep stack data and non-stack data separate. We observe that the overall hit rate of the non-unified cache design is sensitive to the size of the non-stack cache. We then investigate the appropriate size and associativity for the stack cache to achieve a high hit ratio, especially when over 99% of accesses are directed to the stack cache. The result shows that, on average, more than 99% stack cache accuracy is achieved with 2 KB of capacity and 1-way associativity. Further, we analyze the improvement in hit rate when adding a small, fixed-size stack cache at level 1 to a unified cache architecture. The result shows that the overall hit rate of the unified cache design with an added 1 KB stack cache improves by approximately 3.9% on average for the Rijndael benchmark. The stack cache is simulated using the SimpleScalar toolset.
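A minimal set-associative cache model of the kind used in such simulations can be sketched as follows; the access trace is synthetic, while the 2 KB / 1-way configuration mirrors the figures above:

```python
# Minimal LRU set-associative cache model for measuring hit rate. The trace
# below is a synthetic stand-in for a stack-access pattern, not a
# SimpleScalar benchmark trace.
class Cache:
    def __init__(self, size_bytes, line_bytes=32, ways=1):
        self.ways = ways
        self.line = line_bytes
        self.num_sets = size_bytes // (line_bytes * ways)
        self.sets = [[] for _ in range(self.num_sets)]   # MRU tag at end
        self.hits = self.accesses = 0

    def access(self, addr):
        self.accesses += 1
        line_no = addr // self.line
        tag, s = line_no // self.num_sets, self.sets[line_no % self.num_sets]
        if tag in s:
            s.remove(tag)
            s.append(tag)                                # refresh LRU order
            self.hits += 1
        else:
            if len(s) >= self.ways:
                s.pop(0)                                 # evict LRU line
            s.append(tag)

    def hit_rate(self):
        return self.hits / self.accesses

stack = Cache(size_bytes=2048, ways=1)                   # 2 KB, direct-mapped
# Stack accesses are highly local: hammer a small window near the stack top
for _ in range(1000):
    for offset in range(0, 256, 4):
        stack.access(0x7FFF0000 + offset)
print(f"stack cache hit rate: {stack.hit_rate():.4f}")
```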

Keywords: hit rate, locality of program, stack cache, stack data

Procedia PDF Downloads 299
26137 Autonomic Threat Avoidance and Self-Healing in Database Management System

Authors: Wajahat Munir, Muhammad Haseeb, Adeel Anjum, Basit Raza, Ahmad Kamran Malik

Abstract:

Databases are key components of software systems. Due to the exponential growth of data, there is concern that data should be accurate and available. The data in databases is vulnerable to internal and external threats, especially when it contains sensitive data, as in medical or military applications. Whenever the data is changed with malicious intent, data analysis results may lead to disastrous decisions. Autonomic self-healing is modeled on computer systems after taking inspiration from the autonomic system of the human body. In order to guarantee the accuracy and availability of data, we propose a technique that, on a priority basis, tries to prevent any malicious transaction from executing; in case a malicious transaction does affect the system, it heals the system in an isolated mode in such a way that availability is not compromised. Using this autonomic system, the management cost and time of DBAs can be minimized. Finally, we test our model and present the findings.

Keywords: autonomic computing, self-healing, threat avoidance, security

Procedia PDF Downloads 502
26136 Information Extraction Based on Search Engine Results

Authors: Mohammed R. Elkobaisi, Abdelsalam Maatuk

Abstract:

Search engines are large-scale information retrieval tools for the Web that are currently freely available to all. This paper explains how to convert the raw results returned by search engines into useful information, which represents a new method of data gathering compared with traditional methods. When queries are submitted for multiple keywords, this takes a long time and much effort; hence we developed a user interface program that searches automatically, taking multiple keywords at the same time, and leaves the program to collect the wanted data automatically. The collected raw data is processed using mathematical and statistical theories to eliminate unwanted data and convert it into usable data.

Keywords: search engines, information extraction, agent system

Procedia PDF Downloads 423
26135 Predictability of Thermal Response in Housing: A Case Study in Australia, Adelaide

Authors: Mina Rouhollahi, J. Boland

Abstract:

Changes in cities’ heat balance due to rapid urbanization and the urban heat island (UHI) have increased energy demands for space cooling and have resulted in uncomfortable living conditions for urban residents. Climate resilience and comfortable living spaces can be addressed through well-designed urban development, and sustainable housing can be effective in controlling high levels of urban heat. In Australia, to mitigate the effects of UHIs and summer heat waves, one approach to sustainable housing has been the trend toward compact housing design and the construction of energy-efficient dwellings. This paper analyses whether current housing configurations and orientations are effective in avoiding increased demand for air conditioning and in achieving an energy-efficient residential neighborhood. A significant amount of energy is consumed to ensure thermal comfort in houses. This paper reports on the modelling of heat transfer within homes using measurements of radiation, convection, and conduction between exterior/interior wall surfaces and the outdoor/indoor environment, respectively. The simulation was tested on selected 7.5-star energy-efficient houses constructed of typical material elements and insulation in Adelaide, Australia. The chosen dwellings were analyzed over one year, including periods of extremely hot weather. The data were obtained via a thermal circuit to accurately model the fundamental heat transfer mechanisms on both boundaries of the house and through the multi-layered wall configurations. The lumped capacitance model was formulated in discrete time steps using a non-linear method. The simulation results focused on the effects of solar radiation and orientation on the dynamic thermal characteristics of the houses. A high star rating did not necessarily coincide with a decrease in peak demand for cooling. A more effective approach to avoiding increased demand for air conditioning and energy may be to integrate solar–climatic data when evaluating the performance of energy-efficient houses.
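A minimal lumped-capacitance sketch of the discrete-time formulation (illustrative parameter values; the paper's model includes multi-layered walls and separate radiation/convection terms not shown here):

```python
# Illustrative lumped-capacitance zone model: interior air temperature is
# stepped forward in discrete time, with heat exchanged through a wall
# resistance and an orientation-dependent solar gain term.
# C * dT/dt = (T_out - T) / R + Q_solar, integrated by explicit Euler.

def simulate_zone(t_out, solar_gain, r_wall=0.05, c_zone=5e6,
                  t0=22.0, dt=3600.0):
    """t_out: outdoor temperatures (C) per step; solar_gain: W per step;
    r_wall: wall thermal resistance (K/W); c_zone: zone capacitance (J/K)."""
    t_in = t0
    trace = []
    for t_o, q_s in zip(t_out, solar_gain):
        t_in += ((t_o - t_in) / r_wall + q_s) * dt / c_zone
        trace.append(t_in)
    return trace
```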

Keywords: energy-efficient residential building, heat transfer, neighborhood orientation, solar–climatic data

Procedia PDF Downloads 128
26134 Contextualizing Household Food Security: A Comparison of Two Villages, Ambros and Maramanzhi, South Africa

Authors: Felicity Aphiwe Mkhongi, Walter Musakwa

Abstract:

Smallholder crop production is a defining factor in achieving food security, particularly at the household level. However, the number of abandoned arable fields is increasing in communal areas of South Africa. While substantial efforts have been devoted to addressing food insecurity in the country, ownership of arable land has not been supplemented with sustainable food production for households. This paper analyses household food security in the context of deagrarianization in two villages, Ambros (Eastern Cape) and Maramanzhi (Limpopo). Semi-structured questionnaires were administered to acquire both qualitative and quantitative data from 106 heads of households, and IBM SPSS Statistics 28.0 was used for data analysis. The findings show that, compared with arable fields, a greater proportion of households own home-gardens, with an average size of 2,100 m² in Ambros and 3,400 m² in Maramanzhi village. The majority of arable fields were abandoned, particularly in Ambros village. Household food access challenges were measured using the Household Food Insecurity Access Scale (HFIAS). This food security indicator revealed that the majority of households were mildly food insecure owing to food shortages stemming from insufficient monthly income and waning household crop production. Food was rated as a very important reason for engaging in cultivation in both villages, but deagrarianization has eroded opportunities for increasing household crop production. Among other possible solutions, this study recommends that the government invest more in agriculture to support sustainable strategies that revive abandoned arable fields in communal areas of South Africa, as this could increase food production for households.
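The HFIAS measure used above can be sketched as a score-and-category helper (hedged: the published HFIAS guide assigns categories from patterns across the nine questions; the score-based cut-offs below are a simplified illustration, not the official algorithm):

```python
# Simplified HFIAS sketch: nine occurrence questions, each with a
# frequency code 0-3, summed to a 0-27 score and mapped to a category.
# The cut-offs here are illustrative assumptions.

def hfias_score(frequencies):
    """frequencies: nine frequency codes, each 0 (never) to 3 (often)."""
    assert len(frequencies) == 9 and all(0 <= f <= 3 for f in frequencies)
    return sum(frequencies)

def hfias_category(score):
    if score == 0:
        return "food secure"
    if score <= 9:
        return "mildly food insecure"
    if score <= 18:
        return "moderately food insecure"
    return "severely food insecure"
```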

Keywords: cultivation, deagrarianization, food security, rural households, smallholder farmers

Procedia PDF Downloads 52
26133 Implementation and Performance Analysis of Data Encryption Standard and RSA Algorithm with Image Steganography and Audio Steganography

Authors: S. C. Sharma, Ankit Gambhir, Rajeev Arya

Abstract:

In today’s era, data security is an important concern and one of the most demanding issues, because it is essential for people using online banking, e-shopping, reservations, etc. The two major techniques used for secure communication are cryptography and steganography. Cryptographic algorithms scramble data so that an intruder will not be able to retrieve them, whereas steganography hides the data in a cover file so that the very presence of communication is concealed. This paper presents the implementation of the Ron Rivest, Adi Shamir, and Leonard Adleman (RSA) algorithm with image and audio steganography, and of the Data Encryption Standard (DES) algorithm with image and audio steganography. Both algorithms were implemented in MATLAB, and it was observed that the combined techniques performed better than either technique used individually. The risk of unauthorized access is alleviated to a certain extent by using these techniques, which could be used in banks, intelligence agencies such as RAW, and other settings where highly confidential data are transferred. Finally, the two techniques are compared in tabular form.
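A rough sketch of the combined encrypt-then-hide pipeline, under stated assumptions: a stand-in XOR cipher replaces DES/RSA (which require a crypto library), and the ciphertext bits are embedded in the least significant bits of image pixel values:

```python
# Illustrative encrypt-then-hide sketch. The XOR step is a toy stand-in
# for DES/RSA; LSB embedding hides ciphertext bits in pixel values so
# the presence of the communication itself is concealed.

def xor_encrypt(data: bytes, key: bytes) -> bytes:
    # Toy symmetric cipher (same call decrypts); NOT secure.
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(data))

def embed_lsb(pixels, payload: bytes):
    # Write payload bits (LSB-first per byte) into pixel LSBs.
    bits = [(byte >> i) & 1 for byte in payload for i in range(8)]
    assert len(bits) <= len(pixels), "cover image too small"
    out = list(pixels)
    for i, bit in enumerate(bits):
        out[i] = (out[i] & ~1) | bit
    return out

def extract_lsb(pixels, n_bytes: int) -> bytes:
    bits = [p & 1 for p in pixels[:n_bytes * 8]]
    return bytes(sum(bits[i * 8 + j] << j for j in range(8))
                 for i in range(n_bytes))
```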

Keywords: audio steganography, data security, DES, image steganography, intruder, RSA, steganography

Procedia PDF Downloads 284
26132 The Prognostic Prediction Value of Positive Lymph Nodes Numbers for the Hypopharyngeal Squamous Cell Carcinoma

Authors: Wendu Pang, Yaxin Luo, Junhong Li, Yu Zhao, Danni Cheng, Yufang Rao, Minzi Mao, Ke Qiu, Yijun Dong, Fei Chen, Jun Liu, Jian Zou, Haiyang Wang, Wei Xu, Jianjun Ren

Abstract:

We aimed to compare the prognostic prediction value of positive lymph node number (PLNN) with that of the American Joint Committee on Cancer (AJCC) tumor, lymph node, and metastasis (TNM) staging system for patients with hypopharyngeal squamous cell carcinoma (HPSCC). A total of 826 patients with HPSCC from the Surveillance, Epidemiology, and End Results database (2004–2015) were identified and split into two independent cohorts: training (n=461) and validation (n=365). Univariate and multivariate Cox regression analyses were used to evaluate the prognostic effects of PLNN in patients with HPSCC. We further applied six Cox regression models to compare the survival predictive values of the PLNN and AJCC TNM staging systems. PLNN showed a significant association with overall survival (OS) and cancer-specific survival (CSS) (P < 0.001) in both univariate and multivariable analyses, and was divided into three groups (PLNN 0, PLNN 1-5, and PLNN >5). In the training cohort, multivariate analysis revealed that increased PLNN was associated with significantly poorer OS and CSS after adjusting for age, sex, tumor size, and cancer stage; this trend was also verified in the validation cohort. Additionally, the survival model incorporating a composite of PLNN and TNM classification (C-index, 0.705, 0.734) performed better than the PLNN-only and AJCC TNM-only models. PLNN can serve as a powerful survival predictor for patients with HPSCC and is a surrogate supplement to cancer staging systems.
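The PLNN grouping reported in the abstract can be expressed directly (the cut-offs 0, 1-5, and >5 are from the text; the Cox regressions themselves would normally be fitted with a survival-analysis package, which is assumed here and not shown):

```python
# The three-level PLNN factor used as a covariate in the Cox models.

def plnn_group(n_positive_nodes: int) -> str:
    """Map a positive lymph node count to the study's three groups."""
    assert n_positive_nodes >= 0
    if n_positive_nodes == 0:
        return "PLNN 0"
    if n_positive_nodes <= 5:
        return "PLNN 1-5"
    return "PLNN >5"
```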

Keywords: hypopharyngeal squamous cell carcinoma, positive lymph nodes number, prognosis, prediction models, survival predictive values

Procedia PDF Downloads 150
26131 Experiments on Weakly-Supervised Learning on Imperfect Data

Authors: Yan Cheng, Yijun Shao, James Rudolph, Charlene R. Weir, Beth Sahlmann, Qing Zeng-Treitler

Abstract:

Supervised predictive models require labeled data for training purposes. Complete and accurate labeled data, i.e., a ‘gold standard’, is not always available, and imperfectly labeled data may need to serve as an alternative. An important question is if the accuracy of the labeled data creates a performance ceiling for the trained model. In this study, we trained several models to recognize the presence of delirium in clinical documents using data with annotations that are not completely accurate (i.e., weakly-supervised learning). In the external evaluation, the support vector machine model with a linear kernel performed best, achieving an area under the curve of 89.3% and accuracy of 88%, surpassing the 80% accuracy of the training sample. We then generated a set of simulated data and carried out a series of experiments which demonstrated that models trained on imperfect data can (but do not always) outperform the accuracy of the training data, e.g., the area under the curve for some models is higher than 80% when trained on the data with an error rate of 40%. Our experiments also showed that the error resistance of linear modeling is associated with larger sample size, error type, and linearity of the data (all p-values < 0.001). In conclusion, this study sheds light on the usefulness of imperfect data in clinical research via weakly-supervised learning.
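The simulation idea can be re-created in miniature (a least-squares linear classifier stands in for the study's linear-kernel SVM; the data, the 40% error rate, and the thresholds are illustrative assumptions):

```python
# Miniature version of the label-noise experiment: train a linear
# classifier on labels corrupted at a 40% error rate and check whether
# test accuracy against the TRUE labels exceeds the accuracy of the
# noisy training labels themselves.

import numpy as np

rng = np.random.default_rng(0)

def make_data(n, noise_rate):
    X = rng.normal(size=(n, 2))
    y_true = (X[:, 0] + X[:, 1] > 0).astype(int)
    flip = rng.random(n) < noise_rate
    return X, y_true, np.where(flip, 1 - y_true, y_true)

def fit_linear(X, y):
    # Least-squares fit of a linear decision function (a simple stand-in
    # for the linear-kernel SVM used in the study).
    Xb = np.hstack([X, np.ones((len(X), 1))])
    w, *_ = np.linalg.lstsq(Xb, 2 * y - 1, rcond=None)
    return w

def accuracy(w, X, y):
    Xb = np.hstack([X, np.ones((len(X), 1))])
    return float(((Xb @ w > 0).astype(int) == y).mean())
```

With symmetric label noise the expected regression target is just a scaled version of the clean one, so the learned boundary can recover the true decision rule and beat the training labels' own accuracy.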

Keywords: weakly-supervised learning, support vector machine, prediction, delirium, simulation

Procedia PDF Downloads 195
26130 Transforming Healthcare Data Privacy: Integrating Blockchain with Zero-Knowledge Proofs and Cryptographic Security

Authors: Kenneth Harper

Abstract:

Blockchain technology presents solutions for managing healthcare data, addressing critical challenges in privacy, integrity, and access. This paper explores how privacy-preserving technologies, such as zero-knowledge proofs (ZKPs) and homomorphic encryption (HE), enhance decentralized healthcare platforms by enabling secure computations and patient data protection. We examine the mathematical foundations of these methods, their practical applications, and how they meet the evolving demands of healthcare data security. Using real-world examples, this research highlights industry-leading implementations and offers a roadmap for future applications in secure, decentralized healthcare ecosystems.
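A toy additively homomorphic scheme in the spirit of the Paillier cryptosystem illustrates how a sum can be computed on ciphertexts without decrypting them (tiny fixed primes for demonstration only; real healthcare deployments need a vetted library and large keys):

```python
# Toy Paillier-style additively homomorphic encryption with g = n + 1.
# Multiplying two ciphertexts modulo n^2 yields an encryption of the
# SUM of the plaintexts, so an untrusted party can aggregate encrypted
# values it cannot read. Demonstration-sized parameters only.

import math, random

def keygen(p=293, q=433):
    n = p * q
    lam = math.lcm(p - 1, q - 1)
    mu = pow(lam, -1, n)          # valid simplification when g = n + 1
    return (n,), (n, lam, mu)     # (public key), (private key)

def encrypt(pk, m):
    (n,) = pk
    r = random.randrange(1, n)
    while math.gcd(r, n) != 1:
        r = random.randrange(1, n)
    return (pow(n + 1, m, n * n) * pow(r, n, n * n)) % (n * n)

def decrypt(sk, c):
    n, lam, mu = sk
    l = (pow(c, lam, n * n) - 1) // n   # the L(x) = (x-1)/n function
    return (l * mu) % n

def add_encrypted(pk, c1, c2):
    (n,) = pk
    return (c1 * c2) % (n * n)    # ciphertext product = plaintext sum
```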

Keywords: blockchain, cryptography, data privacy, decentralized data management, differential privacy, healthcare, healthcare data security, homomorphic encryption, privacy-preserving technologies, secure computations, zero-knowledge proofs

Procedia PDF Downloads 10
26129 Operating Speed Models on Tangent Sections of Two-Lane Rural Roads

Authors: Dražen Cvitanić, Biljana Maljković

Abstract:

This paper presents models for predicting operating speeds on tangent sections of two-lane rural roads, developed from continuous speed data. The data correspond to 20 drivers of different ages and driving experience, driving their own cars along an 18 km section of a state road. The data were first used to determine maximum operating speeds on tangents and to compare them with speeds in the middle of tangents, i.e., the speed data used in most operating speed studies. Analysis of the continuous speed data indicated that spot speed data are not reliable indicators of the relevant speeds. Operating speed models for tangent sections were then developed. There was no significant difference between models developed using speed data in the middle of tangent sections and models developed using maximum operating speeds on tangent sections. All of the developed models have higher coefficients of determination than models developed on spot speed data. It can therefore be concluded that the measurement method has a more significant impact on the quality of an operating speed model than the measurement location.
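An operating-speed model of this kind can be sketched as a simple least-squares fit with its coefficient of determination (synthetic numbers, not the paper's data; the actual models likely include further geometric predictors):

```python
# Sketch: fit V85 = a + b * tangent_length by ordinary least squares
# and report R^2, the statistic used to compare the models.

import numpy as np

def fit_speed_model(tangent_length, v85):
    x = np.asarray(tangent_length, dtype=float)
    y = np.asarray(v85, dtype=float)
    b, a = np.polyfit(x, y, 1)            # slope, intercept
    resid = y - (a + b * x)
    r2 = 1.0 - float((resid ** 2).sum()) / float(((y - y.mean()) ** 2).sum())
    return a, b, r2
```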

Keywords: operating speed, continuous speed data, tangent sections, spot speed, consistency

Procedia PDF Downloads 451
26128 Understanding Hydrodynamic in Lake Victoria Basin in a Catchment Scale: A Literature Review

Authors: Seema Paul, John Mango Magero, Prosun Bhattacharya, Zahra Kalantari, Steve W. Lyon

Abstract:

The purpose of this review paper is to develop an understanding of lake hydrodynamics and the potential climate impact at the Lake Victoria (LV) catchment scale. The paper briefly discusses the main problems of lake hydrodynamics and their solutions as they relate to quality assessment and climate effects. An empirical modeling and mapping methodology was adopted to understand lake hydrodynamics and to visualize long-term observational daily, monthly, and yearly mean datasets using geographical information system (GIS) and COMSOL techniques. Data were obtained for the whole lake and five meteorological stations, and several geoprocessing tools with spatial analysis were applied to produce the results. Linear regression analyses were developed to build climate scenarios and to fit a long-term linear trend to lake rainfall data. Potential evapotranspiration rates were estimated from MODIS data and the Thornthwaite method. The effect of rainfall on lake water level was modeled with partial differential equations (PDEs), and water quality was characterized by a few nutrient parameters. The study revealed that monthly and yearly rainfall varies with monthly and yearly maximum and minimum temperatures: rainfall is high during cool years, while high temperatures are associated with below-average and average rainfall patterns. Rising temperatures are likely to accelerate evapotranspiration rates, and more evapotranspiration is likely to lead to more rainfall; drought is more strongly correlated with temperature, and cloud cover is more strongly correlated with rainfall. There is a trend in lake rainfall, and long-term rainfall on the lake surface has affected the lake level. Initial nutrient concentrations onshore and offshore were drawn from the early literature.
The study recommends that further work develop full lake bathymetry with flow analysis and a water balance, together with hydro-meteorological processes, solute transport, wind hydrodynamics, pollution, and eutrophication, as these are crucial for lake water quality, climate impact assessment, and water sustainability.
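The Thornthwaite estimate mentioned in the abstract can be sketched as follows (unadjusted monthly potential evapotranspiration in mm; the day-length correction factor of the full method is omitted, an assumption of this sketch):

```python
# Thornthwaite potential evapotranspiration (PET) sketch, unadjusted:
#   I = sum over months of (T/5)^1.514          (annual heat index)
#   a = 6.75e-7*I^3 - 7.71e-5*I^2 + 1.792e-2*I + 0.49239
#   PET = 16 * (10*T / I)^a  mm/month, with PET = 0 for T <= 0 C.

def thornthwaite_pet(monthly_temps_c):
    I = sum((max(t, 0.0) / 5.0) ** 1.514 for t in monthly_temps_c)
    a = (6.75e-7 * I**3) - (7.71e-5 * I**2) + (1.792e-2 * I) + 0.49239
    return [16.0 * ((10.0 * t / I) ** a) if t > 0 else 0.0
            for t in monthly_temps_c]
```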

Keywords: climograph, climate scenarios, evapotranspiration, linear trend flow, rainfall event on LV, concentration

Procedia PDF Downloads 94