Search results for: data analyses
26202 Income-Consumption Relationships in Pakistan (1980-2011): A Cointegration Approach
Authors: Himayatullah Khan, Alena Fedorova
Abstract:
The present paper analyses the income-consumption relationship in Pakistan using annual time-series data from 1980-81 to 2010-11. The paper uses the Augmented Dickey-Fuller test to check for unit roots and stationarity in the two time series. It finds that both series are nonstationary in levels but stationary at their first differences. The Augmented Engle-Granger test and the Cointegrating Regression Durbin-Watson test imply that consumption and income are cointegrated and that the long-run marginal propensity to consume (MPC) is 0.88, as given by the estimated (static) equilibrium relation. The paper also uses an error correction mechanism (ECM) to model the dynamic relationship; the purpose of the ECM is to indicate the speed of adjustment from short-run disequilibrium to the long-run equilibrium state. The results show that the short-run MPC is 0.93 and highly significant. The coefficient of the Engle-Granger residuals is negative but insignificant; statistically, the equilibrium error term is zero, which suggests that consumption adjusts to changes in GDP within the same period. Short-run changes in GDP have a positive impact on short-run changes in consumption, so 0.93 may be interpreted as the short-run MPC. The pair-wise Granger causality test shows that GDP and consumption Granger-cause each other.
Keywords: cointegrating regression, Augmented Dickey-Fuller test, Augmented Engle-Granger test, Granger causality, error correction mechanism
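As an illustration of the two-step Engle-Granger procedure the abstract describes, here is a minimal sketch on simulated series (the data below are synthetic, not the Pakistani figures):

```python
import random
import statistics as st

random.seed(42)
n = 200

# Simulate an I(1) income series (random walk) and a consumption series
# cointegrated with it (true long-run MPC of 0.9)
income, level = [], 100.0
for _ in range(n):
    level += random.gauss(0, 1)
    income.append(level)
consumption = [0.9 * y + random.gauss(0, 0.5) for y in income]

def ols_slope(x, y):
    """OLS slope of y on x (with intercept)."""
    mx, my = st.fmean(x), st.fmean(y)
    return sum((a - mx) * (b - my) for a, b in zip(x, y)) / \
           sum((a - mx) ** 2 for a in x)

# Step 1: static (equilibrium) regression -> long-run MPC estimate
mpc = ols_slope(income, consumption)
intercept = st.fmean(consumption) - mpc * st.fmean(income)
resid = [c - (intercept + mpc * y) for c, y in zip(consumption, income)]

# Step 2: Dickey-Fuller-style regression on the residuals:
# delta_e_t = gamma * e_{t-1} + u_t ; gamma clearly below zero
# points to cointegration
d_resid = [resid[t] - resid[t - 1] for t in range(1, n)]
gamma = sum(e * d for e, d in zip(resid[:-1], d_resid)) / \
        sum(e * e for e in resid[:-1])
```

On this synthetic data the static regression recovers a slope close to the true 0.9, and gamma is strongly negative, mirroring the cointegration finding in the abstract.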
Procedia PDF Downloads 414
26201 Integration Process and Analytic Interface of Different Environmental Open Data Sets with Java/Oracle and R
Authors: Pavel H. Llamocca, Victoria Lopez
Abstract:
The main objective of our work is the comparative analysis of environmental data from Open Data bases belonging to different governments, which requires integrating data from various sources. Nowadays, many governments intend to publish thousands of data sets for people and organizations to use, and the number of applications based on Open Data is increasing accordingly. However, each government has its own publication procedures, which leads to a variety of data-set formats because there are no international standards specifying them. Due to this variety, we must build a data integration process able to bring all kinds of formats together. Some software tools have been developed to support the integration process, e.g., Data Tamer and Data Wrangler. The problem with these tools is that they require a data scientist to take part in the integration process as a final step. In our case we do not want to depend on a data scientist, because environmental data are usually similar and these processes can be automated by programming. The main idea of our tool is to build Hadoop procedures adapted to the data sources of each government in order to achieve automated integration. Our work focuses on environmental data such as temperature, energy consumption, air quality, solar radiation, wind speed, etc. For the past two years, the government of Madrid has been publishing its Open Data bases on environmental indicators in real time. Likewise, other governments (such as Andalucia or Bilbao) have published Open Data sets relative to the environment. All of those data sets have different formats, yet our solution is able to integrate all of them; furthermore, it allows the user to run and visualize analyses over the real-time data.
Once the integration task is done, all the data from any government share the same format and the analysis process can start in a computationally better way. So the tool presented in this work has two goals: 1. an integration process; and 2. a graphic and analytic interface. As a first approach, the integration process was developed using Java and Oracle, and the graphic and analytic interface with Java (JSP). However, in order to open our software tool, as a second approach we also developed an implementation in the R language as a mature open-source technology. R is a powerful open-source programming language that allows us to process and analyze a huge amount of data with high performance, and there are R libraries, such as Shiny, for building a graphic interface. A performance comparison between both implementations was made and no significant differences were found. In addition, our work provides an official real-time integrated data set of environmental data in Spain, so that any developer can build their own applications on top of it.
Keywords: open data, R language, data integration, environmental data
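A minimal sketch of the per-source adapter idea behind such an integration process (the Madrid/Bilbao field names below are invented for illustration, not the actual published schemas):

```python
import json
from datetime import datetime

# Hypothetical raw records as two different governments might publish them
madrid_raw = '{"fecha": "2024-03-01T12:00", "estacion": "E04", "temperatura": 14.2}'
bilbao_raw = '{"date": "01/03/2024 12:00", "station": "B-7", "temp_c": 13.1}'

# The single schema every adapter maps into
COMMON_FIELDS = ("timestamp", "station", "temperature_c")

def from_madrid(raw: str) -> dict:
    r = json.loads(raw)
    return {"timestamp": datetime.fromisoformat(r["fecha"]),
            "station": r["estacion"], "temperature_c": r["temperatura"]}

def from_bilbao(raw: str) -> dict:
    r = json.loads(raw)
    return {"timestamp": datetime.strptime(r["date"], "%d/%m/%Y %H:%M"),
            "station": r["station"], "temperature_c": r["temp_c"]}

# One adapter per source; downstream analysis sees a single schema
integrated = [from_madrid(madrid_raw), from_bilbao(bilbao_raw)]
```

Each new government then only requires a new adapter, while the analytic interface stays unchanged.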
Procedia PDF Downloads 315
26200 Contribution to the Study of Phenotypic, Reproduction and Growth Parameters of Sheep in Eastern Algeria
Authors: Mohammed Titaouine, Toufik Meziane, Kahramen Deghnouche, Hanane Mohamdi, Nabil Mohamdi
Abstract:
In order to better understand the morphological characters and zootechnical measures of sheep breeds in south-east Algeria, a study was conducted on 1344 head from 8 farms in different parts of the region, namely T’kout 1, T’kout 2, Tafrent, Barika, Sidi-Okba, Biskra, Ouled-Djellal and Msila. The results showed significant differences across the group of 14 morphological variables studied, body length being the most important. Reproduction performance of 160 ewes and growth performance of 56 lambs were analysed. The analyses showed that the ewes have a fertility level of 69%, a prolificacy level of 114% and a fecundity level of 79%. Lambs weigh 3.5 kg at birth, 9.38 kg at 30 d, 13.45 kg at 60 d, 16.91 kg at 90 d and 21.51 kg at 120 d. The growth rate is 0.20 kg/d from birth to 30 d, 0.14 kg/d between 30 d and 60 d, 0.12 kg/d between 60 d and 90 d and 0.15 kg/d between 90 d and 120 d. Single-born lambs were heavier than twin-born lambs. By contrast, sex was not significant for any variable except weight at 60 d. The birth month has a significant effect on weight at birth, at 30 d and at 60 d, but not on weight at 90 d or at 120 d. Lambs born in September, October, November and December were heavier than those born in January, February and March.
Keywords: morphological characterization, reproduction performance, growth performance, Algeria
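The growth rates quoted above are average daily gains between successive weighings; a quick check of the arithmetic:

```python
# Mean lamb weights (kg) at each age (days), as reported in the abstract
weights = {0: 3.5, 30: 9.38, 60: 13.45, 90: 16.91, 120: 21.51}
ages = sorted(weights)

# Average daily gain over each interval: (w2 - w1) / (t2 - t1)
adg = {(a, b): round((weights[b] - weights[a]) / (b - a), 2)
       for a, b in zip(ages, ages[1:])}
```

This reproduces the reported 0.20, 0.14, 0.12 and 0.15 kg/d figures.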
Procedia PDF Downloads 498
26199 Hindrances to Effective Delivery of Infrastructural Development Projects in Nigeria’s Built Environment
Authors: Salisu Gidado Dalibi, Sadiq Gumi Abubakar, JingChun Feng
Abstract:
Nigeria’s population is about 190 million and is increasing annually, making it the seventh most populated nation in the world and the first in Africa. This population growth comes with its prospects, needs, and challenges, especially for existing and future infrastructure. Infrastructure refers to the structures, systems, and facilities serving the economy of a country, city, town, business, industry, etc. These include roads, railway lines, bridges, tunnels, ports, stadiums, dams and water projects, power generation plants and distribution grids, information and communication technology (ICT), and so on. The Nigerian government embarked on several infrastructural development projects (IDPs) to address the deficit, as the present infrastructure can neither cater to the country’s needs nor sustain it. However, delivering such IDPs has not been smooth: it comes with challenges from within and outside the projects, and with frequent delays and abandonment, affecting all the stakeholders involved. Hence, the aim of this paper is to identify and assess the factors hindering the effective delivery of IDPs in Nigeria’s built environment, with the view to offering more insight into such factors and ways to address them. The methodology involves a review of secondary data sources (official publications, journals, newspapers, the internet, etc.) within the IDP field, with emphasis on Nigerian cases. The hindrance factors identified in this way form the backbone of the questionnaire. A pilot survey was used to test its suitability, after which it was randomly administered to various project professionals in Nigeria’s construction industry using a 5-point Likert-scale format to ascertain the impact of these hindrances. Cronbach’s alpha reliability tests, mean item score computations, relative importance indices, t-tests and chi-square statistics were used for the data analyses.
The results outline the impact of the various internal, external and project-related factors that are hindering IDPs within Nigeria’s built environment.
Keywords: built environment, development, factors, hindrances, infrastructure, Nigeria, project
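Two of the statistics named above, Cronbach’s alpha and the relative importance index, are straightforward to compute; a sketch on invented 5-point Likert responses (not the survey’s data):

```python
import statistics as st

# Hypothetical 5-point Likert responses: rows = respondents, cols = items
responses = [
    [5, 4, 4, 5], [4, 4, 3, 4], [5, 5, 4, 5],
    [3, 3, 3, 3], [4, 5, 4, 4], [2, 3, 2, 3],
]
k = len(responses[0])

def cronbach_alpha(rows):
    """alpha = k/(k-1) * (1 - sum of item variances / variance of totals)."""
    items = list(zip(*rows))
    item_var = sum(st.variance(col) for col in items)
    total_var = st.variance([sum(r) for r in rows])
    return (k / (k - 1)) * (1 - item_var / total_var)

def rii(scores, a_max=5):
    """Relative importance index: sum of weights / (A * N)."""
    return sum(scores) / (a_max * len(scores))

alpha = cronbach_alpha(responses)
rii_item1 = rii([r[0] for r in responses])
```

An alpha above roughly 0.7 is the usual threshold for treating a questionnaire scale as internally consistent.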
Procedia PDF Downloads 177
26198 Transforming Data into Knowledge: Mathematical and Statistical Innovations in Data Analytics
Authors: Zahid Ullah, Atlas Khan
Abstract:
The rapid growth of data in various domains has created a pressing need for effective methods to transform this data into meaningful knowledge. In this era of big data, mathematical and statistical innovations play a crucial role in unlocking insights and facilitating informed decision-making in data analytics. This abstract aims to explore the transformative potential of these innovations and their impact on converting raw data into actionable knowledge. Drawing upon a comprehensive review of existing literature, this research investigates the cutting-edge mathematical and statistical techniques that enable the conversion of data into knowledge. By evaluating their underlying principles, strengths, and limitations, we aim to identify the most promising innovations in data analytics. To demonstrate the practical applications of these innovations, real-world datasets will be utilized through case studies or simulations. This empirical approach will showcase how mathematical and statistical innovations can extract patterns, trends, and insights from complex data, enabling evidence-based decision-making across diverse domains. Furthermore, a comparative analysis will be conducted to assess the performance, scalability, interpretability, and adaptability of different innovations. By benchmarking against established techniques, we aim to validate the effectiveness and superiority of the proposed mathematical and statistical innovations in data analytics. Ethical considerations surrounding data analytics, such as privacy, security, bias, and fairness, will be addressed throughout the research. Guidelines and best practices will be developed to ensure the responsible and ethical use of mathematical and statistical innovations in data analytics. 
The expected contributions of this research include advancements in the mathematical and statistical sciences, improved data-analysis techniques, enhanced decision-making processes, and practical implications for industries and policymakers. The outcomes will guide the adoption and implementation of mathematical and statistical innovations, empowering stakeholders to transform data into actionable knowledge and drive meaningful outcomes.
Keywords: data analytics, mathematical innovations, knowledge extraction, decision-making
Procedia PDF Downloads 75
26197 Neuroprotective Effect of Chrysin on Thioacetamide-Induced Hepatic Encephalopathy in Rats: Role of Oxidative Stress and TLR-4/NF-κB Pathway
Authors: S. A. El-Marasy, S. A. El Awdan, R. M. Abd-Elsalam
Abstract:
This study aimed to investigate the possible neuroprotective effect of chrysin on thioacetamide (TAA)-induced hepatic encephalopathy (HE) in rats. The effect of chrysin on motor impairment, cognitive deficits, oxidative stress, neuroinflammation, apoptosis and histopathological damage was also assessed. Male Wistar rats were randomly allocated into five groups. The first group received the vehicle (distilled water) for 21 days and served as the normal group. The second received an intraperitoneal dose of TAA (200 mg/kg) on three alternate days during the third week of the experiment to induce HE and served as the control group. The other three groups were orally administered chrysin (25, 50, 100 mg/kg) for 21 days and, starting from day 17, received an intraperitoneal dose of TAA (200 mg/kg) on three alternate days. Behavioral, biochemical, histopathological and immunohistochemical analyses were then carried out. Chrysin reversed TAA-induced motor incoordination in the rotarod test and cognitive deficits in the object recognition test (ORT); it attenuated serum ammonia and hepatic liver enzymes, reduced malondialdehyde (MDA), elevated reduced glutathione (GSH), and reduced the brain contents of nuclear factor kappa B (NF-κB), tumor necrosis factor-alpha (TNF-α) and interleukin-6 (IL-6). Chrysin administration also reduced Toll-like receptor 4 (TLR-4) gene expression, caspase-3 protein expression, hepatic necrosis and astrocyte swelling.
This study shows that chrysin exerted a neuroprotective effect in TAA-induced HE rats, evidenced by improvement of the cognitive deficits, motor incoordination and histopathological changes such as astrocyte swelling and vacuolization (hallmarks of HE), via reducing hyperammonemia and ameliorating hepatic function, in addition to its antioxidant, anti-apoptotic and TLR-4/NF-κB pathway-inactivating effects.
Keywords: chrysin, hepatic encephalopathy, oxidative stress, rats, thioacetamide, TLR4/NF-κB pathway
Procedia PDF Downloads 161
26196 QTAIM View of Metal-Metal Bonding in Trinuclear Mixed-Metal Bridged Ligand Clusters Containing Ruthenium and Osmium
Authors: Nadia Ezzat Al-Kirbasee, Ahlam Hussein Hassan, Shatha Raheem Helal Alhimidi, Doaa Ezzat Al-Kirbasee, Muhsen Abood Muhsen Al-Ibadi
Abstract:
Through DFT/QTAIM calculations, we provide new insights into the nature of the M-M, M-H, M-O, and M-C bonds of (Cp*Ru)n(Cp*Os)3−n(μ3-O)2(μ-H) (Cp* = η5-C5Me5, n = 3, 2, 1, 0). The topological analysis of the electron density reveals important details of the chemical bonding interactions in the clusters. Calculations confirm the absence of bond critical points (BCPs) and the corresponding bond paths (BPs) between Ru-Ru, Ru-Os, and Os-Os. The position of the bridging hydrides and oxo atoms coordinated to Ru-Ru, Ru-Os, and Os-Os determines the distribution of the electron densities, which strongly affects the formation of the bonds between these transition metal atoms. On the other hand, the results confirm that the four clusters contain 6c–12e and 4c–2e bonding interactions delocalized over M3(μ-H)(μ-O)2 and M3(μ-H), respectively, as revealed by the non-negligible delocalization indices. The small positive values of the electron density ρ(b), together with the small positive values of the Laplacian ∇2ρ(b) and the small negative values of the total energy density H(b), shown by the Ru-H, Os-H, Ru-O, and Os-O bonds in the four clusters, are typical of open-shell interactions. The topological data for the bonds between the Ru and Os atoms and the C atoms of the pentamethylcyclopentadienyl (Cp*) ring ligands are likewise similar and show properties consistent with open-shell interactions in the QTAIM classification.
Keywords: metal-metal and metal-ligand interactions, organometallic complexes, topological analysis, DFT and QTAIM analyses
Procedia PDF Downloads 93
26195 FCNN-MR: A Parallel Instance Selection Method Based on Fast Condensed Nearest Neighbor Rule
Authors: Lu Si, Jie Yu, Shasha Li, Jun Ma, Lei Luo, Qingbo Wu, Yongqi Ma, Zhengji Liu
Abstract:
The instance selection (IS) technique is used to reduce data size and thus improve the performance of data mining methods. Recently, to process very large data sets, several proposed methods divide the training set into disjoint subsets and apply IS algorithms independently to each subset. In this paper, we analyze the limitations of these methods and give our viewpoint on how to divide and conquer in the IS procedure. Then, based on the fast condensed nearest neighbor (FCNN) rule, we propose an instance selection method for large data sets built on the MapReduce framework. Besides ensuring prediction accuracy and reduction rate, it has two desirable properties: first, it reduces the workload in the aggregation node; second, and most important, it produces the same result as the sequential version, which other parallel methods cannot achieve. We evaluate the performance of FCNN-MR on one small data set and two large data sets. The experimental results show that it is effective and practical.
Keywords: instance selection, data reduction, MapReduce, kNN
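The FCNN algorithm itself is not reproduced in the abstract; as a rough illustration of condensation-style instance selection, here is a sketch of the classic Hart condensed-nearest-neighbor loop that FCNN accelerates (sequential, without the MapReduce layer):

```python
import math

def nn_label(store, point):
    """Label of the nearest stored instance (1-NN)."""
    return min(store, key=lambda s: math.dist(s[0], point))[1]

def condense(training):
    """Hart-style condensation: keep only instances the current store
    misclassifies, repeating until a full pass adds nothing."""
    store = [training[0]]
    changed = True
    while changed:
        changed = False
        for point, label in training:
            if nn_label(store, point) != label:
                store.append((point, label))
                changed = True
    return store

# Two well-separated clusters: most interior points are redundant
data = [((x / 10, 0.0), "a") for x in range(10)] + \
       [((x / 10 + 5.0, 0.0), "b") for x in range(10)]
reduced = condense(data)
```

The reduced store classifies every training instance exactly as the full set would, which is the consistency property the parallel FCNN-MR design also preserves.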
Procedia PDF Downloads 253
26194 A Gap Analysis of Attitude Towards Sustainable Sportswear Product Development between Consumers and Suppliers
Authors: Y. N. Fung, R. Liu, T. M. Choi
Abstract:
Over the past decades, previous studies have explored consumers’ attitudes towards sustainable fashion and how these attitudes affect consumer behavior. Researchers have attempted to provide solutions for product suppliers (e.g., retailers, designers, developers, and manufacturers) by studying consumers’ attitudes towards sustainable fashion. However, beyond studies of consumer attitudes, investigations of the sales and market share of sustainable sportswear products remain under-explored, and gaps may exist between consumers’ expectations and the products actually developed. In this study, a novel analysis is carried out to examine the attitude gaps between sustainable sportswear suppliers (SSSs) and sustainable sportswear consumers (SSCs). The study first identifies the key attitudes towards sustainable sportswear product development. It then analyses how sustainability attitudes affect the products being developed, as well as how the difference in attitudes between SSSs and SSCs affects consumers’ satisfaction with sportswear product consumption. A gap-analysis research framework is adopted, using collected questionnaire survey data. The results indicate that a significant difference exists between SSSs’ and SSCs’ attitudes towards sustainable design, manufacture, product features, and branding. Based on in-depth interviews, the major causes of the difference in attitudes are studied to provide managerial insights for sustainable sportswear product management and business development.
Keywords: sustainability, sportswear, attitude, gap analysis, suppliers, consumers
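A gap score of the kind used in such frameworks can be sketched as the difference in mean Likert ratings between the two groups, with a Welch-style t statistic (the ratings below are invented, not the survey data):

```python
import statistics as st

# Hypothetical 5-point Likert ratings of one item, e.g. "sustainable
# design matters", from suppliers and from consumers
suppliers = [4, 4, 5, 3, 4, 4, 5]
consumers = [3, 2, 3, 3, 4, 2, 3]

def gap(a, b):
    """Mean attitude gap plus a Welch t statistic for the difference."""
    ma, mb = st.fmean(a), st.fmean(b)
    se = (st.variance(a) / len(a) + st.variance(b) / len(b)) ** 0.5
    return ma - mb, (ma - mb) / se

d, t = gap(suppliers, consumers)
```

A large positive gap with a sizeable t statistic flags an item, here supplier enthusiasm outrunning consumer buy-in, as a candidate for the in-depth interviews.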
Procedia PDF Downloads 114
26193 A Design Framework for an Open Market Platform of Enriched Card-Based Transactional Data for Big Data Analytics and Open Banking
Authors: Trevor Toy, Josef Langerman
Abstract:
Around a quarter of the world’s data is generated by the financial sector, with an estimated 708.5 billion non-cash transactions made globally. With Open Banking still a rapidly developing concept within the financial industry, there is an opportunity to create a secure mechanism for connecting its stakeholders to openly, legitimately and consensually share the data required to enable it. Integration and sharing of anonymised transactional data still operate in silos, centralised among the large corporate entities in the ecosystem that have the resources to do so; smaller fintechs generating data and businesses looking to consume data are largely excluded from the process. There is therefore a growing demand for accessible transactional data, both for analytical purposes and to support the rapid global adoption of Open Banking. The following research provides a solution framework that aims to offer a secure decentralised marketplace for 1) data providers to list their transactional data, 2) data consumers to find and access that data, and 3) data subjects (the individuals making the transactions that generate the data) to manage and sell the data that relates to themselves. The platform also provides an integrated system for downstream transaction-related data from merchants, enriching the data product available to build a comprehensive view of a data subject’s spending habits. A robust and sustainable data market can be developed by providing a more accessible mechanism for data producers to monetise their data investments and by encouraging data subjects to share their data through the same financial incentives. At the centre of the platform is the market mechanism that connects the data providers and their data subjects to the data consumers.
This core component of the platform is developed as a decentralised blockchain contract with a market layer that manages the transaction, user, pricing, payment, tagging, contract, control, and lineage features pertaining to user interactions on the platform. One of the platform’s key features is enabling the participation and management of personal data by the individuals from whom the data is generated. The framework was developed into a proof of concept on the Ethereum blockchain, in which an individual can securely manage access to their own personal data and to their identifiable relationship to the card-based transaction data provided by financial institutions. This gives data consumers access to a complete view of transactional spending behaviour correlated with key demographic information. The platform solution can ultimately support the growth, prosperity, and development of economies, businesses, communities, and individuals by providing accessible and relevant transactional data for big data analytics and Open Banking.
Keywords: big data markets, open banking, blockchain, personal data management
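The consent-and-lineage behaviour described for the market layer can be sketched off-chain in a few lines (a plain in-memory model, not an Ethereum contract; the participant names are hypothetical):

```python
from dataclasses import dataclass, field

@dataclass
class Listing:
    """One transactional-data listing: a provider, the data subject it
    belongs to, the consumers the subject has consented to, and an
    audit trail of every access attempt (the lineage feature)."""
    provider: str
    subject: str
    consented: set = field(default_factory=set)
    access_log: list = field(default_factory=list)

    def grant(self, consumer):
        self.consented.add(consumer)

    def revoke(self, consumer):
        self.consented.discard(consumer)

    def access(self, consumer):
        ok = consumer in self.consented
        self.access_log.append((consumer, ok))   # every attempt is logged
        return ok

listing = Listing(provider="bank_a", subject="subject_42")
listing.grant("fintech_x")
first = listing.access("fintech_x")    # allowed while consent stands
listing.revoke("fintech_x")
second = listing.access("fintech_x")   # denied after revocation
```

On-chain, the same grant/revoke/access transitions would be contract methods, with the log replaced by the chain’s own transaction history.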
Procedia PDF Downloads 73
26192 Locating the Best Place for Earthquake Refugee Camps by OpenSource Software: A Case Study for Tehran, Iran
Authors: Reyhaneh Saeedi
Abstract:
Iran is among the regions most prone to earthquakes, suffering large human and financial losses every year. Around the world, many people lose their homes and lives to natural disasters such as earthquakes. Providing and specifying suitable places for resettling homeless people before an earthquake occurs is therefore one of the most important factors in crisis planning and management. Natural disasters can be modeled and visualized with a Geospatial Information System (GIS); using GIS, it is possible to manage spatial data and reach several goals through the analyses it offers. GIS plays a determining role in disaster management because it can identify the best places for temporary resettlement after such a disaster. This research uses the QGIS (Quantum GIS) software, which is open source, so its code is easy to access, and it is also free. In this system, the AHP method is used as the decision model, and the best places for temporary resettlement are located based on the criteria of the relevant organizations, with their weights and buffers. The buffer layers of the criteria are built and converted to raster layers; the raster layers are then multiplied by the desired weights, and the results are added together. Eventually, suitable places for resettling victims according to the desired criteria are displayed in different colors, with their suitability rates, on the QGIS platform.
Keywords: disaster management, temporary resettlement, earthquake, QuantumGIS
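The weighted-overlay step, multiplying the normalised criterion rasters by their AHP weights and summing, can be sketched as follows (tiny invented rasters and weights, not the Tehran data):

```python
# Hypothetical criterion rasters (already buffered and normalised to 0-1)
# over a 2x2 grid, and AHP-derived weights for each criterion
rasters = {
    "hospital": [[0.2, 0.9], [0.4, 0.7]],
    "open_space": [[0.8, 0.6], [0.1, 0.9]],
    "roads": [[0.5, 0.8], [0.3, 0.6]],
}
weights = {"hospital": 0.5, "open_space": 0.3, "roads": 0.2}
assert abs(sum(weights.values()) - 1.0) < 1e-9  # AHP weights sum to 1

rows, cols = 2, 2
# Weighted sum per cell: the suitability surface
suitability = [[sum(weights[k] * rasters[k][r][c] for k in rasters)
                for c in range(cols)] for r in range(rows)]

# The cell with the highest score is the candidate camp location
best = max(((r, c) for r in range(rows) for c in range(cols)),
           key=lambda rc: suitability[rc[0]][rc[1]])
```

In QGIS the same multiply-and-add is done with the raster calculator over full-size layers; the principle is identical.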
Procedia PDF Downloads 398
26191 Experimental Evaluation of Succinct Ternary Tree
Authors: Dmitriy Kuptsov
Abstract:
Tree data structures, such as binary or, in general, k-ary trees, are essential in computer science. Their applications range from data search and retrieval to sorting and ranking algorithms. Naive implementations of these data structures can consume prohibitively large volumes of random access memory, limiting their applicability in certain solutions; in such cases, a more advanced representation is essential. In this paper we present the design of a compact version of the ternary tree data structure and demonstrate the results of its experimental evaluation using the static dictionary problem. We compare these results with those for binary and regular ternary trees. The evaluation shows that our design, in the best case, consumes up to 12 times less memory (for the dictionary used in our experimental evaluation) than a regular ternary tree, and in certain configurations shows performance comparable to regular ternary trees. We evaluated the performance of the algorithms on both 32- and 64-bit operating systems.
Keywords: algorithms, data structures, succinct ternary tree, performance evaluation
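The paper’s compact encoding is not reproduced here, but the dictionary operations it must support can be sketched with a plain pointer-based ternary search tree, the "regular" baseline the succinct design is compared against:

```python
class TSTNode:
    """One node of a ternary search tree: a character plus lo/eq/hi links."""
    __slots__ = ("ch", "lo", "eq", "hi", "terminal")
    def __init__(self, ch):
        self.ch, self.terminal = ch, False
        self.lo = self.eq = self.hi = None

def insert(node, word, i=0):
    ch = word[i]
    if node is None:
        node = TSTNode(ch)
    if ch < node.ch:
        node.lo = insert(node.lo, word, i)
    elif ch > node.ch:
        node.hi = insert(node.hi, word, i)
    elif i + 1 < len(word):
        node.eq = insert(node.eq, word, i + 1)
    else:
        node.terminal = True
    return node

def contains(node, word, i=0):
    if node is None:
        return False
    ch = word[i]
    if ch < node.ch:
        return contains(node.lo, word, i)
    if ch > node.ch:
        return contains(node.hi, word, i)
    if i + 1 < len(word):
        return contains(node.eq, word, i + 1)
    return node.terminal

root = None
for w in ("cat", "cap", "car", "dog"):
    root = insert(root, w)
```

A succinct variant keeps the same search logic but replaces the three per-node pointers, the dominant memory cost here, with bit-level structure encodings.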
Procedia PDF Downloads 160
26190 Design and Application of NFC-Based Identity and Access Management in Cloud Services
Authors: Shin-Jer Yang, Kai-Tai Yang
Abstract:
In response to a changing world and the fast growth of the Internet, more and more enterprises are replacing web-based services with cloud-based ones. Multi-tenancy technology is becoming ever more important, especially for Software as a Service (SaaS). This in turn leads to a greater focus on the application of Identity and Access Management (IAM). Conventional Near-Field Communication (NFC) based verification relies on a computer browser and a card reader to access an NFC tag; this type of verification supports neither mobile-device login nor user-based access management functions. This study designs an NFC-based third-party cloud identity and access management scheme (NFC-IAM) that addresses this shortcoming. Data from simulation tests, analyzed with key performance indicators (KPIs), suggest that the NFC-IAM not only takes less time for identity verification but also cuts two-factor authentication time by 80% and improves verification accuracy to 99.9% or better. In functional performance analyses, NFC-IAM performed better in scalability and portability. The NFC-IAM app and back-end system, developed and deployed on mobile devices, support the IAM features and also offer users a more user-friendly experience and stronger security protection. In the future, the NFC-IAM can be employed in different environments, including identification for mobile payment systems and permission management for remote equipment monitoring, among other applications.
Keywords: cloud service, multi-tenancy, NFC, IAM, mobile device
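One way to realise the two-factor idea, a keyed response from the NFC tag plus a user secret, can be sketched as follows (the key-provisioning scheme and PIN factor are assumptions for illustration, not the NFC-IAM protocol):

```python
import hashlib
import hmac

# Something you have: a per-tag secret provisioned into the NFC tag
# (hypothetical); something you know: a PIN stored server-side as a hash
TAG_KEY = b"per-tag-secret"
PIN_HASH = hashlib.sha256(b"1234").hexdigest()

def tag_response(challenge: bytes, key: bytes) -> str:
    """The tag's keyed answer to a server challenge (HMAC-SHA256)."""
    return hmac.new(key, challenge, hashlib.sha256).hexdigest()

def authenticate(challenge: bytes, response: str, pin: bytes) -> bool:
    """Both factors must pass; compare_digest avoids timing leaks."""
    factor_tag = hmac.compare_digest(response, tag_response(challenge, TAG_KEY))
    factor_pin = hmac.compare_digest(hashlib.sha256(pin).hexdigest(), PIN_HASH)
    return factor_tag and factor_pin

challenge = b"nonce-001"
ok = authenticate(challenge, tag_response(challenge, TAG_KEY), b"1234")
bad = authenticate(challenge, tag_response(challenge, b"wrong-key"), b"1234")
```

Using a fresh server-issued nonce per login keeps a sniffed tag response from being replayed.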
Procedia PDF Downloads 435
26189 Comparison of High-Speed Railway Bridge Foundation Design
Authors: Hussein Yousif Aziz
Abstract:
This paper discusses the design and analysis of a bridge foundation subjected to train loads according to three codes, namely the AASHTO code, British Standard BS 8004 (1986), and the Chinese code (TB10002.5-2005). The study focused on the manual design and analysis of the bridge’s foundation with the three codes, to find which code is better for design and for controlling the problem of high settlement under the applied loads. The results showed that the Chinese code is the most costly, in that the number of reinforcement bars in the pile cap and piles is greater than with the AASHTO code and the BS code for the same dimensions. The settlement of the bridge was calculated from data collected at the project site. The vertical ultimate bearing capacity of a single pile is also discussed for the three codes. In further analyses using the two-dimensional Plaxis program and other programs such as SAP2000 v14 and PROKON, many parameters were calculated; the maximum values of the vertical displacement are close to the calculated ones. The results indicate that the AASHTO code is the most economical and is safer in terms of single-pile bearing capacity. The purpose of this project is to study the pier on the basis of the pile foundation design. A 32 m simply supported box-section beam sits on top of the structure, and the bridge pier is of the round type. The main component of the design is the calculation of the pile foundation and the settlement. According to the related data, we chose bored piles 1.0 m in diameter and 48 m long, laid out in a rectangular pile cap of 12 m × 9 m. Because of the interaction factors within pile groups, the load-bearing capacity of the single pile must be checked; the punching resistance, shearing strength and bending of the pile cap are all very important to the stability of the structure. Checking the bearing capacity of the soft sub-soil under the pile foundation is also necessary.
This project provides a deeper analysis and comparison of pile foundation design schemes. First, brief details of the construction situation of the bridge are given. With the actual geological features of the construction site and the upper load on the bridge, the paper analyzes the bearing capacity and settlement of a single pile. The Equivalent Pier Method is used to calculate and analyze the settlements of the piles.
Keywords: pile foundation, settlement, bearing capacity, civil engineering
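A single-pile capacity check with a group-efficiency correction of the kind such designs use can be sketched as follows. The soil resistances, safety factor and Converse-Labarre efficiency formula are illustrative placeholders, not the project’s values or any specific code’s method; the 1.0 m diameter and 48 m length follow the abstract, and a 4 × 3 layout consistent with the 12 m × 9 m cap is assumed:

```python
import math

# Hypothetical design values (NOT the project data)
qp = 2000.0   # unit end-bearing resistance, kPa
fs = 50.0     # average unit shaft friction, kPa
d, L = 1.0, 48.0   # pile diameter and length, m (from the abstract)

area_tip = math.pi * d ** 2 / 4
area_shaft = math.pi * d * L
q_ult = qp * area_tip + fs * area_shaft   # ultimate capacity, kN
q_allow = q_ult / 2.5                     # illustrative global safety factor

# Converse-Labarre efficiency for an assumed 4 x 3 group at 3 m spacing
m, n, s = 4, 3, 3.0
theta = math.degrees(math.atan(d / s))
eta = 1 - theta * ((n - 1) * m + (m - 1) * n) / (90 * m * n)
group_allow = eta * m * n * q_allow       # allowable group load, kN
```

The same arithmetic, with code-specific resistance and safety factors, is what drives the cost differences between the three codes reported above.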
Procedia PDF Downloads 421
26188 Predicting Data Center Resource Usage Using Quantile Regression to Conserve Energy While Fulfilling the Service Level Agreement
Authors: Ahmed I. Alutabi, Naghmeh Dezhabad, Sudhakar Ganti
Abstract:
Data centers have been growing in size and demand continuously over the last two decades. Planning for the deployment of resources has been shallow and has always resorted to over-provisioning: data center operators try to maximize the availability of their services by allocating multiples of the needed resources. One resource that has been wasted, with little thought, is energy. In recent years, programmable resource allocation has paved the way for more efficient and robust data centers. In this work, we examine the predictability of resource usage in a data center environment. We use a number of models that cover a wide spectrum of machine learning categories, and we establish a framework to guarantee the client service level agreement (SLA). Our results show that using prediction can cut energy loss by up to 55%.
Keywords: machine learning, artificial intelligence, prediction, data center, resource allocation, green computing
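The appeal of quantile regression for SLA-aware provisioning is that minimising the pinball loss at quantile tau targets the tau-th percentile of demand rather than the mean; a minimal sketch with a constant predictor and invented demand figures:

```python
def pinball(y_true, pred, tau):
    """Pinball (quantile) loss of a constant prediction `pred`."""
    return sum(tau * (y - pred) if y >= pred else (tau - 1) * (y - pred)
               for y in y_true) / len(y_true)

# Hypothetical hourly CPU demand (% of capacity), with one rare spike
demand = [31, 35, 33, 40, 90, 38, 36, 34, 41, 39]

# Minimising pinball loss over constants recovers the tau-quantile:
# provisioning at the 0.9 quantile covers ~90% of hours without
# paying for the worst-case peak at all times
tau = 0.9
best = min(range(0, 101), key=lambda p: pinball(demand, p, tau))
```

A full quantile regression model minimises the same loss but lets the prediction vary with features such as time of day; the SLA then fixes which tau to target.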
Procedia PDF Downloads 108
26187 Use of Treated Municipal Wastewater on Artichoke Crop
Authors: G. Disciglio, G. Gatta, A. Libutti, A. Tarantino, L. Frabboni, E. Tarantino
Abstract:
Results are reported from a field study carried out at Trinitapoli (Puglia region, southern Italy) on the irrigation of an artichoke crop with three types of water: secondary-treated wastewater (SW), tertiary-treated wastewater (TW), and freshwater (FW). Physical, chemical and microbiological analyses were performed on the irrigation water and on soil and yield samples. The levels of most of the chemical parameters of the applied irrigation water, such as electrical conductivity, total suspended solids, Na+, Ca2+, Mg2+, K+, sodium adsorption ratio, chemical oxygen demand, 5-day biological oxygen demand, NO3-N, total N, CO32-, HCO3-, phenols and chlorides, were significantly higher in SW than in FW and TW. Only for Mg2+, PO4-P and K+ were no differences found between SW and TW. Although the chemical parameters of the three irrigation water sources were different, few effects on the soil were observed. Even though monitoring of Escherichia coli showed high SW levels, above the limits allowed under Italian law (DM 152/2006), contamination of the soil and of the marketable yield was never observed. Moreover, no Salmonella spp. were detected in these irrigation waters and consequently none were found in the plants. Finally, the data on the quantitative and qualitative parameters of the artichoke yield under the various treatments show no significant differences between the three irrigation water sources. Therefore, if adequately treated, municipal wastewater can be used for irrigation and represents a sound alternative to conventional water resources.
Keywords: artichoke, soil chemical characteristics, fecal indicators, treated municipal wastewater, water recycling
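One of the water-quality parameters above, the sodium adsorption ratio, is a simple computed quantity; a sketch with illustrative concentrations (not the Trinitapoli measurements):

```python
import math

def sar(na_meq, ca_meq, mg_meq):
    """Sodium adsorption ratio: Na / sqrt((Ca + Mg) / 2),
    all concentrations in meq/L."""
    return na_meq / math.sqrt((ca_meq + mg_meq) / 2)

# Hypothetical concentrations for a treated-wastewater sample
value = sar(na_meq=12.0, ca_meq=6.0, mg_meq=2.0)
```

Higher SAR values indicate a greater sodicity hazard for the soil, which is why the parameter is monitored when irrigating with treated wastewater.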
Procedia PDF Downloads 427
26186 Prosperous Digital Image Watermarking Approach by Using DCT-DWT
Authors: Prabhakar C. Dhavale, Meenakshi M. Pawar
Abstract:
Every day, tons of data are embedded in digital media or distributed over the internet. The data are distributed in such a way that they can easily be replicated without error, putting the rights of their owners at risk. Even when encrypted for distribution, data can easily be decrypted and copied. One way to discourage illegal duplication is to insert information, known as a watermark, into potentially valuable data in such a way that it is impossible to separate the watermark from the data. These challenges motivated researchers to carry out intense research in the field of watermarking. A watermark is a form, image or text that is impressed onto paper, which provides evidence of its authenticity; digital watermarking is an extension of the same concept. There are two types of watermarks: visible and invisible. In this project, we have concentrated on embedding watermarks in images. The main consideration for any watermarking scheme is its robustness to various attacks. Keywords: watermarking, digital, DCT-DWT, security
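As an illustration of transform-domain embedding, the following sketch implements a single-level Haar DWT in plain NumPy and additively embeds a bit pattern in the low-frequency (LL) band. It is a simplified, non-blind stand-in for the paper's DCT-DWT scheme; the Haar filters, the embedding strength `alpha` and the 64x64 random cover image are all assumptions made for the example.

```python
import numpy as np

def haar2d(img):
    # One level of the 2-D Haar transform: LL, LH, HL, HH sub-bands.
    a = (img[0::2, :] + img[1::2, :]) / 2.0   # row averages
    d = (img[0::2, :] - img[1::2, :]) / 2.0   # row differences
    return ((a[:, 0::2] + a[:, 1::2]) / 2.0,  # LL
            (a[:, 0::2] - a[:, 1::2]) / 2.0,  # LH
            (d[:, 0::2] + d[:, 1::2]) / 2.0,  # HL
            (d[:, 0::2] - d[:, 1::2]) / 2.0)  # HH

def ihaar2d(ll, lh, hl, hh):
    # Exact inverse of haar2d.
    a = np.empty((ll.shape[0], ll.shape[1] * 2))
    d = np.empty_like(a)
    a[:, 0::2], a[:, 1::2] = ll + lh, ll - lh
    d[:, 0::2], d[:, 1::2] = hl + hh, hl - hh
    img = np.empty((a.shape[0] * 2, a.shape[1]))
    img[0::2, :], img[1::2, :] = a + d, a - d
    return img

def embed(cover, bits, alpha=4.0):
    # Additive embedding of +/-1 symbols into the low-frequency LL band.
    ll, lh, hl, hh = haar2d(cover)
    marked_ll = ll.copy()
    marked_ll.flat[: bits.size] += alpha * np.where(bits, 1.0, -1.0)
    return ihaar2d(marked_ll, lh, hl, hh)

def extract(marked, cover, n_bits):
    # Non-blind extraction: compare LL bands of marked and original images.
    diff = (haar2d(marked)[0] - haar2d(cover)[0]).flat[:n_bits]
    return diff > 0

rng = np.random.default_rng(1)
cover = rng.uniform(0.0, 255.0, size=(64, 64))
bits = rng.integers(0, 2, size=32).astype(bool)
marked = embed(cover, bits)
print("watermark survives round trip:", bool((extract(marked, cover, bits.size) == bits).all()))
```

A DCT stage, as in the paper's scheme, would further spread each watermark symbol across the band's frequency coefficients; the structure of embed/extract stays the same.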
Procedia PDF Downloads 422
26185 Machine Learning Data Architecture
Authors: Neerav Kumar, Naumaan Nayyar, Sharath Kashyap
Abstract:
Most companies see an increase in the adoption of machine learning (ML) applications across internal and external-facing use cases. ML applications vend output in either batch or real-time patterns. A complete batch ML pipeline architecture comprises data sourcing, feature engineering, model training, model deployment, and the vending of model output into a data store for downstream applications. Due to unclear role expectations, we have observed that scientists specializing in building and optimizing models invest significant effort into building the other components of the architecture, which we do not believe is the best use of scientists’ bandwidth. We propose a system architecture, created using AWS services, that brings industry best practices to managing the workflow and simplifies the process of model deployment and end-to-end data integration for an ML application. This narrows the scope of scientists’ work to model building and refinement, while specialized data engineers take over deployment, pipeline orchestration, data quality, the data permission system, etc. The pipeline infrastructure is built and deployed as code (using Terraform, CDK, CloudFormation, etc.), which makes it easy to replicate and/or extend the architecture to other models used in an organization. Keywords: data pipeline, machine learning, AWS, architecture, batch machine learning
Procedia PDF Downloads 64
26184 Critical Appraisal, Smart City Initiative: China vs. India
Authors: Suneet Jagdev, Siddharth Singhal, Dhrubajyoti Bordoloi, Peesari Vamshidhar Reddy
Abstract:
There is no universally accepted definition of what constitutes a Smart City. It means different things to different people, and the definition varies from place to place depending on the level of development and the willingness of people to change and reform. Broadly, a Smart City tries to improve the quality of resource management and service provision for the people living in the city. Smart City is an urban development vision to integrate multiple information and communication technology (ICT) solutions in a secure fashion to manage the assets of a city, yet most of these projects are misinterpreted as being technology projects only. Due to urbanization, many informal as well as government-funded settlements have come up during the last few decades, increasing the consumption of the limited resources available. The people of each city have their own definition of a Smart City: in the imagination of any city dweller in India, a Smart City contains a wish list of infrastructure and services that describes his or her level of aspiration. The research involved a comparative study of the Smart City models in India and in China. Behavioral changes experienced by the people living in the pilot (first-ever) smart cities were identified and compared. This paper discusses the target quality of life for people in India and in China and how well it could be realized with the facilities included in these Smart City projects. Logical and comparative analyses were performed on important data collected from government sources, government papers and research papers by various experts on the topic. Existing cities with historically grown infrastructure and administration systems will require a more moderate, step-by-step approach to modernization. The models were compared using many different motivators, with data collected from past journals, interactions with the people involved, videos and past submissions.
In conclusion, we have identified how these projects could be combined with ongoing small-scale initiatives by local people or small groups of individuals, and what the outcome might be if these existing practices were implemented on a bigger scale. Keywords: behavior change, mission monitoring, pilot smart cities, social capital
Procedia PDF Downloads 289
26183 The Relation between the Organizational Trust Level and Organizational Justice Perceptions of Staff in Konya Municipality: A Theoretical and Empirical Study
Authors: Handan Ertaş
Abstract:
The aim of the study is to determine the relationship between the organizational trust level and the organizational justice perceptions of municipality officials. A correlational method based on a descriptive survey model was used: the Organizational Justice Perception Scale, the Organizational Trust Inventory and the Interpersonal Trust Scale were administered to 353 participants who work in Konya Metropolitan Municipality and the central district municipalities. In the data analysis, frequencies, independent-samples t tests for binary groups, one-way ANOVA for multiple groups and Pearson correlation analysis were used to determine the relationship. The outcomes of the study show that participants have a high level of organizational trust, that “Interpersonal Trust” ranks first, and that there is a significant difference in favor of male officials in terms of Trust in the Organization Itself and Interpersonal Trust. It was also found that officials in district municipalities have a higher perception level in all dimensions, that there is a significant difference in the Trust in the Organization sub-dimension, and that work status is an important factor in organizational trust perception. Moreover, the study shows that organizational justice practices are important in raising officials' trust in the organization, administrators and colleagues, and that there is a parallel relationship between Organizational Justice components and Organizational Trust dimensions. Keywords: organizational trust level, organizational justice perceptions, staff, Konya
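The statistical toolkit named in this abstract (independent-samples t test, one-way ANOVA, Pearson correlation) can be sketched with `scipy.stats` on synthetic survey-style data. The scale scores and group labels below are fabricated stand-ins, not the study's data.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)
n = 353  # sample size reported in the abstract

# Fabricated stand-ins for the survey scales (Likert-style means).
justice = rng.normal(3.5, 0.6, n)                 # organizational justice perception
trust = 0.7 * justice + rng.normal(0.9, 0.4, n)   # built to correlate with justice
gender = rng.integers(0, 2, n)                    # binary grouping variable
status = rng.integers(0, 3, n)                    # multi-group factor (work status)

# Independent-samples t test for the binary group.
t_stat, t_p = stats.ttest_ind(trust[gender == 0], trust[gender == 1])

# One-way ANOVA across the three work-status groups.
f_stat, f_p = stats.f_oneway(*(trust[status == g] for g in range(3)))

# Pearson correlation between justice perceptions and trust.
r, r_p = stats.pearsonr(justice, trust)
print(f"t = {t_stat:.2f}, F = {f_stat:.2f}, r = {r:.2f}")
```

Because the synthetic trust score is constructed from the justice score, the correlation comes out strong and significant, mirroring the "parallel relationship" finding; the t and F statistics on randomly assigned groups are, by design, unremarkable.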
Procedia PDF Downloads 347
26182 The Prevalence of Musculoskeletal Disorders and Their Associated Factors among Nurses in Jordan
Authors: Khader A. Almhdawi, Hassan Alrabbaie
Abstract:
Background: Musculoskeletal disorders (MSDs) represent a significant challenge for registered nurses. To the best of our knowledge, no published study has comprehensively investigated the prevalence of MSDs among nurses and their associated factors in Jordan. This study aimed to find the prevalence of MSDs and their possible predictors among registered nurses in Jordanian hospitals. Methods: A cross-sectional design was used. Outcome measures included the Nordic Musculoskeletal Questionnaire (NMQ), the Depression Anxiety Stress Scale (DASS), the Pittsburgh Sleep Quality Index (PSQI), the IPAQ, and sociodemographic data. The prevalence of musculoskeletal complaints was reported using descriptive analysis. Logistic regression analyses were conducted to identify predictors of MSDs. Results: 597 nurses from different hospitals in Jordan participated in this study. Reported MSD prevalence was highest at the neck (61.1%), followed by the upper back (47.2%), shoulder (46.7%), wrist and hands (27.3%), and elbow (13.9%). Significant predictors of MSDs among Jordanian nurses included being female, poor sleep quality, high physical activity levels, poor ergonomics, increased workload, and mental stress. Conclusion: This study showed a high prevalence of MSDs among Jordanian nurses and identified their significant predictors. Future studies are needed to investigate the progressive nature of MSDs and effective treatment strategies. Keywords: musculoskeletal disorders, nursing, ergonomic, occupational stress
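A logistic-regression pass of the kind used here to identify MSD predictors can be sketched in plain NumPy. The two binary predictors, their effect sizes and the outcome below are synthetic stand-ins, not the study's data.

```python
import numpy as np

rng = np.random.default_rng(3)
n = 597  # sample size reported in the abstract

# Synthetic stand-ins for two of the reported binary predictors.
poor_sleep = rng.integers(0, 2, n)
high_workload = rng.integers(0, 2, n)
true_logit = -1.0 + 1.2 * poor_sleep + 0.8 * high_workload
has_msd = (rng.random(n) < 1.0 / (1.0 + np.exp(-true_logit))).astype(float)

X = np.column_stack([np.ones(n), poor_sleep, high_workload])

# Plain gradient ascent on the average log-likelihood (no sklearn needed).
beta = np.zeros(3)
for _ in range(2000):
    p = 1.0 / (1.0 + np.exp(-X @ beta))
    beta += 0.1 * X.T @ (has_msd - p) / n

odds_ratios = np.exp(beta[1:])
print("estimated odds ratios (poor sleep, high workload):", np.round(odds_ratios, 2))
```

Exponentiated coefficients are the odds ratios typically reported for such predictors; since the synthetic outcome is built with positive effects, both come out above 1.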
Procedia PDF Downloads 100
26181 A Comparison of Image Data Representations for Local Stereo Matching
Authors: André Smith, Amr Abdel-Dayem
Abstract:
The stereo matching problem, while having been present for several decades, continues to be an active area of research. The goal of this research is to find correspondences between elements found in a set of stereoscopic images. With these pairings, it is possible to infer the distance of objects within a scene relative to the observer. Advancements in this field have led to experimentation with various techniques, from graph-cut energy minimization to artificial neural networks. At the basis of these techniques is a cost function, which is used to evaluate the likelihood of a particular match between points in each image. While, at its core, the cost is based on comparing image pixel data, there is a general lack of consistency as to which image data representation to use. This paper presents an experimental analysis comparing the effectiveness of the more common image data representations. The goal is to determine how effectively these data representations reduce the cost for the correct correspondence relative to other possible matches. Keywords: colour data, local stereo matching, stereo correspondence, disparity map
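A minimal local-matching sketch shows how a cost function (here the sum of absolute differences) and a winner-takes-all disparity search interact with the choice of image data representation (full RGB versus collapsed grayscale). The synthetic stereo pair, window size and disparity range are illustrative assumptions, not the paper's experimental setup.

```python
import numpy as np

rng = np.random.default_rng(7)

# Synthetic stereo pair: the right image is the left image shifted by a
# known disparity of 4 pixels, plus a little sensor noise.
H, W, TRUE_D = 32, 64, 4
left = rng.uniform(0.0, 255.0, size=(H, W, 3))
right = np.roll(left, -TRUE_D, axis=1) + rng.normal(0.0, 2.0, size=(H, W, 3))

def sad(a, b):
    # Sum of absolute differences -- a common local matching cost.
    return np.abs(a - b).sum()

def best_disparity(l_img, r_img, x, y, win=3, max_d=8):
    # Winner-takes-all search over candidate disparities for one pixel.
    l_patch = l_img[y - win:y + win + 1, x - win:x + win + 1]
    costs = [sad(l_patch, r_img[y - win:y + win + 1, x - d - win:x - d + win + 1])
             for d in range(max_d + 1)]
    return int(np.argmin(costs))

# Full colour data versus a collapsed grayscale representation.
d_rgb = best_disparity(left, right, x=30, y=16)
d_gray = best_disparity(left.mean(axis=2, keepdims=True),
                        right.mean(axis=2, keepdims=True), x=30, y=16)
print("estimated disparity (RGB, gray):", d_rgb, d_gray)
```

On this highly textured synthetic pair both representations recover the true disparity; the paper's question is how the representations compare on real imagery, where texture is scarcer and the cost margins are thinner.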
Procedia PDF Downloads 370
26180 Business-Intelligence Mining of Large Decentralized Multimedia Datasets with a Distributed Multi-Agent System
Authors: Karima Qayumi, Alex Norta
Abstract:
The rapid generation of a high volume and a broad variety of data from the application of new technologies poses challenges for the generation of business intelligence. Most organizations and business owners need to extract data from multiple sources and apply analytical methods for the purposes of developing their business. Therefore, today's decentralized data-management environments rely on a distributed computing paradigm. While data are stored in highly distributed systems, the implementation of distributed data-mining techniques is a challenge. The aim of these techniques is to gather knowledge from every domain and from all the datasets stemming from distributed resources. As agent technologies offer significant contributions to managing the complexity of distributed systems, we consider them for next-generation data-mining processes. To demonstrate agent-based business intelligence operations, we use agent-oriented modeling techniques to develop a new artifact for mining massive datasets. Keywords: agent-oriented modeling (AOM), business intelligence model (BIM), distributed data mining (DDM), multi-agent system (MAS)
Procedia PDF Downloads 432
26179 Evaluation of the Boiling Liquid Expanding Vapor Explosion Thermal Effects in Hassi R'Mel Gas Processing Plant Using Fire Dynamics Simulator
Authors: Brady Manescau, Ilyas Sellami, Khaled Chetehouna, Charles De Izarra, Rachid Nait-Said, Fati Zidani
Abstract:
During a fire in an oil and gas refinery, several thermal accidents can occur and cause serious damage to people and the environment. Among these accidents, the BLEVE (Boiling Liquid Expanding Vapor Explosion) is the most frequently observed and remains a major concern for risk decision-makers. It corresponds to a violent vaporization of explosive nature following the rupture of a vessel containing a liquid at a temperature significantly higher than its normal boiling point at atmospheric pressure. Its effects on the environment generally appear in three ways: blast overpressure, radiation from the fireball if the liquid involved is flammable, and fragment hazards. In order to estimate the potential damage that would be caused by such an explosion, risk decision-makers often use quantitative risk analysis (QRA). This analysis is a rigorous and advanced approach that requires reliable data in order to obtain a good estimate and control of risks. However, in most cases, the data used in QRA are obtained from empirical correlations. These empirical correlations generally overestimate BLEVE effects because they are based on simplifications and do not take into account real parameters such as geometry effects. Considering that these risk analyses are based on an assessment of BLEVE effects on human life and plant equipment, more precise and reliable data should be provided. From this point of view, CFD modeling of BLEVE effects appears as a solution to the limitations of the empirical laws. In this context, the main objective is to develop a numerical tool to predict BLEVE thermal effects using the CFD code FDS version 6. Simulations are carried out with a mesh size of 1 m. The fireball source is modeled as a vertical release of hot fuel over a short time. The modeling of fireball dynamics is based on single-step combustion using an EDC model coupled with the default LES turbulence model.
Fireball characteristics (diameter, height, heat flux and lifetime) taken from the large-scale BAM experiment are used to demonstrate the ability of FDS to simulate the various stages of the BLEVE phenomenon, from ignition up to total burnout. The influence of release parameters such as the injection rate and the radiative fraction on the fireball heat flux is also presented. Predictions are very encouraging and show good agreement with the BAM experimental data. In addition, a numerical study is carried out on an operational propane accumulator in an Algerian gas processing plant of the SONATRACH company located in the Hassi R’Mel gas field (the largest gas field in Algeria). Keywords: BLEVE effects, CFD, FDS, fireball, LES, QRA
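For context, the empirical correlations that such QRA studies typically start from (and that the CFD model is intended to refine) are simple power laws in the fuel mass. The constants below are the commonly cited CCPS/TNO-style values, and the 10 t inventory is an illustrative figure, not the actual Hassi R'Mel vessel's.

```python
def fireball_characteristics(mass_kg):
    """Classic empirical BLEVE fireball power laws (CCPS/TNO-style).

    These simple correlations are the QRA baseline that CFD modeling
    (here, FDS) is meant to improve upon.
    """
    d_max = 5.8 * mass_kg ** (1.0 / 3.0)          # peak fireball diameter (m)
    if mass_kg < 30000.0:
        duration = 0.45 * mass_kg ** (1.0 / 3.0)  # burn duration (s)
    else:
        duration = 2.6 * mass_kg ** (1.0 / 6.0)
    height = 0.75 * d_max                         # fireball centre height (m)
    return d_max, duration, height

# Illustrative 10 t propane inventory (assumed figure).
d, t, h = fireball_characteristics(10000.0)
print(f"diameter ~{d:.0f} m, duration ~{t:.1f} s, height ~{h:.0f} m")
```

Because these laws depend only on the fuel mass, they cannot account for geometry, injection rate or radiative fraction, which is precisely the gap the CFD simulations address.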
Procedia PDF Downloads 186
26178 Timing and Noise Data Mining Algorithm and Software Tool in Very Large Scale Integration (VLSI) Design
Authors: Qing K. Zhu
Abstract:
Very Large Scale Integration (VLSI) design has become very complex due to the continuous integration of millions of gates in one chip, following Moore’s law. Designers encounter numerous report files during design iterations with timing and noise analysis tools. This paper presents our work using data-mining techniques combined with HTML tables to extract and represent critical timing and noise data. When we apply this data-mining tool in real applications, running speed is important. The software employs table look-up techniques to achieve reasonable running speed, as confirmed by performance testing. We added several advanced features for its application in an industrial chip design. Keywords: VLSI design, data mining, big data, HTML forms, web, VLSI, EDA, timing, noise
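The extract-and-tabulate step can be sketched as follows. The report format and path names are invented for illustration and do not correspond to any real EDA tool's output; the dictionary keyed by path plays the role of the table look-up the abstract mentions.

```python
import html

# Toy slice of a timing report (path name, slack in ns) -- an invented
# format, not a real tool's log.
report = """\
clk_core/u_alu/sum_reg[3] -0.12
clk_core/u_alu/sum_reg[7]  0.05
clk_io/u_fifo/rd_ptr[1]   -0.31
"""

# Dictionary keyed by path: constant-time look-up keeps repeated queries
# fast on large reports.
slack = {}
for line in report.splitlines():
    path, value = line.rsplit(None, 1)
    slack[path] = float(value)

# Mine the critical entries (negative slack) and render them as an HTML
# table, worst violation first.
violations = {p: s for p, s in slack.items() if s < 0}
rows = "\n".join(
    f"<tr><td>{html.escape(p)}</td><td>{s:.2f}</td></tr>"
    for p, s in sorted(violations.items(), key=lambda kv: kv[1])
)
table = f"<table><tr><th>Path</th><th>Slack (ns)</th></tr>\n{rows}\n</table>"
print(table)
```

In a full tool the same pass would run per report file per design iteration, appending each iteration's table to a single HTML summary page.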
Procedia PDF Downloads 254
26177 An Exploration of Science, Technology, Engineering, Arts, and Mathematics Competition from the Perspective of Arts
Authors: Qiao Mao
Abstract:
There is a growing number of studies concerning STEM (Science, Technology, Engineering, and Mathematics) and STEAM (Science, Technology, Engineering, Arts, and Mathematics). However, there is little research on STEAM competitions from the perspective of Arts. This study takes the annual PowerTech STEAM competition in Taiwan as an example. In this activity, students are asked to make wooden bionic mechanical beasts on the spot and participate in a model and speed competition. This study aims to explore how Arts influences STEM once it is involved in the making of the mechanical beasts. A case study method is adopted. Through expert sampling, five prize winners in the PowerTech Youth Science and Technology Creation Competition and their supervisors were taken as the research subjects. The relevant data, derived from observations, interviews, document analyses, etc., were collected, sorted, analyzed and interpreted. The results of the study show that in the PowerTech Youth Science and Technology Creation Competition, when Arts is involved in STEM, (1) it has an impact on the athletic performance, balance, stability and symmetry of the mechanical beasts; (2) students become more interested and more creative in making STEAM mechanical beasts, which can promote students' learning of STEM; and (3) students encounter more difficulties and problems when making STEAM mechanical beasts, and need more systematic thinking and design thinking to solve them. Keywords: PowerTech, STEAM contest, mechanical beast, arts' role
Procedia PDF Downloads 86
26176 Introduction of Electronic Health Records to Improve Data Quality in Emergency Department Operations
Authors: Anuruddha Jagoda, Samiddhi Samarakoon, Anil Jasinghe
Abstract:
In its simplest form, data quality can be defined as 'fitness for use'; it is a concept with multiple dimensions. Emergency departments (EDs) require information to treat patients and, at the same time, are the primary source of information regarding accidents, injuries, emergencies, etc. The ED is also the starting point of various patient registries, databases and surveillance systems. This interventional study was carried out to improve data quality at the ED of the National Hospital of Sri Lanka (NHSL), the premier trauma care centre in Sri Lanka, by introducing an e-health solution. The study consisted of three components. First, a research study was conducted to assess the quality of data in relation to five selected dimensions of data quality, namely accuracy, completeness, timeliness, legibility and reliability. Second, the intervention was to develop and deploy an electronic emergency department information system (eEDIS). Third, post-intervention assessment confirmed that all five dimensions of data quality had improved, with the most significant improvements in the accuracy and timeliness dimensions. Keywords: electronic health records, electronic emergency department information system, emergency department, data quality
Procedia PDF Downloads 275
26175 Painting in Neolithic of Northwest Iberia: Archaeometrical Studies Applied to Megalithic Monuments
Authors: César Oliveira, Ana M. S. Bettencourt, Luciano Vilas Boas, Luís Gonçalves, Carlo Bottaini
Abstract:
Funerary megalithic monuments are probably among the most remarkable remains of the Neolithic period of western Europe. Some monuments are well known for their paintings, sometimes associated with engraved motifs, giving the funerary crypts a character of great symbolic value. The engraved and painted motifs, the colors used in the paintings, and the offerings associated with the deposited corpses are archaeological data that, being part of the funeral rites, also reveal the ideological world of these communities and their way of interacting with the world. In this sense, the choice of colors to be used in the paintings, the pigments collected, and the procedures for making the paints would also be significant performances. The present study focuses on the characterization of painted art from megalithic monuments located in different areas of north-western Portugal (coastal and inland). The colorant composition of megalithic barrows decorated with rock art motifs was studied using a multi-analytical approach (XRD, SEM-EDS, FTIR, and GC-MS), allowing the characterization of the painting techniques, the pigments, and the organic compounds used as binders. Some analyses revealed that the pigments used for painting were produced from a collection of mined or quarried organic and inorganic substances. The results are analyzed from the perspective of contingencies and regularities among the different case studies in order to interpret more or less standardized behaviors. Keywords: funerary megalithic monuments, painting motifs, archaeometrical studies, Northwest Iberia, behaviors
Procedia PDF Downloads 110
26174 Evaluation of Golden Beam Data for the Commissioning of 6 and 18 MV Photons Beams in Varian Linear Accelerator
Authors: Shoukat Ali, Abdul Qadir Jandga, Amjad Hussain
Abstract:
Objective: The main purpose of this study is to compare the percent depth doses (PDDs) and the in-plane and cross-plane profiles of the Varian golden beam data to measured data for 6 and 18 MV photons, for the commissioning of the Eclipse treatment planning system. Introduction: Commissioning of a treatment planning system requires an extensive acquisition of beam data for the clinical use of linear accelerators. Accurate dose delivery requires entering the PDDs, profiles and dose-rate tables for open and wedged fields into the treatment planning system, enabling it to calculate the MUs and dose distribution. Varian offers a generic set of beam data as reference data; however, it is not recommended for clinical use. In this study, we compared the generic beam data with measured beam data to evaluate the reliability of the generic beam data for clinical purposes. Methods and Material: PDDs and profiles of open and wedged fields for different field sizes and at different depths were measured as per Varian’s algorithm commissioning guideline. The measurements were performed with a PTW 3D scanning water phantom with a Semiflex ion chamber and MEPHYSTO software. The online available Varian golden beam data were compared with the measured data to evaluate their accuracy for the commissioning of the Eclipse treatment planning system. Results: The deviation between the measured and golden beam data was at most 2%. In the PDDs, the deviation increases at greater depths; similarly, the profiles show increasing deviation at large field sizes and greater depths. Conclusion: The study shows that the percentage deviation between the measured and golden beam data is within the acceptable tolerance, and the golden beam data can therefore be used for the commissioning process; however, verification of a small subset of acquired data against the golden beam data should be mandatory before clinical use. Keywords: percent depth dose, flatness, symmetry, golden beam data
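The measured-versus-golden comparison amounts to computing local percent deviations along the PDD curve and checking them against a tolerance. The curves below are synthetic stand-ins for real beam data, and the 2% tolerance mirrors the deviation range reported in the abstract.

```python
import numpy as np

# Synthetic 6 MV-like PDD curves sampled every 10 mm (illustrative, not
# real beam data): golden-beam reference vs. on-site measurement.
depth_mm = np.arange(0, 210, 10)
golden = 100.0 * np.exp(-0.005 * np.clip(depth_mm - 15, 0, None))  # crude falloff past d_max
measured = golden * (1.0 + 0.008 * np.sin(depth_mm / 40.0))        # small systematic wobble

# Local percent deviation, evaluated beyond the build-up region where
# point-by-point comparison is meaningful.
mask = depth_mm >= 50
deviation = 100.0 * (measured[mask] - golden[mask]) / golden[mask]
max_dev = np.abs(deviation).max()
print(f"max |deviation| = {max_dev:.2f}% -> {'PASS' if max_dev <= 2.0 else 'FAIL'} at a 2% tolerance")
```

The same point-by-point check, repeated per field size, depth and wedge configuration, is what establishes whether the golden beam data are within tolerance of the on-site measurements.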
Procedia PDF Downloads 489
26173 Variable-Fidelity Surrogate Modelling with Kriging
Authors: Selvakumar Ulaganathan, Ivo Couckuyt, Francesco Ferranti, Tom Dhaene, Eric Laermans
Abstract:
Variable-fidelity surrogate modelling offers an efficient way to approximate function data available in multiple degrees of accuracy, each with a different computational cost. In this paper, a Kriging-based variable-fidelity surrogate modelling approach is introduced to approximate such deterministic data. Initially, individual Kriging surrogate models, enhanced with gradient data of different degrees of accuracy, are constructed. These gradient-enhanced Kriging surrogate models are then strategically coupled using a recursive CoKriging formulation to provide an accurate surrogate model for the highest-fidelity data. While, intuitively, gradient data are useful for enhancing the accuracy of surrogate models, the primary motivation behind this work is to investigate whether it is also worthwhile to incorporate gradient data of varying degrees of accuracy. Keywords: Kriging, CoKriging, surrogate modelling, variable-fidelity modelling, gradients
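The single-fidelity Kriging building block of this approach can be sketched in a few lines of NumPy (zero-mean simple Kriging with a Gaussian correlation). The recursive CoKriging coupling and gradient enhancement described in the paper are not shown, and the test function and hyperparameters are assumptions made for the example.

```python
import numpy as np

def kriging_weights(x, y, theta=10.0, nugget=1e-10):
    # Gaussian correlation R_ij = exp(-theta * (x_i - x_j)^2); the small
    # nugget keeps the linear solve numerically stable.
    r = np.exp(-theta * (x[:, None] - x[None, :]) ** 2)
    r += nugget * np.eye(x.size)
    return np.linalg.solve(r, y)          # R^{-1} y, reused for all predictions

def kriging_predict(x_new, x, weights, theta=10.0):
    # Predicted mean: correlation of new points with samples, times weights.
    r_new = np.exp(-theta * (x_new[:, None] - x[None, :]) ** 2)
    return r_new @ weights

# A single fidelity level: 8 deterministic samples of a 1-D test function.
x = np.linspace(0.0, 1.0, 8)
y = np.sin(2.0 * np.pi * x)

w = kriging_weights(x, y)
y_mid = kriging_predict(np.array([0.5]), x, w)
print(f"prediction at x=0.5: {y_mid[0]:.4f} (true value 0.0)")
```

A variable-fidelity extension would build one such model per fidelity level and let the recursive CoKriging formulation express the high-fidelity response as a scaled low-fidelity prediction plus a Kriging model of the discrepancy.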
Procedia PDF Downloads 558