Search results for: data to action
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 26962

24502 Development of Enhanced Data Encryption Standard

Authors: Benjamin Okike

Abstract:

There is a need to hide information transmitted along the information superhighway. Today, information relating to the survival of individuals, organizations, or government agencies is transmitted from one point to another. Adversaries are always on the watch along the superhighway to intercept any information that would enable them to inflict psychological 'injuries' on their victims. With information encryption, this can be prevented completely or at worst reduced to the barest minimum. Many encryption techniques have been proposed, and some of them are already being implemented. However, adversaries keep discovering loopholes in them to perpetrate their plans. In this work, we propose the enhanced data encryption standard (EDES), which deploys randomly generated numbers as an encryption method. Each time encryption is carried out, a new set of random numbers is generated, thereby making it almost impossible for cryptanalysts to decrypt any information encrypted with this newly proposed method.
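
A minimal sketch of the core idea, regenerating a fresh random keystream for every encryption, is given below in Python. It illustrates one-time random-number encryption in general, not the authors' EDES specification; the function names and the XOR keystream construction are assumptions for illustration only.

```python
import os

def encrypt(plaintext):
    """Encrypt with a keystream of fresh random bytes, regenerated per call."""
    keystream = os.urandom(len(plaintext))           # new random numbers every time
    ciphertext = bytes(p ^ k for p, k in zip(plaintext, keystream))
    return ciphertext, keystream                     # keystream must travel securely

def decrypt(ciphertext, keystream):
    return bytes(c ^ k for c, k in zip(ciphertext, keystream))

ciphertext, key = encrypt(b"survival-critical message")
assert decrypt(ciphertext, key) == b"survival-critical message"
```

Because the random numbers never repeat across encryptions, an intercepted ciphertext gives a cryptanalyst no statistics to exploit; the practical burden shifts to distributing the fresh key material securely.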

Keywords: encryption, enhanced data encryption, encryption techniques, information security

Procedia PDF Downloads 151
24501 Big Data Applications for Transportation Planning

Authors: Antonella Falanga, Armando Cartenì

Abstract:

"Big data" refers to extremely vast and complex sets of data, encompassing extraordinarily large and intricate datasets that require specific tools for meaningful analysis and processing. These datasets can stem from diverse origins like sensors, mobile devices, online transactions, social media platforms, and more. The utilization of big data is pivotal, offering the chance to leverage vast information for substantial advantages across diverse fields, thereby enhancing comprehension, decision-making, efficiency, and fostering innovation in various domains. Big data, distinguished by its remarkable attributes of enormous volume, high velocity, diverse variety, and significant value, represent a transformative force reshaping the industry worldwide. Their pervasive impact continues to unlock new possibilities, driving innovation and advancements in technology, decision-making processes, and societal progress in an increasingly data-centric world. The use of these technologies is becoming more widespread, facilitating and accelerating operations that were once much more complicated. In particular, big data impacts across multiple sectors such as business and commerce, healthcare and science, finance, education, geography, agriculture, media and entertainment and also mobility and logistics. Within the transportation sector, which is the focus of this study, big data applications encompass a wide variety, spanning across optimization in vehicle routing, real-time traffic management and monitoring, logistics efficiency, reduction of travel times and congestion, enhancement of the overall transportation systems, but also mitigation of pollutant emissions contributing to environmental sustainability. Meanwhile, in public administration and the development of smart cities, big data aids in improving public services, urban planning, and decision-making processes, leading to more efficient and sustainable urban environments. Access to vast data reservoirs enables deeper insights, revealing hidden patterns and facilitating more precise and timely decision-making. Additionally, advancements in cloud computing and artificial intelligence (AI) have further amplified the potential of big data, enabling more sophisticated and comprehensive analyses. Certainly, utilizing big data presents various advantages but also entails several challenges regarding data privacy and security, ensuring data quality, managing and storing large volumes of data effectively, integrating data from diverse sources, the need for specialized skills to interpret analysis results, ethical considerations in data use, and evaluating costs against benefits. Addressing these difficulties requires well-structured strategies and policies to balance the benefits of big data with privacy, security, and efficient data management concerns. Building upon these premises, the current research investigates the efficacy and influence of big data by conducting an overview of the primary and recent implementations of big data in transportation systems. Overall, this research allows us to conclude that big data better provide to enhance rational decision-making for mobility choices and is imperative for adeptly planning and allocating investments in transportation infrastructures and services.

Keywords: big data, public transport, sustainable mobility, transport demand, transportation planning

Procedia PDF Downloads 60
24500 Implementing Fault Tolerance with Proxy Signature on the Improvement of RSA System

Authors: H. El-Kamchouchi, Heba Gaber, Fatma Ahmed, Dalia H. El-Kamchouchi

Abstract:

Fault tolerance and data security are two important issues in modern communication systems. During the transmission of data between the sender and receiver, errors may occur frequently. Therefore, the sender must re-transmit the data to the receiver in order to correct these errors, which makes the system very feeble. To improve the scalability of the scheme, we present a proxy signature scheme with fault tolerance over an efficient and secure authenticated key agreement protocol based on the improved RSA system. Authenticated key agreement protocols play an important role in building secure communications networks between two parties.
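
As background for how delegation-based signing works, the following Python sketch shows a textbook proxy-signature-by-warrant pattern over plain RSA. It is not the authors' improved-RSA or fault-tolerance construction; the toy primes, hash mapping, and warrant format are all assumptions for illustration, and a real deployment would need full-size keys and proper padding.

```python
import hashlib

def h(m, n):
    return int.from_bytes(hashlib.sha256(m).digest(), "big") % n

def keypair(p, q, e=65537):
    n = p * q
    d = pow(e, -1, (p - 1) * (q - 1))          # modular inverse (Python 3.8+)
    return n, e, d

def sign(m, d, n):
    return pow(h(m, n), d, n)

def verify(m, sig, e, n):
    return pow(sig, e, n) == h(m, n)

# Toy primes -- far too small for real security, illustration only
orig_n, orig_e, orig_d = keypair(1000003, 1000033)
prox_n, prox_e, prox_d = keypair(1000037, 1000039)

warrant = b"delegate:signer->proxy;scope:session-42"
delegation = sign(warrant, orig_d, orig_n)         # original signer authorizes proxy

message = b"payload to authenticate"
proxy_sig = sign(warrant + message, prox_d, prox_n)  # proxy signs under the warrant

# Receiver checks the warrant chain and then the proxy signature
assert verify(warrant, delegation, orig_e, orig_n)
assert verify(warrant + message, proxy_sig, prox_e, prox_n)
```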

Keywords: fault tolerance, improved RSA, key agreement, proxy signature

Procedia PDF Downloads 425
24499 Epicatechin Metabolites and Its Effect on ROS Production in Bovine Aortic Endothelial Cells

Authors: Nasiruddin Khan

Abstract:

The actions of (-)-epicatechin, a cocoa (Theobroma cacao) flavanol that modulates redox/oxidative stress, are attributed mainly to its antioxidant properties. The present study investigates the concentration- and time-dependent effects of the (-)-epicatechin metabolites 3MeEc, 4MeEc, and 4SulEc on the production of ROS in BAEC, using L-012 and Lucigenin as chemiluminescence dyes and the XO/HX system. Our results demonstrate that 3MeEc shows a significant (P < 0.05) lowering effect on ROS production in BAEC with increasing concentration of metabolite when L-012 was used as the chemiluminescence dye, but not in the case of Lucigenin. In the XO/HX system, using L-012 as the chemiluminescence dye, 3MeEc and 4MeEc showed a significant lowering effect on ROS production with increasing concentration from 100 to 500 nM as compared to the positive control (SOD). When Lucigenin was used as the chemiluminescence dye, 3MeEc exerted a significant lowering effect with increasing concentration when compared to the positive control (SOD), whereas 4MeEc showed a significant lowering effect on ROS production from 250 nM upward as compared to the positive control. For 4SulEc, a significant lowering effect on ROS production was only observed at 100 and 250 nM. Overall, although each metabolite shows a considerable effect, 3MeEc exhibited the most pronounced effect on decreasing ROS production as compared to the other two metabolites.

Keywords: epicatechin metabolites, HO-1, Nrf2, ROS

Procedia PDF Downloads 231
24498 Automatic Moderation of Toxic Comments in the Face of Local Language Complexity in Senegal

Authors: Edouard Ngor Sarr, Abel Diatta, Serigne Mor Toure, Ousmane Sall, Lamine Faty

Abstract:

Thanks to Web 2.0, we are witnessing a form of democratization of the spoken word, an exponential increase in the number of users on the web, but also, and above all, the accumulation of a daily flow of content that is becoming, at times, uncontrollable. Added to this is the rise of a violent social fabric characterised by hateful and racial comments, insults, and other content that contravenes social rules and the platforms' terms of use. Consequently, managing and regulating this mass of new content is proving increasingly difficult, requiring substantial human, technical, and technological resources. Without regulation and with the complicity of anonymity, this toxic content can pollute discussions and make these online spaces highly conducive to abuse, which very often has serious consequences for certain internet users, ranging from anxiety to depression, withdrawal, or suicide. The toxicity of a comment is defined as anything that is rude, disrespectful, or likely to cause someone to leave a discussion or to take violent action against a person or a community. Two levels of measures are needed to deal with this deleterious situation. The first measures are being taken by governments through draft laws with a dual objective: (i) to punish the perpetrators of these abuses and (ii) to make online platforms accountable for the mistakes made by their users. The second measure comes from the platforms themselves. By assessing the content left by users, they can set up filters to block and/or delete content or decide to suspend the user in question for good. However, the speed of discussions and the volume of data involved mean that platforms are unable to properly monitor the moderation of content produced by internet users. That is why they use human moderators, either through recruitment or outsourcing. Moderating comments on the web means assessing and monitoring users' comments on online platforms in order to strike the right balance between protection against abuse and users' freedom of expression. It makes it possible to determine which publications and users are allowed to remain online and which are deleted or suspended, how authorised publications are displayed, and what actions accompany content deletions. In this study, we look at the problem of automatic moderation of toxic comments in the face of local African languages, focusing on social network comments in Senegal. We review the state of the art, highlighting the different approaches, algorithms, and tools for moderating comments. We also study the issues and challenges of moderation in web ecosystems with lesser-known languages, such as local languages.
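
A minimal baseline classifier of the kind such a review covers is sketched below, assuming scikit-learn. Character n-grams are one common workaround when tokenizers and lexicons do not exist for local languages; the tiny corpus and labels here are invented for illustration, not data from the study.

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Tiny invented corpus (1 = toxic); real data would mix French, Wolof,
# and other local languages, often code-switched inside one comment.
comments = [
    "merci pour ce partage tres utile",
    "tu es un imbecile degage d'ici",
    "bonne continuation a toute l'equipe",
    "personne ne veut de toi ici va-t'en",
    "tres bon article bravo",
    "vous etes tous des menteurs honteux",
    "je suis d'accord avec ce point",
    "ferme-la espece d'idiot",
]
labels = [0, 1, 0, 1, 0, 1, 0, 1]

# Character n-grams sidestep word-level tools (tokenizers, lexicons)
# that rarely exist for low-resource local languages.
model = make_pipeline(
    TfidfVectorizer(analyzer="char_wb", ngram_range=(2, 4)),
    LogisticRegression(max_iter=1000),
)
model.fit(comments, labels)
print(model.predict(["quel idiot celui-la"]))   # likely flagged as toxic
```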

Keywords: moderation, local languages, Senegal, toxic comments

Procedia PDF Downloads 2
24497 The Necessity to Standardize Procedures of Providing Engineering Geological Data for Designing Road and Railway Tunneling Projects

Authors: Atefeh Saljooghi Khoshkar, Jafar Hassanpour

Abstract:

One of the main problems at the design stage of many tunneling projects is the lack of an appropriate standard for providing engineering geological data in a predefined format. In particular, this is more evident in highway and railroad tunnel projects, in which there are a number of tunnels and different professional teams involved. In this regard, comprehensive software needs to be designed using the accepted methods in order to help engineering geologists prepare standard reports that contain sufficient input data for the design stage. Regarding this necessity, applied software has been designed using the macro capabilities of Microsoft Excel and the Visual Basic for Applications (VBA) programming language. In this software, all of the engineering geological input data required for designing different parts of tunnels, such as discontinuity properties, rock mass strength parameters, rock mass classification systems, boreability classification, the penetration rate, and so forth, can be calculated and reported in a standard format.
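
As one example of the standardized calculations such a tool automates, the sketch below computes a Rock Mass Rating (RMR89) class as the sum of its five parameter ratings plus a discontinuity-orientation adjustment. This is a generic textbook calculation written in Python rather than the authors' Excel/VBA implementation, and the sample rating values are hypothetical.

```python
def rmr_class(strength, rqd, spacing, condition, groundwater, orientation_adj=0):
    """Sum the five RMR89 parameter ratings plus a (negative) orientation adjustment."""
    rmr = strength + rqd + spacing + condition + groundwater + orientation_adj
    for lower, label in [(81, "I - very good"), (61, "II - good"),
                         (41, "III - fair"), (21, "IV - poor")]:
        if rmr >= lower:
            return rmr, label
    return rmr, "V - very poor"

# Hypothetical borehole interval: the standard report would tabulate this per run
print(rmr_class(strength=7, rqd=13, spacing=10, condition=20, groundwater=10,
                orientation_adj=-5))   # -> (55, 'III - fair')
```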

Keywords: engineering geology, rock mass classification, rock mechanic, tunnel

Procedia PDF Downloads 81
24496 SVM-RBN Model with Attentive Feature Culling Method for Early Detection of Fruit Plant Diseases

Authors: Piyush Sharma, Devi Prasad Sharma, Sulabh Bansal

Abstract:

Diseases are fairly common in fruits and vegetables because of changing climatic and environmental circumstances. Crop diseases, which are frequently difficult to control, interfere with the growth and output of the crops. Accurate disease detection and timely disease control measures are required to guarantee high production standards and good quality. In India, apples are a common crop that may be afflicted by a variety of diseases on the fruit, stem, and leaves. Fungi, bacteria, and viruses trigger the early symptoms of leaf diseases. In order to assist farmers in taking the appropriate action, it is important to develop an automated system that can detect the type of illness. Machine learning-based image processing can be used to this end: this research proposes a system that can automatically identify diseases in apple fruit and apple plants. Hence, this research utilizes a hybrid SVM-RBN model. As a consequence, the model may produce results that are more effective in terms of accuracy, precision, recall, and F1 score, with respective values of 96%, 99%, 94%, and 93%.
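
A sketch of the SVM classification stage and the four reported metrics is shown below, assuming scikit-learn and synthetic features in place of real image descriptors; the RBN component and the attentive feature culling step of the authors' hybrid are not reproduced here.

```python
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC
from sklearn.metrics import accuracy_score, precision_score, recall_score, f1_score

# Synthetic stand-in for extracted image features (e.g., colour/texture descriptors)
X, y = make_classification(n_samples=400, n_features=20, n_informative=8,
                           random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)

clf = SVC(kernel="rbf", C=10.0, gamma="scale").fit(X_tr, y_tr)  # SVM stage only
pred = clf.predict(X_te)
for name, fn in [("accuracy", accuracy_score), ("precision", precision_score),
                 ("recall", recall_score), ("f1", f1_score)]:
    print(f"{name}: {fn(y_te, pred):.2f}")
```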

Keywords: fruit plant disease, crop disease, machine learning, image processing, SVM-RBN

Procedia PDF Downloads 64
24495 The Underestimate of the Annual Maximum Rainfall Depths Due to Coarse Time Resolution Data

Authors: Renato Morbidelli, Carla Saltalippi, Alessia Flammini, Tommaso Picciafuoco, Corrado Corradini

Abstract:

A considerable part of the rainfall data used in hydrological practice is available in aggregated form within constant time intervals. This can produce undesirable effects, like the underestimate of the annual maximum rainfall depth, Hd, associated with a given duration, d, which is the basic quantity in the development of rainfall depth-duration-frequency relationships and in determining whether climate change is producing effects on extreme event intensities and frequencies. The errors in the evaluation of Hd from data characterized by a coarse temporal aggregation, ta, and a procedure to reduce the non-homogeneity of the Hd series are investigated here. Our results indicate that: 1) in the worst conditions, for d=ta, the estimation of a single Hd value can be affected by an underestimation error up to 50%, while the average underestimation error for a series with at least 15-20 Hd values is less than or equal to 16.7%; 2) the underestimation error values follow an exponential probability density function; 3) each very long time series of Hd contains many underestimated values; 4) relationships between the non-dimensional ratio ta/d and the average underestimate of Hd, derived from continuous rainfall data observed at many stations in Central Italy, may overcome this issue; 5) these equations should allow improvement of the Hd estimates and the associated depth-duration-frequency curves, at least in areas with similar climatic conditions.
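
The aggregation effect itself is easy to reproduce: the maximum over fixed, non-overlapping windows of width ta = d can never exceed the maximum over a sliding window of the same duration. The Python sketch below uses a synthetic one-minute rainfall record (an assumption for illustration, not the authors' Central Italy data) to show the resulting underestimate.

```python
import numpy as np

rng = np.random.default_rng(0)
minutes = 365 * 24 * 60                      # one year of one-minute rainfall (mm)
rain = np.where(rng.random(minutes) < 0.001, rng.exponential(1.5, minutes), 0.0)

d = 60                                       # duration of interest, in minutes
# True Hd: maximum over a sliding 60-minute window (continuous data)
csum = np.concatenate(([0.0], np.cumsum(rain)))
true_hd = np.max(csum[d:] - csum[:-d])

# Apparent Hd from data aggregated on fixed windows of width ta = d
aggregated = rain[: minutes // d * d].reshape(-1, d).sum(axis=1)
coarse_hd = aggregated.max()                 # always <= true_hd by construction

print(f"sliding-window Hd: {true_hd:.2f} mm")
print(f"fixed-window Hd:   {coarse_hd:.2f} mm "
      f"(underestimate: {100 * (1 - coarse_hd / true_hd):.1f}%)")
```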

Keywords: central Italy, extreme events, rainfall data, underestimation errors

Procedia PDF Downloads 191
24494 Carl Schmitt in the Age of Immanence: A Critical Reading

Authors: Manuel Iretzberger

Abstract:

This paper aims to uncover the ideological aspects in the political thought of Carl Schmitt, who is enjoying an ever-increasing popularity in various academic fields, following in the wake of rising interest in questions of sovereignty and legitimacy. Given Schmitt's biography, i.e. his role as the 'Crown Jurist of the Third Reich' (Gurian), an extraordinarily thorough examination is necessary; however, instead of merely 'deconstructing' his works, certain ontological truths he might have attained shall be taken seriously. To this end, his notions of politics and the state of exception are scrutinized, which are indeed considered intriguing, yet prove to be enigmatic and impalpable at the core when read closely. In order to explain this conjuncture, both Schmitt's philosophy of history and his 'secret discussion' (Agamben) with Walter Benjamin are depicted. As it turns out, it is argued, his concept of the Political has to be conceived of as embedded in a much broader context: in a post-transcendental, immanent age, he regards traditional modes of representation as no longer feasible and clings to authoritarianism as a surrogate; his Catholicism plays a decisive role here, forcing him to inject normatively biased assumptions into his political writings. Seeing Schmitt perform a 'rearguard action' not only serves to disarm his work of most of its menacing aura, it also allows drawing conclusions about ways of legitimizing democratic rule in modern times, as the paper tries to outline in its last section.

Keywords: Benjamin, history, immanence, Schmitt, sovereignty

Procedia PDF Downloads 294
24493 Objective Evaluation on Medical Image Compression Using Wavelet Transformation

Authors: Amhimmid Mohammed Saffour, Mustafa Mohamed Abdullah

Abstract:

The use of computers for handling image data in healthcare is growing. However, the amount of data produced by modern image-generating techniques is vast. This data might be a problem from a storage point of view or when the data is sent over a network. This paper uses the wavelet transform technique for medical image compression. MATLAB programs were designed to evaluate the medical image storage and transmission time problem at Sebha Medical Center, Libya. In this paper, three different computed tomography images, of the abdomen, brain, and chest, were selected and compressed using the wavelet transform. Objective evaluation has been performed to measure the quality of the compressed images. For this evaluation, the results show that the Peak Signal to Noise Ratio (PSNR), which indicates the quality of the compressed image, ranges from 25.89 dB to 34.35 dB for abdomen images, 23.26 dB to 33.3 dB for brain images, and 25.5 dB to 36.11 dB for chest images. These values show that a compression ratio of nearly 30:1 is acceptable.
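
The pipeline, decompose, discard small coefficients, reconstruct, then score with PSNR, can be sketched in a few lines. The version below uses Python with PyWavelets and a smooth synthetic image in place of the CT slices; the wavelet family, decomposition level, and the roughly 30:1 keep ratio are assumptions, not the authors' MATLAB settings.

```python
import numpy as np
import pywt

# Smooth synthetic "phantom" standing in for a CT slice
x, y = np.meshgrid(np.linspace(-1, 1, 256), np.linspace(-1, 1, 256))
image = 255 * np.exp(-4 * (x ** 2 + y ** 2))

# 3-level 2D wavelet decomposition; zero out all but ~1 in 30 coefficients
coeffs = pywt.wavedec2(image, "bior4.4", level=3)
arr, slices = pywt.coeffs_to_array(coeffs)
threshold = np.percentile(np.abs(arr), 96.7)
arr[np.abs(arr) < threshold] = 0.0
rec = pywt.waverec2(pywt.array_to_coeffs(arr, slices, output_format="wavedec2"),
                    "bior4.4")[:256, :256]

# Objective evaluation: PSNR = 10 log10(MAX^2 / MSE), MAX = 255 for 8-bit data
mse = np.mean((image - rec) ** 2)
print(f"PSNR: {10 * np.log10(255.0 ** 2 / mse):.2f} dB")
```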

Keywords: medical image, MATLAB, image compression, wavelets, objective evaluation

Procedia PDF Downloads 286
24492 Understanding the Qualitative Nature of Product Reviews by Integrating Text Processing Algorithm and Usability Feature Extraction

Authors: Cherry Yieng Siang Ling, Joong Hee Lee, Myung Hwan Yun

Abstract:

Usability has become a basic requirement for a product from the consumer's perspective, and failing this requirement ends up with the customer not using the product. Identifying usability issues by analyzing quantitative and qualitative data collected from usability testing and evaluation activities aids the process of product design, yet the lack of studies and research on analysis methodologies for qualitative text data in the usability field inhibits the potential of these data for more useful applications. The possibility of analyzing qualitative text data has grown with the rapid development of data analysis fields such as natural language processing, for understanding human language with computers, and machine learning, for providing predictive models and clustering tools. Therefore, this research aims to study the applicability of text processing algorithms in the analysis of qualitative text data collected from usability activities. This research utilized datasets collected from an LG neckband headset usability experiment, in which the datasets consist of headset survey text data, subject data, and product physical data. The analysis procedure, integrated with the text-processing algorithm, includes training comments into a vector space, labeling them with the subject and product physical feature data, and clustering to validate the result of comment vector clustering. The result shows 'volume and music control button' as the usability feature that matches best with the cluster of comment vectors, where centroid comments of one cluster emphasized button positions, while centroid comments of the other cluster emphasized button interface issues. When the volume and music control buttons were designed separately, the participants experienced less confusion, and thus the comments mentioned only the buttons' positions. In the situation where the volume and music control buttons were designed as a single button, the participants experienced interface issues regarding the buttons, such as the operating methods of functions and confusion between function buttons. The relevance of the cluster centroid comments to the extracted feature demonstrates the capability of text processing algorithms in analyzing qualitative text data from usability testing and evaluations.
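
The comment-vector clustering step can be sketched as TF-IDF vectorization followed by k-means, with the top-weighted centroid terms characterizing each cluster. The snippet below assumes scikit-learn and invents a handful of headset comments; it illustrates the approach, not the study's actual pipeline or data.

```python
import numpy as np
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.cluster import KMeans

# Invented headset comments standing in for the survey text data
comments = [
    "volume button is placed too far back on the band",
    "music control button position is hard to reach",
    "confused which button changes volume and which skips tracks",
    "single button for volume and music mixes up the functions",
    "button near the ear sits in a comfortable position",
    "pressing once or twice for different functions is confusing",
]

vec = TfidfVectorizer(stop_words="english")
X = vec.fit_transform(comments)                     # comments -> vector space
km = KMeans(n_clusters=2, n_init=10, random_state=0).fit(X)

# Top-weighted terms of each centroid summarize what the cluster is about
terms = np.array(vec.get_feature_names_out())
for i, centroid in enumerate(km.cluster_centers_):
    print(f"cluster {i}:", terms[centroid.argsort()[::-1][:4]])
```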

Keywords: usability, qualitative data, text-processing algorithm, natural language processing

Procedia PDF Downloads 285
24491 Differentiation between Different Rangeland Sites Using Principal Component Analysis in Semi-Arid Areas of Sudan

Authors: Nancy Ibrahim Abdalla, Abdelaziz Karamalla Gaiballa

Abstract:

Rangelands in semi-arid areas provide a good source of feed for huge numbers of animals and serve environmental, economic, and social purposes; therefore, these areas are considered economically very important for the pastoral sector in Sudan. This paper investigates the means of differentiating between rangeland sites according to soil types, using principal component analysis to assist in monitoring and assessment. Three rangeland sites were identified in the study area: a flat sandy site, a sand dune site, and a hard clay site. Principal component analysis (PCA) was used to reduce the number of factors needed to distinguish between rangeland sites and to produce a new set of data including the most useful spectral information for satellite image processing. It was performed using selected types of data (two vegetation indices, topographic data, and vegetation surface reflectance within the three bands of MODIS data). The PCA indicated a relatively high correspondence between vegetation and soil in the total variance of the data set. The results showed that principal component analysis with the selected variables revealed clear differences, reflected in the variance and eigenvalues, and can be used for differentiation between different range sites.
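
A sketch of this use of PCA, standardizing a small multivariate set and reading off eigenvalues and explained variance, is shown below; the six synthetic variables and three site profiles are invented stand-ins for the study's vegetation indices, topographic data, and MODIS reflectances.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
# Rows: samples from three hypothetical sites; columns: two vegetation
# indices, elevation, and three band reflectances (synthetic stand-ins)
site_means = np.array([[0.2, 0.3, 410, 0.12, 0.18, 0.25],
                       [0.4, 0.5, 395, 0.10, 0.15, 0.20],
                       [0.1, 0.2, 430, 0.15, 0.22, 0.30]])
X = np.vstack([m + rng.normal(0, 0.02 * np.abs(m) + 0.01, (50, 6))
               for m in site_means])

# Standardize, then inspect how much variance the leading components carry
pca = PCA(n_components=3).fit(StandardScaler().fit_transform(X))
print("eigenvalues:       ", pca.explained_variance_.round(2))
print("variance explained:", pca.explained_variance_ratio_.round(2))
```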

Keywords: principal component analysis, PCA, rangeland sites, semi-arid areas, soil types

Procedia PDF Downloads 186
24490 The Use of Optical-Radar Remotely-Sensed Data for Characterizing Geomorphic, Structural and Hydrologic Features and Modeling Groundwater Prospective Zones in Arid Zones

Authors: Mohamed Abdelkareem

Abstract:

Remote sensing data contribute to predicting prospective areas of water resources. Integration of microwave and multispectral data along with climatic, hydrologic, and geological data has been used here. In this article, Sentinel-2, Landsat-8 Operational Land Imager (OLI), Shuttle Radar Topography Mission (SRTM), Tropical Rainfall Measuring Mission (TRMM), and Advanced Land Observing Satellite (ALOS) Phased Array Type L-band Synthetic Aperture Radar (PALSAR) data were utilized to identify the geological, hydrologic, and structural features of Wadi Asyuti, a defunct tributary of the Nile basin in the eastern Sahara. The image transformation of Sentinel-2 and Landsat-8 data allowed characterizing the different varieties of rock units. Integration of microwave remotely-sensed data and GIS techniques provided information on the physical characteristics of catchments and rainfall zones that play a crucial role in mapping groundwater prospective zones. Fused Landsat-8 OLI and ALOS/PALSAR data improved the detection of structural elements that are difficult to reveal using optical data alone. Lineament extraction and interpretation indicated that the area is clearly shaped by a NE-SW graben that is cut by a NW-SE trend. Such structures allowed the accumulation of thick sediments in the downstream area. Processing of recent OLI data acquired on March 15, 2014, verified the flood potential maps and offered the opportunity to extract the extent of the flooding zone of the recent flash flood event (March 9, 2014), as well as revealing infiltration characteristics. Several layers, including geology, slope, topography, drainage density, lineament density, soil characteristics, rainfall, and morphometric characteristics, were combined after assigning a weight to each using a GIS-based knowledge-driven approach. The results revealed that the predicted groundwater potential zones (GPZs) can be arranged into six distinctive groups, depending on their probability for groundwater: very low, low, moderate, high, very high, and excellent. Field and well data validated the delineated zones.
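
The weighted-overlay step of such a knowledge-driven approach reduces to a weighted sum of normalized layers followed by classification into zones. The Python sketch below uses random arrays in place of real rasters, and the weights are hypothetical; the study's actual layer weights are not given in the abstract.

```python
import numpy as np

rng = np.random.default_rng(0)
shape = (100, 100)
# Normalized (0-1) thematic layers; real ones would be resampled rasters
layers = {"geology": rng.random(shape), "slope": rng.random(shape),
          "drainage_density": rng.random(shape),
          "lineament_density": rng.random(shape),
          "rainfall": rng.random(shape), "soil": rng.random(shape)}
weights = {"geology": 0.25, "slope": 0.15, "drainage_density": 0.15,
           "lineament_density": 0.20, "rainfall": 0.15, "soil": 0.10}

gpz_index = sum(weights[k] * layers[k] for k in layers)  # weighted overlay

# Classify the index into the six prospectivity groups by quantiles
classes = ["very low", "low", "moderate", "high", "very high", "excellent"]
bins = np.quantile(gpz_index, np.linspace(0, 1, 7)[1:-1])
gpz_class = np.digitize(gpz_index, bins)                 # values 0..5
for i, name in enumerate(classes):
    print(f"{name}: {(gpz_class == i).mean():.0%} of area")
```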

Keywords: GIS, remote sensing, groundwater, Egypt

Procedia PDF Downloads 98
24489 Intelligent Production Machine

Authors: A. Şahinoğlu, R. Gürbüz, A. Güllü, M. Karhan

Abstract:

This study aims at production machines that automatically perceive cutting data and alter cutting parameters. The two most important parameters to be checked in the machine control unit are the feed rate and the speeds. These parameters are to be controlled through the sounds of the machine. The features of the optimum sound are introduced to the computer. During the process, real-time data is received and converted into numerical values by Matlab software. According to these values, the feed and speeds decrease or increase at a certain rate, and thus the optimum sound is acquired. The cutting process is then carried out with the optimum cutting parameters. During chip removal, the features of the cutting tools, the kind of cut material, the cutting parameters, and the machine used affect various parameters. Instead of measuring the parameters that emerge during the cutting process, such as temperature, vibration, and tool wear, detailed analysis of the sound emitted during cutting provides detection of the various data involved in the cutting process in a much easier and more economical way. The relation between cutting parameters and sound is being identified.
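
A toy feedback loop of this kind, extract a sound feature per frame and nudge the feed rate toward the level associated with the optimum sound, can be sketched as below. The RMS feature, gain, and feed limits are invented for illustration; the study's Matlab processing is not reproduced.

```python
import numpy as np

def adjust_feed(audio_frame, feed, target_rms=0.2, gain=0.05,
                feed_min=50.0, feed_max=500.0):
    """Nudge the feed rate so the cutting sound approaches the optimum level."""
    rms = np.sqrt(np.mean(audio_frame ** 2))          # sound feature of this frame
    feed *= 1 + gain * np.clip((target_rms - rms) / target_rms, -1, 1)
    return float(np.clip(feed, feed_min, feed_max)), rms

rng = np.random.default_rng(0)
feed = 200.0                                          # mm/min, hypothetical start
for step in range(5):
    frame = 0.3 * rng.standard_normal(1024)           # stand-in for a mic capture
    feed, rms = adjust_feed(frame, feed)
    print(f"step {step}: rms={rms:.3f} feed={feed:.1f} mm/min")
```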

Keywords: cutting process, sound processing, intelligent lathe, sound analysis

Procedia PDF Downloads 334
24488 The Effectiveness and Accuracy of the Schulte Holt IOL Toric Calculator Processor in Comparison to Manually Input Data into the Barrett Toric IOL Calculator

Authors: Gabrielle Holt

Abstract:

This paper seeks to establish the efficacy of the Schulte Holt IOL Toric Calculator Processor (Schulte Holt ITCP). The study compares data inputted manually into the Barrett Toric Calculator, recording the number of minutes taken to complete the toric calculations, the number of errors identified during completion, and distractions during completion, against the number of minutes taken by the Schulte Holt ITCP, which also uses the Barrett method, and the number of errors identified in the Schulte Holt ITCP. The data clearly demonstrate a substantial advantage for the Schulte Holt ITCP, which notably reduces the time spent on toric calculations as well as the number of errors. With the ever-growing number of cataract surgeries taking place around the world and waitlists increasing, the Schulte Holt IOL Toric Calculator Processor may well demonstrate a way forward to increase the availability of ophthalmologists and ophthalmic staff while maintaining patient safety.

Keywords: Toric, toric lenses, ophthalmology, cataract surgery, toric calculations, Barrett

Procedia PDF Downloads 94
24487 Change Point Detection Using Random Matrix Theory with Application to Frailty in Elderly Individuals

Authors: Malika Kharouf, Aly Chkeir, Khac Tuan Huynh

Abstract:

Detecting change points in time series data is a challenging problem, especially in scenarios where there is limited prior knowledge regarding the data's distribution and the nature of the transitions. We present a method designed for detecting changes in the covariance structure of high-dimensional time series data, where the number of variables closely matches the data length. Our objective is to achieve unbiased test statistic estimation under the null hypothesis. We draw on Random Matrix Theory to analyze the behavior of our test statistic in this high-dimensional context. Specifically, we show that our test statistic converges pointwise to a normal distribution under the null hypothesis. To assess the effectiveness of our proposed approach, we conduct evaluations on a simulated dataset. Furthermore, we employ our method to detect changes associated with frailty in elderly individuals.
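
The setting can be illustrated with a simple windowed spectral statistic, tracking the largest eigenvalue of the sample covariance as a window slides, in the regime where the dimension is comparable to the window length. The sketch below is only an illustration of that regime, not the authors' RMT-calibrated test statistic; the simulated covariance shift is an assumption.

```python
import numpy as np

def max_eig_stat(window):
    """Largest eigenvalue of the sample covariance of one data window."""
    c = np.cov(window, rowvar=False)
    return np.linalg.eigvalsh(c)[-1]          # eigvalsh returns ascending order

rng = np.random.default_rng(0)
p, n = 40, 50                                 # dimension close to window length
before = rng.standard_normal((200, p))
after = rng.standard_normal((200, p)) @ np.diag(np.linspace(1, 2, p))  # new cov
series = np.vstack([before, after])

stats = [max_eig_stat(series[t:t + n]) for t in range(0, len(series) - n, 10)]
# A jump in the statistic around sample 200 flags the covariance change point
print(np.round(stats, 2))
```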

Keywords: change point detection, hypothesis tests, random matrix theory, frailty in elderly

Procedia PDF Downloads 54
24486 Main Cause of Children's Deaths in Indigenous Wayuu Community from Department of La Guajira: A Research Developed through Data Mining Use

Authors: Isaura Esther Solano Núñez, David Suarez

Abstract:

The main purpose of this research is to discover what causes death in children of the Wayuu community and to deeply analyze those results in order to take corrective measures to properly control infant mortality. We consider it important to determine the reasons producing early death in this specific population, since they are the most vulnerable to high-risk environmental conditions. In this way, the government, through competent authorities, may develop prevention policies and the right measures to avoid an increase in this tragic toll. The methodology used to develop this investigation is data mining, which consists of gathering and examining large amounts of data to produce new and valuable information. Through this technique, it has been possible to determine that the child population is dying mostly from malnutrition. In short, this technique has been very useful for developing this study; it has allowed us to transform large amounts of information into a conclusive and important statement, which has made it easier to take appropriate steps to resolve a particular situation.

Keywords: malnutrition, data mining, analytical, descriptive, population, Wayuu, indigenous

Procedia PDF Downloads 159
24485 Application of the Mobile Phone for Occupational Self-Inspection Program in Small-Scale Industries

Authors: Jia-Sin Li, Ying-Fang Wang, Cheing-Tong Yan

Abstract:

In this study, an integrated approach using Google Spreadsheet and QR codes, which are free internet resources, was used to improve the inspection procedure. A mobile phone application (App) was also designed to combine with a web page to create an automatic checklist, providing a new integrated inspection management system. By means of a client-server model, the client App is developed for the Android mobile OS, and the back end is a web server. It can set up App accounts, including authorization data, and store checklist documents on the website. From the checklist document URL, a QR code is generated, then printed and pasted on the machine. The user can scan the QR code with the App and fill in the checklist in the factory. In the meanwhile, the checklist data are sent to the server, which not only saves the filled data but also executes the related functions and charts. On the other hand, it also enables auditors and supervisors to facilitate the prevention of and response to hazards, as well as immediate checks of the reported data. Finally, statistics and professional analysis are performed using inspection records and other relevant data, not only to improve the reliability and integrity of inspection operations and equipment loss control, but also to increase plant safety and personnel performance. Therefore, it is suggested that the traditional paper-based inspection method could be replaced by the App, which promotes industrial safety and reduces human error.
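
Generating the machine-side QR code from a checklist URL takes only a few lines; the sketch below assumes the Python `qrcode` package, and the spreadsheet URL is a placeholder, not the study's actual document.

```python
import qrcode  # pip install qrcode[pil]

# Hypothetical checklist URL (a published Google Spreadsheet would go here)
checklist_url = "https://docs.google.com/spreadsheets/d/EXAMPLE_ID/edit"

img = qrcode.make(checklist_url)        # encode the checklist URL as a QR code
img.save("machine_checklist_qr.png")    # print this and paste it on the machine
```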

Keywords: checklist, Google spreadsheet, APP, self-inspection

Procedia PDF Downloads 118
24484 Industry 4.0 and Supply Chain Integration: Case of Tunisian Industrial Companies

Authors: Rym Ghariani, Ghada Soltane, Younes Boujelbene

Abstract:

Industry 4.0, a set of emerging smart and digital technologies, has been the main focus of operations management researchers and practitioners in recent years. The objective of this research paper is to study the impact of Industry 4.0 on supply chain integration (SCI) in Tunisian industrial companies. A conceptual model was designed to study the relationship between Industry 4.0 technologies and supply chain integration. This model contains three explanatory variables (big data, Internet of Things, and robotics) and one explained variable (supply chain integration). In order to answer our research questions and investigate the research hypotheses, principal component analysis and discriminant analysis were performed using SPSS 26 software. The results reveal that there is a statistically significant positive impact of Industry 4.0 (big data, Internet of Things, and robotics) on supply chain integration. Interestingly, big data has a greater positive impact on supply chain integration than the Internet of Things and robotics.

Keywords: industry 4.0 (I4.0), big data, internet of things, robotics, supply chain integration

Procedia PDF Downloads 60
24483 Social Structure of Corporate Social Responsibility Programme in Pantai Harapan Jaya Village, Bekasi Regency, West Java

Authors: Auliya Adzilatin Uzhma, Ismu Rini Dwi, I. Nyoman Suluh Wijaya

Abstract:

The Corporate Social Responsibility (CSR) programme in Pantai Harapan Jaya village comprises mangrove cultivation and fishery capital distribution; to achieve its goal, the CSR programme needs participation from society. Moeliono, in Fahrudin (2011), mentions that participation from society is based on intrinsic reasons from within people themselves and extrinsic reasons from others related to them. The fundamental connections that set boundaries on the actions an organization can take are called the social structure. The purpose of this research is to identify the form of public participation and the social structure typology of the villagers and of the people who participated in the CSR programme. The key actors of the society and the key actors among the participants can also be identified. This research uses the Social Network Analysis method, determining the rate of participation, density, and centrality. The result of the research is that the people involved in the programme live in Dusun Pondok Dua and work in the fisheries field. The density value for the participants is 0.516, meaning that 51.6% of the participants are involved in the same step of the CSR programme.
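
Density and key-actor measures of this kind are straightforward to compute with networkx, as the sketch below shows on an invented participation graph; the node names and ties are placeholders, not the study's data.

```python
import networkx as nx

# Hypothetical ties between participants who took part in the same CSR step
edges = [("A", "B"), ("A", "C"), ("B", "C"), ("C", "D"),
         ("D", "E"), ("C", "E"), ("B", "E")]
G = nx.Graph(edges)

print(f"density: {nx.density(G):.3f}")            # share of possible ties present
centrality = nx.degree_centrality(G)
key_actor = max(centrality, key=centrality.get)   # most connected participant
print("key actor:", key_actor, centrality)
```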

Keywords: social structure, social network analysis, corporate social responsibility, public participation

Procedia PDF Downloads 480
24482 Bilateral Telecontrol of AutoMerlin Mobile Robot Using Time Domain Passivity Control

Authors: Aamir Shahzad, Hubert Roth

Abstract:

This paper presents the bilateral telecontrol of the AutoMerlin mobile robot under communication delay. Passivity observers have been designed to monitor the net energy at both ports of a two-port network; if either or both ports become active, making the net energy negative, the passivity controllers dissipate the proper amount of energy to make the overall system passive in the presence of time delay. The environment force is modeled and sent back to the human operator so that s/he can feel it and have additional information about the environment in the vicinity of the mobile robot. Experimental results are presented to show the performance and stability of the bilateral controller. The results show that whenever the passivity observers detect active behavior, the passivity controllers come into action to neutralize it and render the overall system passive.
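
A discrete-time sketch of the observer/controller pair at one port is given below: the observer integrates the force-velocity power flow, and when the accumulated energy goes negative, a variable damper dissipates exactly the deficit. The synthetic force/velocity signals and gains are assumptions for illustration, not the AutoMerlin experiment.

```python
import numpy as np

def simulate_port(force, velocity, dt):
    """Time-domain passivity observer + controller at one network port."""
    energy, activations = 0.0, 0
    for f, v in zip(force, velocity):
        energy += f * v * dt                    # observer: accumulate net energy
        if energy < 0 and abs(v) > 1e-6:        # port became active
            alpha = -energy / (v * v * dt)      # controller: variable damping
            energy += alpha * v * v * dt        # dissipates exactly the deficit
            activations += 1
    return energy, activations

t = np.linspace(0, 2, 2000)
velocity = np.sin(2 * np.pi * t)
force = -0.5 * velocity + 0.1 * np.sin(5 * t)   # partly active synthetic port data
energy, activations = simulate_port(force, velocity, dt=t[1] - t[0])
print(f"final energy: {energy:.4f} J, controller activations: {activations}")
```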

Keywords: bilateral control, human operator, haptic device, communication network, time domain passivity control, passivity observer, passivity controller, time delay, mobile robot, environment force

Procedia PDF Downloads 393
24481 Analysing Competitive Advantage of IoT and Data Analytics in Smart City Context

Authors: Petra Hofmann, Dana Koniel, Jussi Luukkanen, Walter Nieminen, Lea Hannola, Ilkka Donoghue

Abstract:

The Covid-19 pandemic forced people to isolate and become physically less connected. The pandemic has not only reshaped people's behaviours and needs but also accelerated digital transformation (DT). DT of cities has become an imperative, with the outlook of converting them into smart cities in the future. Embedding digital infrastructure and smart city initiatives as part of the normal design, construction, and operation of cities provides a unique opportunity to improve the connection between people. The Internet of Things (IoT) is an emerging technology and one of the drivers of DT. It has disrupted many industries by introducing different services and business models, and IoT solutions are being applied in multiple fields, including smart cities. As IoT and data are fundamentally linked together, IoT solutions can only create value if the data generated by the IoT devices are analysed properly. By extracting relevant conclusions and actionable insights using established techniques, data analytics contributes significantly to the growth and success of IoT applications and investments. Companies must grasp DT and be prepared to redesign their offerings and business models to remain competitive in today's marketplace. As there are many IoT solutions available today, the amount of data is tremendous. The challenge for companies is to understand which solutions to focus on, how to prioritise, and which data to use to differentiate from the competition. This paper explains how IoT and data analytics can impact competitive advantage and how companies should approach IoT and data analytics to translate them into concrete offerings and solutions in the smart city context. The study was carried out as qualitative, literature-based research. A case study is provided to validate the preservation of a company's competitive advantage through smart city solutions. The research contribution provides insights into the different factors and considerations related to creating competitive advantage through IoT and data analytics deployment in the smart city context. Furthermore, this paper proposes a framework that merges the factors and considerations with examples of offerings and solutions in smart cities. The data collected through IoT devices, and the intelligent use of those data, can create competitive advantage for companies operating in the smart city business. Companies should take into consideration the five forces of competition that shape industries and pay attention to the technological, organisational, and external contexts that define the factors for consideration of competitive advantage in the field of IoT and data analytics. Companies that can utilise these key assets in their businesses will most likely conquer the markets and have a strong foothold in the smart city business.

Keywords: data analytics, smart cities, competitive advantage, internet of things

Procedia PDF Downloads 134
24480 Best Season for Seismic Survey in Zaria Area, Nigeria: Data Quality and Implications

Authors: Ibe O. Stephen, Egwuonwu N. Gabriel

Abstract:

Variations in seismic P-wave velocity and depth resolution resulting from variations in subsurface water saturation were investigated in this study in order to determine the season of the year that gives the most reliable P-wave velocity and depth resolution of the subsurface in the Zaria area, Nigeria. A 2D seismic refraction tomography technique involving an ABEM Terraloc MK6 seismograph was used to collect data across a borehole with a standard log, with the centre of the spread situated at the borehole site. Using the same parameters, this procedure was repeated along the same spread at least once a month for at least eight months a year for four years. The timing of each survey depended on when there was significant variation in the rainfall data. The seismic data collected were tomographically inverted. The results suggested that the average P-wave velocity ranges of the subsurface in the area are generally higher when the ground is wet than when it is dry. The results also suggested that an overburden of about 9.0 m in thickness, a weathered basement of about 14.0 m in thickness, and a fractured basement at a depth of about 23.0 m best fitted the borehole log. This best fit was consistently obtained in the months between March and May, when the average total rainfall was about 44.8 mm in the area. The results also showed that the velocity ranges in both dry and wet formations fall within the standard ranges provided in the literature. In terms of velocity, this study has not clearly distinguished the quality of the seismic data obtained when the subsurface was dry from that of the data collected when it was wet. It was concluded that, for more detailed and reliable seismic studies in the Zaria area and environs with similar climatic conditions, surveys are best conducted between March and May. The most reliable seismic data for depth resolution are most likely obtainable in the area between March and May.

Keywords: best season, variations in depth resolution, variations in P-wave velocity, variations in subsurface water saturation, Zaria area

Procedia PDF Downloads 290
24479 Quick Sequential Search Algorithm Used to Decode High-Frequency Matrices

Authors: Mohammed M. Siddeq, Mohammed H. Rasheed, Omar M. Salih, Marcos A. Rodrigues

Abstract:

This research proposes a data encoding and decoding method based on the Matrix Minimization algorithm. The algorithm is applied to high-frequency coefficients for compression/encoding. It starts by converting every three coefficients to a single value; this is accomplished based on three different keys. The decoding/decompression uses a search method called the QSS (Quick Sequential Search) decoding algorithm, presented in this research and based on sequential search, to recover the exact coefficients. In the next step, the decoded data are saved in an auxiliary array. The basic idea behind the auxiliary array is to save all possible decoded coefficients; this is because another algorithm, such as conventional sequential search, could retrieve encoded/compressed data independently of the proposed algorithm. The experimental results showed that our proposed decoding algorithm retrieves original data faster than conventional sequential search algorithms.
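
A toy version of the encode/search pair is sketched below: three bounded coefficients are folded into one value using three keys, and decoding searches the bounded range for the matching triplet. The key values and coefficient range are assumptions chosen so that decoding is unique; the authors' key generation and auxiliary-array optimizations are not reproduced.

```python
from itertools import product

K1, K2, K3 = 1.0, 37.1, 1381.7       # hypothetical keys, not the authors' values
LOW, HIGH = -10, 10                  # assumed range of quantised coefficients

def encode(a, b, c):
    return K1 * a + K2 * b + K3 * c  # three coefficients -> one value

def qss_decode(value, eps=1e-6):
    """Sequential search over the bounded coefficient range."""
    for a, b, c in product(range(LOW, HIGH + 1), repeat=3):
        if abs(encode(a, b, c) - value) < eps:
            return a, b, c           # candidates could be cached in an auxiliary array
    raise ValueError("no match in search range")

v = encode(3, -7, 5)
print(qss_decode(v))   # -> (3, -7, 5)
```

The keys are spaced so that each coefficient's contribution cannot be confused with the others' combined range, which is what makes the recovered triplet exact rather than approximate.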

Keywords: matrix minimization algorithm, decoding sequential search algorithm, image compression, DCT, DWT

Procedia PDF Downloads 150
24478 Cleaning of Scientific References in Large Patent Databases Using Rule-Based Scoring and Clustering

Authors: Emiel Caron

Abstract:

Patent databases contain patent-related data, organized in a relational data model, and are used to produce various patent statistics. These databases store raw data about scientific references cited by patents. For example, Patstat holds references to tens of millions of scientific journal publications and conference proceedings. These references might be used to connect patent databases with bibliographic databases, e.g. to study the relation between science, technology, and innovation in various domains. Problematic in such studies is the low data quality of the references, i.e. they are often ambiguous, unstructured, and incomplete. Moreover, a complete bibliographic reference is stored in only one attribute. Therefore, a computerized cleaning and disambiguation method for large patent databases is developed in this work. The method uses rule-based scoring and clustering. The rules are based on bibliographic metadata, retrieved from the raw data by regular expressions, and are transparent and adaptable. The rules, in combination with string similarity measures, are used to detect pairs of records that are potential duplicates. Due to the scoring, different rules can be combined to join scientific references, i.e. the rules reinforce each other. The scores are based on expert knowledge and initial method evaluation. After the scoring, pairs of scientific references that score above a certain threshold are clustered by means of a single-linkage clustering algorithm to form connected components. The method is designed to disambiguate all the scientific references in the Patstat database. The performance evaluation of the clustering method, on a large golden set with highly cited papers, shows on average a 99% precision and a 95% recall. The method is therefore accurate but careful, i.e. it weighs precision over recall. Consequently, separate clusters of high precision are sometimes formed when there is not enough evidence for connecting scientific references, e.g. in the case of missing year and journal information for a reference. The clusters produced by the method can be used to directly link the Patstat database with bibliographic databases such as the Web of Science or Scopus.
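
The overall shape of such a pipeline, regex-extracted metadata feeding reinforcing rule scores, a threshold on pair scores, then single-linkage grouping via connected components, can be sketched compactly. The rules, weights, and threshold below are invented placeholders, far simpler than the paper's expert-tuned rules.

```python
import re
from difflib import SequenceMatcher

refs = [
    "Smith J, Nature, 1999, vol 401, p 23",
    "J. Smith (1999) Nature 401:23",
    "Smith J., Nature 401 (1999) 23",
    "Brown A, Science, 2001, vol 291",
]

def years(ref):
    return set(re.findall(r"\b(?:19|20)\d{2}\b", ref))   # metadata via regex

def tokens(ref):
    return set(re.findall(r"[a-z]{4,}", ref.lower()))    # author/journal words

def score(a, b):
    s = 0
    if SequenceMatcher(None, a.lower(), b.lower()).ratio() > 0.6:
        s += 2                                           # overall string similarity
    if years(a) & years(b):
        s += 1                                           # shared publication year
    if tokens(a) & tokens(b):
        s += 1                                           # shared name/journal token
    return s                                             # rules reinforce each other

# Single-linkage clustering: union-find over pairs above the threshold
parent = list(range(len(refs)))
def find(i):
    while parent[i] != i:
        parent[i] = i = parent[parent[i]]
    return i

THRESHOLD = 2
for i in range(len(refs)):
    for j in range(i + 1, len(refs)):
        if score(refs[i], refs[j]) >= THRESHOLD:
            parent[find(i)] = find(j)

clusters = {}
for i in range(len(refs)):
    clusters.setdefault(find(i), []).append(refs[i])
print(list(clusters.values()))   # Smith variants cluster; Brown stays alone
```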

Keywords: clustering, data cleaning, data disambiguation, data mining, patent analysis, scientometrics

Procedia PDF Downloads 194
24477 Changes in Global DNA Methylation and DNA Damage in Two Tumor Cell Lines Treated with Silver and Gold Nanoparticles

Authors: Marcin Kruszewski, Barbara Sochanowicz, Sylwia Męczyńska-Wielgosz, Maria Wojewódzka, Lucyna Kapka-Skrzypczak

Abstract:

Metallic NPs are widely used in a number of applications in industry, science, and medicine. Among the metallic NPs foreseen to be widely used in medicine are gold nanoparticles (AuNPs), due to their low toxicity, and silver nanoparticles (AgNPs), due to their strong antimicrobial activity. In this study, we compared the effects of AgNPs and AuNPs on the formation of DNA damage and on global DNA methylation in the A2780 and 4T1 cell lines, widely used models of human ovarian carcinoma and murine mammary carcinoma, respectively. The cells were treated with AgNPs coated with citrate (AgNPs(cit)) or PEG (AgNPs(PEG)), or with AuNPs. Global DNA methylation was investigated with ELISA, whereas the formation of DNA damage was investigated by a comet assay +/- FPG. AgNPs decreased global DNA methylation and increased the formation of DNA lesions in both cell lines. The effect was dependent on the type of NPs used, their coating, and the cell line used. In conclusion, the epigenetic and genotoxic effects of NPs strongly depend on NP nature and cellular context. The epigenetic changes observed upon the action of AgNPs may play a crucial role in NP-induced changes in protein expression.

Keywords: DNA damage, gold nanoparticles, methylation, silver nanoparticles

Procedia PDF Downloads 135
24476 Business Skills Laboratory in Action: Combining a Practice Enterprise Model and an ERP-Simulation to a Comprehensive Business Learning Environment

Authors: Karoliina Nisula, Samuli Pekkola

Abstract:

Business education has been criticized for being too theoretical and distant from business life. Different types of experiential learning environments, ranging from manual role-play to computer simulations and enterprise resource planning (ERP) systems, have been used to introduce realistic and practical experience into business learning. Each of these learning environments approaches business learning from a different perspective. The implementations tend to be individual exercises supplementing the traditional courses. We suggest combining them into a business skills laboratory resembling an actual workplace. In this paper, we present a concrete implementation of an ERP-supported business learning environment that is used throughout the first-year undergraduate business curriculum. We validate the implementation by evaluating the learning outcomes through the different domains of Bloom's taxonomy. We use the role-play-oriented practice enterprise model as a comparison group. Our findings indicate that using the ERP simulation improves the poor and average students' lower-level cognitive learning. On the affective domain, the ERP simulation appears to enhance motivation to learn as well as the perceived acquisition of practical hands-on skills.

Keywords: business simulations, experiential learning, ERP systems, learning environments

Procedia PDF Downloads 259
24475 Probiotics as an Alternative to Antibiotic Use in Pig Production

Authors: Z. C. Dlamini, R. L. S. Langa, A. I. Okoh, O. A. Aiyegoro

Abstract:

The indiscriminate usage of antibiotics in swine production has consequential outcomes, such as the development of bacterial resistance to prophylactic antibiotics and the possibility of antibiotic residues in animal products. The use of probiotics appears to be the most effective alternative, with positive metabolic and nutritional implications. The aim of this study was to investigate the efficacy of probiotic bacteria (Lactobacillus reuteri ZJ625, Lactobacillus reuteri VB4, Lactobacillus salivarius ZJ614 and Streptococcus salivarius NBRC13956) administered as direct-fed microorganisms in weaned piglets. 45 weaned piglets, blocked by weight, were divided into 5 treatment groups: diet with antibiotic, diet with no antibiotic and no probiotic, diet with probiotic, and diet with a combination of probiotics. Piglet performance was monitored during the trials. Faecal and ileum samples were collected for microbial count analysis. Blood samples were collected from the pigs at the end of the trial for analysis of haematological and biochemical parameters and IgG stimulation. The data were analysed by split-plot ANOVA using SAS statistical software (SAS 9.3, 2003). A difference was observed between treatments for daily weight and feed conversion ratio. No difference was observed in the analysis of faecal samples with regard to bacterial counts; a difference was observed in ileum samples, with the enteric bacteria colony-forming units being lower in the P2 treatment group as compared with lactic acid and total bacteria. With the exception of globulin and albumin, biochemical blood parameters were not affected; likewise for haematology, only basophils and segmented neutrophils differed, with higher concentrations in the NC treatment group as compared with the other treatment groups. Moreover, in the IgG stimulation analysis, a difference was also observed, with the P2 treatment group having a higher concentration of IgG as compared to the other groups. The results of this study suggest that probiotics have a beneficial effect on the growth performance, blood parameters, and IgG stimulation of pigs, and are most effective when administered in synergy. This means that it is most likely that these probiotics will offer a significant benefit in pig farming by reducing the risk of morbidity and mortality and producing quality meat that is more affordable to poorer communities, thereby enhancing the South African pig industry's economy. In addition, these results indicate that more research still needs to be done on probiotics with regard to, e.g., dosage, shelf life, and mechanism of action.

Keywords: antibiotics, biochemistry, haematology, IgG-stimulation, microbial count, probiotics

Procedia PDF Downloads 301
24474 A Human Centered Design of an Exoskeleton Using Multibody Simulation

Authors: Sebastian Kölbl, Thomas Reitmaier, Mathias Hartmann

Abstract:

Trial-and-error approaches to adapting wearable support structures to human physiology are time-consuming and elaborate. However, during preliminary design, the focus lies on understanding the interaction between the exoskeleton and the human body in terms of forces and moments, namely body mechanics. For the study at hand, a multi-body simulation approach has been enhanced to evaluate actual forces and moments in a human dummy model with and without a digital mock-up of an active exoskeleton. Therefore, different motion data have been gathered and processed to perform a musculoskeletal analysis. The motion data are ground reaction forces, electromyography data (EMG), and human motion data recorded with a marker-based motion capture system. Based on the experimental data, the response of the human dummy model has been calibrated. Subsequently, the scalable human dummy model, in conjunction with the motion data, is connected with the exoskeleton structure. The results of the human-machine interaction (HMI) simulation platform are, in particular, the resulting contact forces and human joint forces, to be compared with admissible values with regard to human physiology. Furthermore, it provides feedback for the sizing of the exoskeleton structure in terms of resulting interface forces (stress justification) and the effect of its compliance. A stepwise approach for the setup and validation of the modeling strategy is presented, and the potential for a more time- and cost-effective development of wearable support structures is outlined.

Keywords: assistive devices, ergonomic design, inverse dynamics, inverse kinematics, multibody simulation

Procedia PDF Downloads 162
24473 Evaluation of the Nursing Management Course in Undergraduate Nursing Programs of State Universities in Turkey

Authors: Oznur Ispir, Oya Celebi Cakiroglu, Esengul Elibol, Emine Ceribas, Gizem Acikgoz, Hande Yesilbas, Merve Tarhan

Abstract:

This study was conducted to evaluate the academic staff teaching the 'Nursing Management' course in the undergraduate nursing programs of the state universities in Turkey and to assess the current content of the course. The design of the study is descriptive. The population of the study consists of the seventy-eight undergraduate nursing programs in the state universities in Turkey. A questionnaire/survey prepared by the researchers was used as the data collection tool. The data were obtained by screening the content of the websites of nursing education programs between March and May 2016. Descriptive statistics were used to analyze the data. The research indicated that 58% of the undergraduate nursing programs from which the data were derived were part of a school of health, 81% of the academic staff had graduated from undergraduate nursing programs, 40% worked as lecturers, and 37% specialized in a field other than nursing. The research also indicated that the above-mentioned course was included in 98% of the programs from which it was possible to obtain data. The full name of the course was 'Nursing Management' in 95% of the programs, and 98% stated that the course was compulsory. Theory and application hours were 3.13 and 2.91, respectively. Moreover, the content of the course was not shared in 65% of the programs reviewed. This study demonstrated that the experience and expertise of the academic staff teaching the 'Nursing Management' course were not sufficient in the management area, and that the schedule and content of the course were not sufficient, although many nursing education programs provided the course. Comparison between the curricula of the course revealed significant differences.

Keywords: nursing, nursing management, nursing management course, undergraduate program

Procedia PDF Downloads 358