Search results for: fair data principles
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 26528

25298 A Contrastive Analysis of English and Ukwuani Front Vowels

Authors: Omenogor, Happy Dumbi

Abstract:

This paper examines the areas of convergence and divergence between the English and Ųkwųanį (a language spoken in Nigeria) vowel systems, with particular emphasis on the front vowels. It identifies areas of difficulty for average Ųkwųanį users of English, i.e., Ųkwųanį L1 speakers who use English as a second language. The paper explains the nature of contrastive analysis, the geographical locations where Ųkwųanį is spoken as a mother tongue, and the English and Ųkwųanį front vowels. The principles of establishing phonemes, minimal pairs in Ųkwųanį, and the vowel charts of both languages are among the issues highlighted in this paper.

Keywords: convergence, divergence, English, Ųkwųanį

Procedia PDF Downloads 480
25297 The Flipped Classroom Used in Business Curricula

Authors: Hedia Mhiri Sellami

Abstract:

This case study applied the principles of the flipped classroom (FC) in courses dealing with the use of Information and Communication Technology (ICT) in three business curricula. The FC was used because our first goal was to devote more class time to practicing theoretical concepts, so before each session students had to watch videos introducing the concept they would learn. The videos were not designed for our course; they are available on YouTube and correspond to real cases of ICT use in companies. This choice was also made to meet our second goal: to motivate students by showing them that the aspects covered by the course are very useful in business. This case study reinforced the positive reputation of the FC, as it was appreciated overall by our students. Besides, we managed to achieve our objectives relating to the motivation and application of the concepts studied.

Keywords: flipped classroom, business, ICT, video, learning

Procedia PDF Downloads 283
25296 A Survey on Data-Centric and Data-Aware Techniques for Large Scale Infrastructures

Authors: Silvina Caíno-Lores, Jesús Carretero

Abstract:

Large scale computing infrastructures have been widely developed with the core objective of providing a suitable platform for high-performance and high-throughput computing. These systems are designed to support resource-intensive and complex applications, which can be found in many scientific and industrial areas. Currently, large scale data-intensive applications are hindered by the high latencies that result from access to vastly distributed data. Recent works have suggested that improving data locality is key to moving towards exascale infrastructures efficiently, as solutions to this problem aim to reduce the bandwidth consumed in data transfers and the overheads that arise from them. Several techniques attempt to move computations closer to the data. In this survey we analyse the different mechanisms that have been proposed to provide data locality for large scale high-performance and high-throughput systems. This survey intends to assist the scientific computing community in understanding the various technical aspects and strategies reported in recent literature regarding data locality. As a result, we present an overview of locality-oriented techniques, grouped into four main categories: application development, task scheduling, in-memory computing, and storage platforms. Finally, the authors include a discussion on future research lines and synergies among the former techniques.

Keywords: data locality, data-centric computing, large scale infrastructures, cloud computing

Procedia PDF Downloads 255
25295 Wind Speed Data Analysis in Colombia in 2013 and 2015

Authors: Harold P. Villota, Alejandro Osorio B.

Abstract:

Energy meteorology is a field that studies energy complementarity and the use of renewable sources in interconnected systems. To diversify the energy matrix in Colombia with wind sources, it is necessary to understand the available data. However, the time series provided by 260 automatic weather stations contain empty and invalid records, so the purpose of this work is to fill the time series, selecting two years to characterize and impute, and to use them as a basis for completing the data between 2005 and 2020.
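The imputation step described above can be sketched minimally. This is an illustrative example only, not the authors' method: it fills short internal gaps (marked `None`) in a wind-speed series by linear interpolation between the nearest valid neighbours, and the values are invented.

```python
# Hypothetical sketch: fill short gaps (None) in a wind-speed time series
# by linear interpolation between the nearest valid neighbours.
def interpolate_gaps(series):
    out = list(series)
    i = 0
    while i < len(out):
        if out[i] is None:
            j = i
            while j < len(out) and out[j] is None:
                j += 1                      # j = first valid index after the gap
            if 0 < i and j < len(out):      # only fill gaps bounded on both sides
                left, right = out[i - 1], out[j]
                span = j - i + 1
                for k in range(i, j):
                    out[k] = left + (right - left) * (k - i + 1) / span
            i = j
        else:
            i += 1
    return out

wind = [3.0, None, None, 6.0, 5.5, None, 4.5]
filled = interpolate_gaps(wind)  # [3.0, 4.0, 5.0, 6.0, 5.5, 5.0, 4.5]
```

Gaps at the start or end of the series are left untouched, since they have no bounding neighbour to interpolate from.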

Keywords: complementarity, wind speed, renewable energy, Colombia, characterization, imputation

Procedia PDF Downloads 161
25294 Industrial Process Mining Based on Data Pattern Modeling and Nonlinear Analysis

Authors: Hyun-Woo Cho

Abstract:

Unexpected events may occur with serious impacts on industrial processes. This work utilizes a data representation technique to model and analyze process data patterns for the purpose of diagnosis. The use of a triangular representation of process data is evaluated on a simulated process. Furthermore, the effect of different pre-treatment techniques, based on linear or nonlinear reduced spaces, was compared. This work extracted the fault pattern in the reduced space, not in the original data space. The results show that the diagnosis method based on the nonlinear technique produces more reliable results and outperforms the linear method.

Keywords: process monitoring, data analysis, pattern modeling, fault, nonlinear techniques

Procedia PDF Downloads 384
25293 Protecting Labor Rights in the Platform Economy: Legal Challenges and Innovative Explorations

Authors: Ruwen Pei

Abstract:

In the rapidly evolving landscape of the digital economy, platform employment has emerged as a transformative labor force, fundamentally altering the traditional paradigms of the employer-employee relationship. This paper provides a comprehensive analysis of the unique dynamics and intricate legal challenges associated with platform work, where workers often navigate precarious labor conditions without the robust safety nets typically afforded in traditional industries. It underscores the limitations of current labor regulations, particularly in addressing pressing concerns such as income volatility and disparate benefits. By drawing insights from diverse global case studies, this study emphasizes the compelling need for platform companies to shoulder their social welfare responsibilities, ensuring fair treatment and security for their workers. Moreover, it critically examines the profound influence of socio-cultural factors and educational awareness on the platform economy, shedding light on the complexities of this emerging labor landscape. Advocating for a harmonious equilibrium between flexibility and security, this paper calls for substantial legal reforms and innovative policy initiatives that can adapt to the evolving nature of work in the digital age. Finally, it anticipates forthcoming trends in the digital economy and platform labor relations, underscoring the significance of proactive adaptation to foster equitable and inclusive employment practices.

Keywords: platform employment, labor protections, social welfare, legal reforms, digital economy

Procedia PDF Downloads 63
25292 Effects of Exhibition Firms' Resource Investment Behavior on Their Booth Staffs' Role Perceptions, Goal Acceptance and Work Effort during the Exhibition Period

Authors: Po-Chien Li

Abstract:

Although the extant literature hosts a wide range of knowledge about trade shows, this knowledge base deserves to be further expanded and extended because many issues remain unclear or overlooked. One area that needs research attention is the behavior and performance of booth workers at the exhibition site. Booth staff play many key roles in interacting with booth visitors, and their exhibiting-related attitudes and motivations may have significant consequences for a firm's exhibition results. However, to date, little research, if any, has studied how booth workers are affected and behave in the context of a trade fair. The primary purpose of the current study is to develop and test a research model, derived from role theory and the resource-based view, that depicts the effects of a firm's pre-exhibition resource investment behavior on booth staff's role perceptions and work behavior during the exhibition period. The author collected data with two survey questionnaires at two trade shows in 2016. One questionnaire was given to the booth head of an exhibiting company, asking about the firm's resource commitment behavior prior to the exhibition period. The other questionnaire was provided to a booth worker of the same firm, asking the individual staff member to report his or her own role perceptions, degree of exhibition goal acceptance, and level of work effort during the exhibition period. The study utilized descriptive statistics, exploratory factor analysis, reliability analysis, and regression analysis. The results of a set of regression analyses show that a firm's pre-exhibition resource investment behavior has significant effects on a booth staff member's exhibiting perceptions and attitudes. Specifically, an exhibitor's resource investment behavior affects booth staff's role clarity and role conflict. In addition, a booth worker's role clarity is related to the degree of exhibition goal acceptance, but his or her role conflict is not. Finally, a booth worker's exhibiting effort is significantly related to the individual's role clarity, role conflict, and goal acceptance. The major contribution of the current research is that it offers insight into, and early evidence on, the links between an exhibiting firm's resource commitment behavior and the work perceptions and attitudes of booth staff during the exhibition period. The results can benefit the extant literature on exhibition marketing.

Keywords: exhibition resource investment, role perceptions, goal acceptance, work effort

Procedia PDF Downloads 213
25291 Dyeing of Wool and Silk with Soxhlet Water Extracted Natural Dye from Dacryodes macrophylla Fruits and Study of Antimicrobial Properties of Extract

Authors: Alvine Sandrine Ndinchout, D. P. Chattopadhyay, Moundipa Fewou Paul, Nyegue Maximilienne Ascension, Varinder Kaur, Sukhraj Kaur, B. H. Patel

Abstract:

Dacryodes macrophylla is a species of the Burseraceae family that is widespread in Cameroon, Equatorial Guinea, and Gabon. The only part of D. macrophylla known to be used is the pulp contained in the fruit. This very juicy pulp is consumed directly and used in making juices. During consumption, the fruit leaves a dark blackish colour on fingers and garments. This observation suggests that D. macrophylla fruits could be a good source of natural dye, probably with good fastness properties on textile materials. To the best of our knowledge, however, D. macrophylla has not yet been investigated as a potential source of natural dye. The natural dye was extracted using water as the solvent by the Soxhlet extraction method. The extracted colourant was characterized by spectroscopic studies such as UV/Visible spectroscopy and further tested for antimicrobial activity against gram-negative (Vibrio cholerae, Escherichia coli, Salmonella enterica serotype Typhi, Shigella flexneri) and gram-positive (Listeria monocytogenes, Staphylococcus aureus) bacteria. It was observed that the water extract of D. macrophylla showed antimicrobial activity against S. enterica. The fastness properties of the dyed fabrics were fair to good. Taken together, these results indicate that D. macrophylla can be used as a natural dye not only in textiles but also in other domains such as food colouring.

Keywords: antimicrobial activity, natural dye, silk, wash fastness, wool

Procedia PDF Downloads 173
25290 Recommender System Based on Mining Graph Databases for Data-Intensive Applications

Authors: Mostafa Gamal, Hoda K. Mohamed, Islam El-Maddah, Ali Hamdi

Abstract:

In recent years, many digital documents have been created on the web due to the rapid growth of 'social applications' communities, or 'data-intensive applications'. The evolution of online multimedia data poses new challenges in storing and querying large amounts of data for online recommender systems. Graph data models have been shown to be more efficient than relational data models for processing complex data. This paper explains the key differences between graph and relational databases, their strengths and weaknesses, and why graph databases are a suitable technology for building a real-time recommendation system. The paper also discusses several similarity metrics that can be used to compute a similarity score for pairs of nodes based on their neighbourhoods or their properties. Finally, the paper explores how NLP strategies offer a basis for improving the accuracy and coverage of real-time recommendations by extracting information from stored unstructured knowledge, which makes up the bulk of the world's data, to enrich the graph database. As the size and number of data items increase rapidly, the proposed system should meet current and future needs.
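One common neighbourhood-based similarity metric of the kind mentioned above is Jaccard similarity. The following is a minimal sketch on a toy adjacency-list graph; the node and item names are invented for illustration and do not come from the paper:

```python
# Jaccard similarity of two nodes' neighbourhoods: |A ∩ B| / |A ∪ B|.
def jaccard(graph, a, b):
    na, nb = set(graph[a]), set(graph[b])
    union = na | nb
    return len(na & nb) / len(union) if union else 0.0

# Toy user -> item adjacency list (names are hypothetical).
graph = {
    "alice": ["item1", "item2", "item3"],
    "bob":   ["item2", "item3", "item4"],
    "carol": ["item5"],
}

score = jaccard(graph, "alice", "bob")  # 2 shared of 4 total items = 0.5
```

In a recommender, a high score between two users suggests recommending to one the items liked only by the other.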

Keywords: graph databases, NLP, recommendation systems, similarity metrics

Procedia PDF Downloads 102
25289 Digital Revolution a Veritable Infrastructure for Technological Development

Authors: Osakwe Jude Odiakaosa

Abstract:

Today’s digital society is characterized by e-education or e-learning, e-commerce, and so on. All of these have been propelled by the digital revolution. Digital technologies such as computer technology, the Global Positioning System (GPS), and Geographic Information Systems (GIS) have had a tremendous impact on the field of technology. This development has positively affected the scope, methods, and speed of data acquisition, data management, and the rate of delivery of the results (maps and other map products) of data processing. This paper addresses the impact of the revolution brought about by digital technology.

Keywords: digital revolution, internet, technology, data management

Procedia PDF Downloads 446
25288 Language on Skin Whitening Products in Pakistan Promotes Unfair Beauty Standards: A Critical Discourse Analysis

Authors: Azeem Alphonce

Abstract:

In Pakistan, there is a variety of skin tones and colors across all provinces. However, a fair complexion is one of the standards of beauty among females in Pakistan, which creates insecurities in dark-complexioned females. This research is a critical discourse analysis of the language used on beauty products for females in Pakistan. The purpose was to analyze the language used on female beauty products using Van Dijk's three-stage socio-cognitive model to understand what message is received from the few words written and repeated across the packaging of various facial products, why such language is used, and what its wider socio-cognitive effects are. The criterion for the selection of beauty products was skin whitening terminology and the language used on these products. The results showed that over 57 per cent of products utilized skin-whitening terms. The adjectives written on the packaging indicate that fairer skin is the ultimate beauty goal for females. The analysis explored how the language reinforces unfair beauty standards and perpetuates colorism. It was concluded that female beauty products utilize discriminatory discourse by marginalizing individuals of darker skin tones. Fairer skin is promoted, whereas darker skin is referred to as a problem, flaw, or imperfection. Socially shared mental models seem to have caused beauty companies to exploit and promote perceptions of colorism in society. Therefore, such discourse should be prevented, and beauty companies should utilize their discourse to promote acceptance of various skin tones.

Keywords: language, skin whitening products, beauty standards, social mental models

Procedia PDF Downloads 66
25287 BigCrypt: A Probable Approach of Big Data Encryption to Protect Personal and Business Privacy

Authors: Abdullah Al Mamun, Talal Alkharobi

Abstract:

As data sizes grow, people have become more accustomed to storing large amounts of secret information in cloud storage, and companies routinely need to transfer massive business files from one end to another. Privacy is lost if such data is transmitted as-is, and repeating this scenario without securing the communication mechanism, i.e., proper encryption, compounds the risk. Although asymmetric-key encryption solves the main problem of symmetric-key encryption, it can only encrypt a limited amount of data, which makes it inapplicable to large data encryption. In this paper we propose a probable approach, in the style of Pretty Good Privacy, to encrypt big data using both symmetric and asymmetric keys. Our goal is to encrypt huge collections of information and transmit them through a secure communication channel to protect business and personal privacy. To justify our method, an experimental dataset from three different platforms is provided. We show that our approach works efficiently and reliably for massive volumes of varied data.
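The hybrid (PGP-style) pattern described above, bulk data under a fresh symmetric session key and only the short key under the asymmetric cipher, can be sketched as follows. This is a toy illustration, not the authors' implementation and not secure: it uses textbook RSA with tiny primes and a SHA-256 counter keystream purely to show the structure.

```python
import hashlib
import secrets

# Tiny textbook RSA key pair (p=61, q=53): modulus N, public E, private D.
# Real systems use 2048+ bit keys; these values are for illustration only.
N, E, D = 3233, 17, 2753

def keystream_xor(key: bytes, data: bytes) -> bytes:
    """Symmetric step: XOR with a SHA-256 counter keystream (toy cipher)."""
    stream = bytearray()
    counter = 0
    while len(stream) < len(data):
        stream.extend(hashlib.sha256(key + counter.to_bytes(8, "big")).digest())
        counter += 1
    return bytes(b ^ k for b, k in zip(data, stream))

def encrypt_hybrid(plaintext: bytes):
    session_key = secrets.token_bytes(16)           # fresh symmetric key
    ciphertext = keystream_xor(session_key, plaintext)
    wrapped = [pow(b, E, N) for b in session_key]   # RSA-wrap the key, byte-wise
    return wrapped, ciphertext

def decrypt_hybrid(wrapped, ciphertext):
    session_key = bytes(pow(c, D, N) for c in wrapped)
    return keystream_xor(session_key, ciphertext)   # XOR is its own inverse

msg = b"large business file contents"
wrapped_key, ct = encrypt_hybrid(msg)
recovered = decrypt_hybrid(wrapped_key, ct)
```

The asymmetric cipher only ever sees 16 key bytes, however large the payload, which is exactly why the hybrid scheme scales to big data while plain asymmetric encryption does not.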

Keywords: big data, cloud computing, cryptography, hadoop, public key

Procedia PDF Downloads 315
25286 Implementation of Big Data Concepts Led by the Business Pressures

Authors: Snezana Savoska, Blagoj Ristevski, Violeta Manevska, Zlatko Savoski, Ilija Jolevski

Abstract:

Big data has been widely accepted by pharmaceutical companies as a result of business demands created through legal pressure. Pharmaceutical companies face many legal and standards-related demands and have to adapt their procedures to the legislation. To cope with these demands, they have to standardize the usage of current information technology and use the latest software tools. This paper highlights some important aspects of the experience of implementing big data projects in a Macedonian pharmaceutical company. These projects improved the company's business processes with the help of new software tools selected to comply with legal and business demands. The company uses IT as a strategic tool to obtain competitive advantage on the market and to reengineer its processes towards the new Internet economy and its quality demands. The company is required to manage vast amounts of structured as well as unstructured data. For these reasons, it implements projects around emerging and appropriate software tools that deal with the big data concepts accepted in the company.

Keywords: big data, unstructured data, SAP ERP, documentum

Procedia PDF Downloads 265
25285 Saving Energy at a Wastewater Treatment Plant through Electrical and Production Data Analysis

Authors: Adriano Araujo Carvalho, Arturo Alatrista Corrales

Abstract:

This paper shows how electrical energy consumption and production data analysis were used to find opportunities to save energy at the Taboada wastewater treatment plant in Callao, Peru. To access the data, independent data networks were used for both electrical and process instruments, and the data were analyzed under an ISO 50001 energy audit that defined Energy Performance Indicators for each process, following the step-by-step guide presented in this text. Through this methodology and data mining techniques applied to information gathered from electronic multimeters (conveniently placed on substation switchboards connected to a cloud network), it was possible to identify the performance of each process thoroughly and thus reveal saving opportunities that were previously hidden. The data analysis brought both cost and energy reductions, allowing the plant to save significant resources and to be certified under ISO 50001.
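A per-process Energy Performance Indicator of the kind an ISO 50001 audit tracks can be sketched as energy consumed per volume of wastewater treated. The process names, consumption figures, and baselines below are hypothetical, chosen only to show the comparison step:

```python
# Energy Performance Indicator: specific energy consumption per process.
def enpi(kwh: float, m3_treated: float) -> float:
    """kWh per cubic metre treated, for one process over one period."""
    return kwh / m3_treated

# (kWh consumed, m3 treated) for one month, per process (invented figures).
processes = {"pretreatment": (1800.0, 90000.0), "pumping": (5200.0, 90000.0)}
baseline = {"pretreatment": 0.025, "pumping": 0.050}  # target kWh/m3

# Flag processes whose specific consumption exceeds the baseline.
report = {name: {"enpi": enpi(kwh, m3), "over_baseline": enpi(kwh, m3) > baseline[name]}
          for name, (kwh, m3) in processes.items()}
```

Processes flagged as over baseline are the candidates for the kind of hidden saving opportunities the audit is meant to reveal.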

Keywords: energy and production data analysis, energy management, ISO 50001, wastewater treatment plant energy analysis

Procedia PDF Downloads 191
25284 Data Clustering in Wireless Sensor Network Implemented on Self-Organization Feature Map (SOFM) Neural Network

Authors: Krishan Kumar, Mohit Mittal, Pramod Kumar

Abstract:

Wireless sensor networks are among the most promising communication networks for monitoring remote environmental areas. In such a network, all the sensor nodes communicate with each other via radio signals. The sensor nodes have capabilities for sensing, data storage, and processing, and they collect information and relay it through neighboring nodes to a particular node; the data collection and processing are done by data aggregation techniques. For data aggregation in the sensor network, a clustering technique is implemented by means of a self-organizing feature map (SOFM) neural network. Some of the sensor nodes are selected as cluster-head nodes. Information is aggregated at the cluster-head nodes from non-cluster-head nodes and then transferred to the base station (or sink node). The aim of this paper is to manage the huge amount of data with the help of the SOM neural network: clustered data are selected for transfer to the base station instead of the entire information aggregated at the cluster-head nodes. This reduces the battery consumption involved in handling huge amounts of data, and the network lifetime is enhanced to a great extent.
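The clustering step can be sketched with a minimal one-dimensional self-organizing map over two-dimensional sensor readings. This is an illustrative toy under stated simplifications, not the authors' protocol: there is no neighbourhood kernel (so it reduces to online k-means), no radio or energy model, and the readings are invented.

```python
# Minimal 1-D SOM: each map unit plays the role of a cluster head.
def best_matching_unit(weights, x):
    # index of the weight vector closest (squared distance) to input x
    return min(range(len(weights)),
               key=lambda i: sum((w - v) ** 2 for w, v in zip(weights[i], x)))

def train_som(data, n_units=2, epochs=50, lr=0.5):
    # initialize unit weights from spread-out samples of the data
    weights = [list(data[i * (len(data) - 1) // max(n_units - 1, 1)])
               for i in range(n_units)]
    for epoch in range(epochs):
        rate = lr * (1 - epoch / epochs)    # decaying learning rate
        for x in data:
            bmu = best_matching_unit(weights, x)
            for j in range(len(x)):         # pull the winning unit toward x
                weights[bmu][j] += rate * (x[j] - weights[bmu][j])
    return weights

# Two well-separated groups of (temperature, humidity) readings.
readings = [(20.1, 40.0), (20.5, 41.2), (19.8, 39.5),
            (35.0, 70.1), (34.6, 69.5), (35.4, 71.0)]
units = train_som(readings)
labels = [best_matching_unit(units, r) for r in readings]
```

After training, only the trained unit weights (one summary vector per cluster) need to travel to the base station, which is the battery saving the abstract describes.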

Keywords: artificial neural network, data clustering, self organization feature map, wireless sensor network

Procedia PDF Downloads 513
25283 Review and Comparison of Associative Classification Data Mining Approaches

Authors: Suzan Wedyan

Abstract:

Data mining is one of the main phases in Knowledge Discovery in Databases (KDD) and is responsible for finding hidden and useful knowledge in databases. There are many different tasks in data mining, including regression, pattern recognition, clustering, classification, and association rule mining. In recent years, a promising data mining approach called associative classification (AC) has been proposed; AC integrates classification and association rule discovery to build classification models (classifiers). This paper surveys and critically compares several AC algorithms with reference to the different procedures used in each algorithm, such as rule learning, rule sorting, rule pruning, classifier building, and class allocation for test cases.
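Two of the compared procedures, rule sorting and class allocation, can be sketched in a CBA-style way: rules are ordered by confidence with ties broken by support, the first rule whose antecedent matches a test case assigns its class, and a default class covers unmatched cases. The rules and attribute values below are invented for illustration.

```python
# Each rule: (antecedent itemset, predicted class, confidence, support).
rules = [
    ({"outlook=sunny", "humidity=high"}, "no",  0.90, 0.20),
    ({"outlook=overcast"},               "yes", 0.95, 0.25),
    ({"windy=true"},                     "no",  0.70, 0.30),
]

def sort_rules(rules):
    # higher confidence first; ties broken by higher support
    return sorted(rules, key=lambda r: (-r[2], -r[3]))

def classify(case, rules, default="yes"):
    for antecedent, cls, _conf, _sup in sort_rules(rules):
        if antecedent <= case:   # rule fires when its antecedent ⊆ case
            return cls
    return default               # class allocation fallback

label = classify({"outlook=sunny", "humidity=high", "windy=false"}, rules)
```

Here the overcast rule ranks first but does not fire on this case, so the sunny/high-humidity rule assigns the class.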

Keywords: associative classification, classification, data mining, learning, rule ranking, rule pruning, prediction

Procedia PDF Downloads 532
25282 Hierarchical Checkpoint Protocol in Data Grids

Authors: Rahma Souli-Jbali, Minyar Sassi Hidri, Rahma Ben Ayed

Abstract:

Grids of computing nodes have emerged as a representative means of connecting distributed computers and resources scattered all over the world for the purposes of computing and distributed storage. Since fault tolerance becomes complex due to resource availability in a decentralized grid environment, checkpointing can be used in connection with replication in data grids. The objective of our work is to present fault tolerance in data grids with a data replication-driven model based on clustering. The performance of the protocol is evaluated with the OMNeT++ simulator. The computational results show the efficiency of our protocol in terms of recovery time and the number of processes involved in rollbacks.

Keywords: data grids, fault tolerance, clustering, Chandy-Lamport

Procedia PDF Downloads 334
25281 An Observation of the Information Technology Research and Development Based on Article Data Mining: A Survey Study on Science Direct

Authors: Muhammet Dursun Kaya, Hasan Asil

Abstract:

One of the most important factors in research and development is deep insight into the evolution of scientific development. State-of-the-art tools and instruments can considerably assist researchers, and many organizations worldwide have become aware of the advantages of data mining for acquiring the knowledge hidden in unstructured data. This paper reviews the articles on information technology published in the past five years with the aid of data mining. A clustering approach was used to study these articles, and the results revealed that three topics, namely health, innovation, and information systems, have captured the special attention of researchers.

Keywords: information technology, data mining, scientific development, clustering

Procedia PDF Downloads 274
25280 Security in Resource Constraints: Network Energy Efficient Encryption

Authors: Mona Almansoori, Ahmed Mustafa, Ahmad Elshamy

Abstract:

Wireless nodes in a sensor network gather and process critical information; the information flowing through such a network is essential for decision making and data processing, and its integrity is one of the most critical factors in wireless security. This paper presents a mechanism to securely transmit data over a chain of sensor nodes without compromising the throughput of the network, while making efficient use of the battery resources available at each sensor node.

Keywords: hybrid protocol, data integrity, lightweight encryption, neighbor based key sharing, sensor node data processing, Z-MAC

Procedia PDF Downloads 142
25279 Data Mining Techniques for Anti-Money Laundering

Authors: M. Sai Veerendra

Abstract:

Today, money laundering (ML) poses a serious threat not only to financial institutions but also to nations. This criminal activity is becoming more and more sophisticated and seems to have moved beyond the cliché of drug trafficking to financing terrorism, not forgetting personal gain. Most financial institutions internationally have been implementing anti-money laundering (AML) solutions to fight investment fraud. However, traditional investigative techniques consume numerous man-hours. Recently, data mining approaches have been developed and are considered well-suited techniques for detecting ML activities. Within the scope of a collaborative project on developing a new data mining solution for AML units in an international investment bank in Ireland, we survey recent data mining approaches for AML. In this paper, we present not only these approaches but also an overview of the important factors in building data mining solutions for AML activities.

Keywords: data mining, clustering, money laundering, anti-money laundering solutions

Procedia PDF Downloads 531
25278 Emulation Model in Architectural Education

Authors: Ö. Şenyiğit, A. Çolak

Abstract:

In architectural education, it is of great importance for students to know the parameters through which they can conduct their designs and make those designs effective. Therefore, an empirical application study was carried out on the design activity using the emulation model to support the designs and design approaches of architecture students. During the investigation period, studies were done on the basic design elements and principles of the fall semester, and the emulation model, one of the design methods that constitutes the subject of the study, was structured in three phases: recognition, interpretation, and application. As a result of the study, it was observed that when students were given a key method during the design process, their awareness increased and their designs improved as well.

Keywords: basic design, design education, design methods, emulation

Procedia PDF Downloads 231
25277 The Dark Side of the Fight against Organised Crime

Authors: Ana M. Prieto del Pino

Abstract:

As is well known, UN Convention against Illicit Traffic in Narcotic Drugs and Psychotropic Substances (1988) was a landmark regarding the seizure of proceeds of crime. Depriving criminals of the profits from their activity became a priority at an international level in the fight against organised crime. Enabling confiscation of proceeds of illicit traffic in narcotic drugs and psychotropic substances, criminalising money laundering and confiscating the proceeds thereof are the three measures taken in order to achieve that purpose. The beginning of 21st century brought the declaration of war on corruption and on the illicit enjoyment of the profits thereof onto the international scene. According to the UN Convention against Transnational Organised Crime (2000), States Parties should adopt the necessary measures to enable the confiscation of proceeds of crime derived from offences (or property of equivalent value) and property, equipment and other instrumentalities used in offences covered by that Convention. The UN Convention against Corruption (2003) states asset recovery explicitly as a fundamental principle and sets forth measures aiming at the direct recovery of property through international cooperation in confiscation. Furthermore, European legislation has made many significant strides forward in less than twenty years concerning money laundering, confiscation, and asset recovery. Crime does not pay, let there be no doubt about it. Nevertheless, we must be very careful not to sing out of tune with individual rights and legal guarantees. On the one hand, innocent individuals and businesses must be protected, since they should not pay for the guilty ones’ faults. On the other hand, the rule of law must be preserved and not be tossed aside regarding those who have carried out criminal activities. 
An in-depth analysis of judicial decisions on money laundering and the confiscation of proceeds of crime, issued by European national courts and by the European Court of Human Rights in the last decade, has been carried out from the perspective of human rights, legal guarantees, and basic principles of criminal law. The study has revealed violations of the right to property and of the proportionality principle, as well as the infringement of basic principles of states' domestic substantive and procedural criminal law systems. The most relevant ones concern the punishment of money laundering committed through negligence, non-conviction-based confiscation, and a too-far-reaching interpretation of the notion of 'proceeds of crime'. Almost everything in life has a bright and a dark side; confiscation of criminal proceeds and asset recovery are no exception to this rule.

Keywords: confiscation, human rights, money laundering, organized crime

Procedia PDF Downloads 138
25276 Technological Affordances: Guidelines for E-Learning Design

Authors: Clement Chimezie Aladi, Itamar Shabtai

Abstract:

A review of the literature in the last few years reveals that little attention has been paid to technological affordances in e-learning designs. However, affordances are key to engaging students and enabling teachers to actualize learning goals. E-learning systems (software and artifacts) need to be designed in such a way that the features facilitate perceptions of the affordances with minimal cognition. This study aimed to fill this gap in the literature and encourage further research in this area. It provides guidelines for facilitating the perception of affordances in e-learning design and advances Technology Affordance and Constraints Theory by incorporating the affordance-based design process, the principles of multimedia learning, e-learning design philosophy, and emotional and cognitive affordances.

Keywords: e-learning, technology affordances, affordance-based design, e-learning design

Procedia PDF Downloads 56
25275 Experimental Investigation of Heat Pipe with Annular Fins under Natural Convection at Different Inclinations

Authors: Gangacharyulu Dasaroju, Sumeet Sharma, Sanjay Singh

Abstract:

The heat pipe is characterised as a superconductor of heat because of its excellent heat removal ability. The operation of many engineering systems results in the generation of heat, which may cause overheating problems and lead to system failure. To overcome this problem and to achieve the desired rate of heat dissipation, there is a need to study the performance of a heat pipe with annular fins under free convection at different inclinations. This study demonstrates the effect of different mass flow rates of hot fluid into the evaporator section on the condenser-side heat transfer coefficient, with annular fins, under natural convection at different inclinations. The annular fins used in the experimental work have a fin length of 10 mm, a fin thickness of 1 mm, and a fin spacing of 6 mm. The main aim of the present study is to discover at which inclination angles the maximum heat transfer coefficient is achieved. The heat transfer coefficient on the external surface of the heat pipe condenser section is determined experimentally and then predicted by empirical correlations. The experimental results and the Churchill and Chu relation for the laminar range are in fair agreement, with no more than 22% deviation. A maximum heat transfer coefficient of 31.2 W/(m²·K) is observed at a 25° tilt angle, and a minimum condenser heat transfer coefficient of 26.4 W/(m²·K) at a 45° tilt angle and a 200 ml/min mass flow rate. The inclination angle also affects the thermal performance of the heat pipe: beyond a 25° inclination, the heat transport rate starts to decrease.
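As an illustration of the kind of empirical correlation used for the comparison, one widely quoted form of the Churchill and Chu relation for natural convection from a horizontal cylinder can be evaluated numerically. This is a generic sketch, not the authors' exact calculation; the Rayleigh number, Prandtl number, and air properties below are rough illustrative values.

```python
# Churchill-Chu (all-Ra form) for a horizontal cylinder:
# Nu = { 0.60 + 0.387 Ra^(1/6) / [1 + (0.559/Pr)^(9/16)]^(8/27) }^2
# with the heat transfer coefficient h = Nu * k / D.
def churchill_chu_h(Ra, Pr, k, D):
    nu = (0.60 + 0.387 * Ra ** (1 / 6)
          / (1 + (0.559 / Pr) ** (9 / 16)) ** (8 / 27)) ** 2
    return nu * k / D

# Rough figures for air around a 20 mm cylinder (illustrative only):
h = churchill_chu_h(Ra=1.0e6, Pr=0.71, k=0.026, D=0.02)  # W/(m^2 K)
```

The resulting coefficient falls in the tens of W/(m²·K), the same order as the experimental values reported above, which is why such correlations serve as a sanity check on the measurements.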

Keywords: heat pipe, annular fins, natural convection, condenser heat transfer coefficient, tilt angle

Procedia PDF Downloads 149
25274 Development of New Technology Evaluation Model by Using Patent Information and Customers' Review Data

Authors: Kisik Song, Kyuwoong Kim, Sungjoo Lee

Abstract:

Many global firms and corporations identify new technologies and opportunities by detecting vacant technology areas through patent analysis. However, previous studies have failed to focus on technologies that promise continuous growth in their industrial fields, and most studies that derive new technology opportunities do not test their practical effectiveness. Because previous studies depended on expert judgment, evaluating new technologies based on patent analysis has been costly and time-consuming. This research therefore suggests a quantitative and systematic approach to technology evaluation indicators that uses both patent data and data from customer communities. The first step involves collecting these two types of data, which are then used to construct evaluation indicators and to apply those indicators to the evaluation of new technologies. This type of data mining enables a new method of technology evaluation and a better prediction of how new technologies will be adopted.
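The abstract does not specify how the two data sources are merged; as a purely hypothetical sketch, a patent-based indicator (e.g., citation growth) and a review-based indicator (e.g., mean sentiment) could be min-max normalized and combined into one composite score. The indicator values and weight below are illustrative assumptions:

```python
def minmax(values):
    """Scale a list of raw indicator values to the [0, 1] range."""
    lo, hi = min(values), max(values)
    return [(v - lo) / (hi - lo) if hi > lo else 0.0 for v in values]

def composite_score(patent_indicator, review_indicator, w_patent=0.6):
    """Weighted combination of normalized patent- and review-based indicators."""
    p, r = minmax(patent_indicator), minmax(review_indicator)
    return [w_patent * pi + (1 - w_patent) * ri for pi, ri in zip(p, r)]

# Three candidate technologies: patent citation growth and mean review sentiment
scores = composite_score([12, 45, 30], [0.2, 0.8, 0.5])
best = scores.index(max(scores))  # index of the technology ranked most promising
```

Ranking by such a composite is one simple way to avoid case-by-case expert judgment; the actual indicators in the paper may differ.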

Keywords: data mining, evaluating new technology, technology opportunity, patent analysis

Procedia PDF Downloads 369
25273 Explaining Irregularity in Music by Entropy and Information Content

Authors: Lorena Mihelac, Janez Povh

Abstract:

In 2017, we conducted a research study using data consisting of 160 musical excerpts from different musical styles to analyze the impact of the entropy of the harmony on the acceptability of music. In measuring the entropy of harmony, we were interested in unigrams (individual chords in the harmonic progression) and bigrams (the connection of two adjacent chords). In that study, 53 of the 160 musical excerpts were evaluated by participants as very complex, although the entropy of the harmonic progression (unigrams and bigrams) was calculated as low. We explained this by particularities of the chord progression, which affect the listener's feeling of complexity and acceptability. We evaluated the same data twice more, with new participants in 2018 and with the same participants for a third time in 2019. These three evaluations showed that the same 53 musical excerpts, found difficult and complex in the 2017 study, again elicited a strong feeling of complexity. We proposed that the content of these musical excerpts, defined as “irregular,” does not meet the listener's expectancy or basic perceptual principles, creating a heightened feeling of difficulty and complexity. Because the “irregularities” in these 53 musical excerpts seem to be perceived by the participants without their being aware of it, affecting pleasantness and the feeling of complexity, they were termed “subliminal irregularities” and the 53 musical excerpts “irregular.” In our most recent study (2019) of the same data, we proposed a new measure of the complexity of harmony, “regularity,” based on the irregularities in the harmonic progression and other plausible particularities of the musical structure found in previous studies. In that study, we also proposed a list of ten particularities that we assumed affect the participants' perception of complexity in harmony.
These ten particularities are tested in this paper by extending the analysis of the 53 irregular musical excerpts from harmony to melody. In examining melody, we used the computational model Information Dynamics of Music (IDyOM) and two information-theoretic measures: entropy, the uncertainty of the prediction before the next event is heard, and information content, the unexpectedness of an event in a sequence. To describe the features of melody in these musical examples, we used four different viewpoints: pitch, interval, duration, and scale degree. The results show that the texture of the melody (e.g., multiple voices, homorhythmic structure) and the structure of the melody (e.g., large interval leaps, syncopated rhythm, implied harmony in compound melodies) in these musical excerpts affect the participants' perception of complexity. High information content values were found in compound melodies, in which implied harmonies seem to have suggested additional harmonies, affecting the participants' perception of the chord progression in harmony by creating a sense of an ambiguous musical structure.
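The unigram and bigram entropies described above are standard Shannon entropies over chord counts. This minimal sketch computes both on a toy chord progression (illustrative only, not the paper's dataset):

```python
import math
from collections import Counter

def shannon_entropy(events):
    """Shannon entropy (in bits) of a sequence of discrete events."""
    counts = Counter(events)
    total = sum(counts.values())
    return -sum((c / total) * math.log2(c / total) for c in counts.values())

# Toy harmonic progression in Roman-numeral notation (illustrative)
chords = ["I", "IV", "V", "I"]
unigram_h = shannon_entropy(chords)                        # individual chords
bigram_h = shannon_entropy(list(zip(chords, chords[1:])))  # adjacent chord pairs
```

A progression that cycles among few chords yields low entropy; the paper's point is that low values of these measures can still coincide with a strong perceived complexity.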

Keywords: entropy and information content, harmony, subliminal (ir)regularity, IDyOM

Procedia PDF Downloads 128
25272 Anomaly Detection Based on System Log Data

Authors: M. Kamel, A. Hoayek, M. Batton-Hubert

Abstract:

With the increase of network virtualization and the disparity of vendors, the continuous monitoring and detection of anomalies cannot rely on static rules. An advanced analytical methodology is needed to discriminate between ordinary events and unusual anomalies. In this paper, we focus on log data (textual data), which is a crucial source of information about network performance. We then introduce an algorithm, used as a pipeline, that helps with the pretreatment of such data, groups it into patterns, and dynamically labels each pattern as anomalous or not. Such tools give users and experts continuous, real-time log-monitoring capability to detect anomalies and failures in the underlying system that can affect performance. An application to real-world data illustrates the algorithm.
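The pretreat, group, and label steps described above can be sketched as follows. The masking regexes and the rarity threshold are illustrative assumptions, not details from the paper, which labels patterns dynamically rather than by a fixed cutoff:

```python
import re
from collections import Counter

def to_pattern(line):
    """Pretreatment: mask variable fields (hex IDs, numbers) to get a log template."""
    line = re.sub(r"0x[0-9a-fA-F]+", "<HEX>", line)
    return re.sub(r"\d+", "<NUM>", line)

def label_anomalies(log_lines, rarity_threshold=0.2):
    """Group lines into patterns; label a pattern anomalous if it is rare."""
    patterns = [to_pattern(line) for line in log_lines]
    counts = Counter(patterns)
    total = len(patterns)
    return {p: counts[p] / total < rarity_threshold for p in counts}

# Nine routine connection lines and one unusual event (toy data)
logs = ["conn from 10.0.0.1 port 443"] * 9 + ["kernel panic at 0xdeadbeef"]
labels = label_anomalies(logs)
```

Masking variable tokens first is what lets textually different lines collapse into the same pattern, so that frequency-based scoring becomes meaningful.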

Keywords: logs, anomaly detection, ML, scoring, NLP

Procedia PDF Downloads 91
25271 Cantilever Secant Pile Constructed in Sand: Numerical Comparative Study and Design Aids – Part II

Authors: Khaled R. Khater

Abstract:

All civil engineering projects include excavation work and therefore need retaining structures. Cantilever secant pile walls are an economical supporting system for excavation depths up to 5.0 m. The parameters controlling wall tip displacement are the focus of this paper, so two analysis techniques have been investigated and compared: the conventional method and finite element analysis. Accordingly, two computer programs have been used, an Excel sheet and Plaxis-2D. Two soil models have been used throughout this study: the Mohr-Coulomb model and the isotropic hardening model. Two soil densities have been considered, i.e., loose and dense sand. Ten wall rigidities have been analyzed, covering the range from perfectly flexible to completely rigid walls, and three excavation depths, i.e., 3.0 m, 4.0 m, and 5.0 m, were tested to cover the practical range of secant piles. This work offers practical guidance on secant piles to assist designers and specification committees, and finite element analysis with the isotropic hardening model is recommended as the arbiter when two designs conflict. A rational procedure using empirical equations has been suggested to upgrade the conventional method to predict the wall tip displacement ‘δ’, together with a reasonable limitation of ‘δ’ as a function of the excavation depth ‘h’. It has also been found that, beyond a certain penetration depth, further increases do not improve the wall tip displacement; they merely produce an over-designed and uneconomic wall.
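The "conventional method" for a cantilever wall is commonly a limit-equilibrium calculation; as a hedged textbook sketch (not the paper's Excel procedure), the embedment depth d can be estimated by balancing the moments of the active and passive Rankine pressure wedges about the wall tip:

```python
import math

def embedment_depth(h, phi_deg, fos=1.0):
    """Solve (Kp/fos) * d^3 = Ka * (h + d)^3 for d by bisection.

    h       : excavation depth, m
    phi_deg : sand friction angle, degrees
    fos     : factor of safety applied to passive resistance
    Simplified gross-pressure moment balance about the wall tip (dry sand,
    no wall friction); a textbook idealization, not a design calculation.
    """
    phi = math.radians(phi_deg)
    ka = math.tan(math.pi / 4 - phi / 2) ** 2  # active earth pressure coeff.
    kp = math.tan(math.pi / 4 + phi / 2) ** 2  # passive earth pressure coeff.
    f = lambda d: (kp / fos) * d**3 - ka * (h + d) ** 3
    lo, hi = 1e-6, 20.0 * h
    for _ in range(100):
        mid = (lo + hi) / 2
        if f(mid) < 0:
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2

d = embedment_depth(h=5.0, phi_deg=30.0)  # embedment for a 5 m cantilever cut
```

The paper's observation that extra penetration beyond a certain depth no longer helps is consistent with such a balance: once equilibrium is satisfied, additional embedment adds cost without reducing the tip displacement.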

Keywords: design aids, numerical analysis, secant pile, wall tip displacement

Procedia PDF Downloads 186
25270 EnumTree: An Enumerative Biclustering Algorithm for DNA Microarray Data

Authors: Haifa Ben Saber, Mourad Elloumi

Abstract:

In a number of domains, such as DNA microarray data analysis, we need to cluster the rows (genes) and columns (conditions) of a data matrix simultaneously, to identify groups of rows that behave consistently over a group of columns. This kind of clustering is called biclustering. Biclustering algorithms are extensively used in DNA microarray data analysis, and more effective biclustering algorithms are highly desirable. We introduce a new algorithm, called Enumerative Tree (EnumTree), for the biclustering of binary microarray data. EnumTree adopts the approach of enumerating biclusters and extracts all biclusters of consistently good quality. Its main idea is the construction of a new tree structure that adequately represents the different biclusters discovered during the enumeration process, following the strategy of discovering all biclusters at a time. The performance of the proposed algorithm is assessed using both synthetic and real DNA microarray data; our algorithm outperforms other biclustering algorithms for binary microarray data, extracting biclusters with different numbers of rows. Moreover, we test the biological significance using a gene annotation web tool and show that the proposed method is able to produce biologically relevant biclusters.
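The abstract does not detail EnumTree's tree structure; as an illustrative baseline only, enumerating all-ones biclusters in a small binary matrix can be sketched by brute force over column subsets (EnumTree itself organizes this search in a tree to avoid redundant work):

```python
from itertools import combinations

def all_ones_biclusters(matrix, min_rows=2, min_cols=2):
    """Enumerate (rows, cols) pairs whose binary submatrix is all ones.

    Brute force over column subsets; only feasible for small matrices,
    shown here to illustrate what "enumerating biclusters" means.
    """
    n_cols = len(matrix[0])
    found = []
    for k in range(min_cols, n_cols + 1):
        for cols in combinations(range(n_cols), k):
            rows = [i for i, row in enumerate(matrix)
                    if all(row[c] == 1 for c in cols)]
            if len(rows) >= min_rows:
                found.append((tuple(rows), cols))
    return found

m = [[1, 1, 0],
     [1, 1, 1],
     [0, 1, 1]]
biclusters = all_ones_biclusters(m)
```

On this toy matrix the enumeration recovers two overlapping biclusters, one on columns (0, 1) and one on columns (1, 2); a tree-based enumeration reaches the same set without re-testing shared column prefixes.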

Keywords: DNA microarray, biclustering, gene expression data, tree, data mining

Procedia PDF Downloads 369
25269 The Impact of Financial Reporting on Sustainability

Authors: Lynn Ruggieri

Abstract:

The worldwide pandemic has only increased sustainability awareness. The public is demanding that businesses be held accountable for their impact on the environment. While financial data enjoys uniformity in reporting requirements, there are no uniform reporting requirements for non-financial data. Europe is leading the way, with some standards being implemented for reporting non-financial sustainability data; however, there is no uniformity globally, and without uniformity there is no clear understanding of what information to include and how to disclose it. Sustainability reporting will provide important information to stakeholders and will enable businesses to understand their impact on the environment. There is therefore a crucial need for this data. This paper reviews the history of sustainability reporting in the countries of the European Union and throughout the world and makes a case for worldwide reporting requirements for sustainability.

Keywords: financial reporting, non-financial data, sustainability, global financial reporting

Procedia PDF Downloads 174