Search results for: scientific data mining
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 26171

24731 Big Data Analytics and Data Security in the Cloud via Fully Homomorphic Encryption Scheme

Authors: Victor Onomza Waziri, John K. Alhassan, Idris Ismaila, Noel Dogonyara

Abstract:

This paper describes the problem of building secure computational services for encrypted information in the Cloud: computing on encrypted data without decrypting it. It thereby meets the aspiration for a computational encryption model that could enhance the security of big data with respect to privacy or confidentiality, availability and integrity of the data, and the user's security. The cryptographic model applied for the computational processing of the encrypted data is the Fully Homomorphic Encryption Scheme. We contribute a theoretical presentation of high-level computational processes based on number theory, derivable from abstract algebra, which can easily be integrated and leveraged in the Cloud computing interface, with detailed mathematical concepts underlying the fully homomorphic encryption models. This contribution enhances the full implementation of big data analytics based on a cryptographic security algorithm.
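
As an aside for readers new to homomorphic encryption, the toy sketch below illustrates computing on ciphertexts with the Paillier scheme, which is additively (not fully) homomorphic; it shows the general idea only, not the bootstrapping-based scheme discussed in the paper, and the key sizes are far too small for real use.

```python
# Toy Paillier scheme: the sum of two plaintexts is recovered from the
# product of their ciphertexts, without ever decrypting the inputs.
from math import lcm

p, q = 293, 433                      # toy primes; real keys use >=2048-bit moduli
n = p * q
n2 = n * n
g = n + 1                            # standard choice of generator
lam = lcm(p - 1, q - 1)
mu = pow(lam, -1, n)                 # modular inverse of lambda mod n

def encrypt(m, r):
    # c = g^m * r^n mod n^2, with r coprime to n
    return (pow(g, m, n2) * pow(r, n, n2)) % n2

def decrypt(c):
    # L(x) = (x - 1) // n; m = L(c^lambda mod n^2) * mu mod n
    return (((pow(c, lam, n2) - 1) // n) * mu) % n

c1 = encrypt(20, r=17)
c2 = encrypt(22, r=31)
c_sum = (c1 * c2) % n2               # homomorphic addition of plaintexts
assert decrypt(c_sum) == 42          # 20 + 22, computed on ciphertexts
```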

Keywords: big data analytics, security, privacy, bootstrapping, Fully Homomorphic Encryption Scheme

Procedia PDF Downloads 461
24730 Assessment Proposal to Establish the First Geo-Park in Egypt at Abu-Roash Area, Cairo

Authors: Kholoud Abdelmaksoud, Mahmoud Emam, Wael Al-Metwaly

Abstract:

Egypt is known as a cradle of civilization due to its ancient history and archaeological sites, but Egypt also possesses a wealth of geo-sites, which qualify it to be listed among the most important geo-heritage sites. The geology and landscape of the Abu-Roash area make it one of the most important geological places (geo-sites) inside Cairo for knowing and understanding geology and geologic processes, so the area is used mainly for geological education purposes; the area also contains archaeological sites (a pyramid complex, tombs, and a Coptic monastery) that give it unique importance. Abu-Roash is located inside Cairo, 9 km north of the Giza Pyramids, which makes access to the area easy and safe. The geology of Abu-Roash constitutes a complex Cretaceous sedimentary succession showing outstanding tectonic features (the Syrian Arc system event); these features are considered a geo-heritage, which will be the main designation of the proposed geo-park. The research deals with the numerous geo-sites found in the area, their geologic and archaeological importance, and the relation between geo-sites and archaeology; it also proposes detailed maps for these sites depicting geo-routes and the hazardous places surrounding the Abu-Roash area. The research puts forward a proposal not applied in Egypt before: establishing a geo-park to protect this unique geo-heritage from hazardous factors and anthropogenic effects. The geo-park will also offer geo-educational opportunities to the general public and the scientific community and enhance geo-tourism, which can easily be linked with Ancient Egyptian tourism; it will likewise provide a significant economic benefit to the Abu-Roash residential area. Finally, the research recommends that the United Nations Educational, Scientific and Cultural Organization, which promotes conservation of geological and geomorphological heritage, list this area under the umbrella of geo-parks.

Keywords: geo-park, geo-sites, Abu-Roash, archaeological sites, geo-tourism

Procedia PDF Downloads 289
24729 A Qualitative Research of Online Fraud Decision-Making Process

Authors: Semire Yekta

Abstract:

Many online retailers set up manual review teams to overcome the limitations of automated online fraud detection systems. This study critically examines the strategies they adopt in their decision-making process to set apart fraudulent individuals from non-fraudulent online shoppers. The study uses a mixed-methods research approach: 32 in-depth interviews were conducted alongside participant observation and auto-ethnography. The study found that all steps of the decision-making process are significantly affected by a level of subjectivity, by personal understandings of online fraud, and by preferences and judgments, and not necessarily by objectively identifiable facts. Rather than clearly knowing who the fraudulent individuals are, the team members have to predict whether they think the customer might be a fraudster. Common strategies are relying on the classifications and fraud scorings in the automated fraud detection systems, weighing up arguments for and against the customer before making a decision, using cancellation to test customers' reactions, and making use of personal experience and "the sixth sense". Interaction in the team also plays a significant role, given that some decisions turn into a group discussion. While customer data represent the basis for the decision-making, fraud management teams frequently make use of Google Search and Google Maps to find additional information about the customer and verify whether the customer is the person they claim to be. On the one hand this raises ethical concerns; on the other hand, applying Google Street View to the address and area of the customer puts customers living in less privileged housing and areas at a higher risk of being classified as fraudsters. Phone validation is used as a final measure to decide for or against the customer when the previous strategies and Google Search do not suffice. However, phone validation is also characterized by individuals' subjectivity and by personal views and judgments of the customer's reaction on the phone, which result in the final classification as genuine or fraudulent.

Keywords: online fraud, data mining, manual review, social construction

Procedia PDF Downloads 331
24728 Chemometric Regression Analysis of Radical Scavenging Ability of Kombucha Fermented Kefir-Like Products

Authors: Strahinja Kovacevic, Milica Karadzic Banjac, Jasmina Vitas, Stefan Vukmanovic, Radomir Malbasa, Lidija Jevric, Sanja Podunavac-Kuzmanovic

Abstract:

The present study deals with chemometric regression analysis of quality parameters and the radical scavenging ability of kombucha-fermented kefir-like products obtained with winter savory (WS), peppermint (P), stinging nettle (SN) and wild thyme (WT) tea kombucha inoculums. Each analyzed sample was described by milk fat content (MF, %), total unsaturated fatty acids content (TUFA, %), monounsaturated fatty acids content (MUFA, %), polyunsaturated fatty acids content (PUFA, %), free radical scavenging ability (RSA-DPPH, % and RSA-•OH, %) and pH values measured each hour from the start until the end of fermentation. The aim of the regression analysis was to establish chemometric models that can predict the radical scavenging ability (RSA-DPPH, % and RSA-•OH, %) of the samples by correlating it with MF, TUFA, MUFA and PUFA content and the pH value at the beginning, in the middle and at the end of the fermentation process, which lasted between 11 and 17 hours, until a pH value of 4.5 was reached. The analysis was carried out by applying univariate linear regression (ULR) and multiple linear regression (MLR) methods to the raw data and to data standardized by min-max normalization. The obtained models were characterized by very limited predictive power (poor cross-validation parameters) and weak statistical characteristics. Based on the conducted analysis, it can be concluded that the radical scavenging ability cannot be precisely predicted on the basis of MF, TUFA, MUFA and PUFA content and pH values alone; other quality parameters should be considered and included in further modeling. This study is based upon work from the project "Kombucha beverages production using alternative substrates from the territory of the Autonomous Province of Vojvodina", 142-451-2400/2019-03, supported by the Provincial Secretariat for Higher Education and Scientific Research of AP Vojvodina.
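
A minimal sketch of this modelling setup (min-max normalization, MLR, and cross-validation), assuming synthetic stand-in data rather than the study's measurements:

```python
# Min-max scaling + multiple linear regression, scored by leave-one-out
# cross-validation; a low cross-validated R2 mirrors the weak predictive
# power the abstract reports.
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import LeaveOneOut, cross_val_predict
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import MinMaxScaler
from sklearn.metrics import r2_score

rng = np.random.default_rng(0)
X = rng.uniform(size=(24, 5))          # columns: MF, TUFA, MUFA, PUFA, pH_start
y = rng.uniform(20, 80, size=24)       # RSA-DPPH (%) -- synthetic

model = make_pipeline(MinMaxScaler(), LinearRegression())
y_cv = cross_val_predict(model, X, y, cv=LeaveOneOut())
print("cross-validated R2:", r2_score(y, y_cv))   # low value => weak model
```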

Keywords: chemometrics, regression analysis, kombucha, quality control

Procedia PDF Downloads 124
24727 Bibliometric Analysis of Global Research Trends on Organization Culture, Strategic Leadership and Performance Using Scopus Database

Authors: Anyia Nduka, Aslan Bin Amad Senin

Abstract:

Taking a behavioral perspective on organizational culture, strategic leadership, and performance (OC, SL, P), we examine the role of strategic leadership as a key mechanism linking organizational culture to organizational capacities. Given the increasing dependence of modern businesses on the use and scientific discovery of relevant data, research efforts around the globe have accelerated. In today's corporate world, strategic leadership is still the most sustainable source of performance and competitive advantage. This is why it is critical to gain a deep understanding of the research area and to strengthen new collaborative networks that support the transition towards these integrative efforts. This bibliometric analysis examines global trends in OC, SL and performance research based on publication output, author co-authorships, and co-occurrences of author keywords among authors and affiliated countries. 2,829 journal articles published between 1974 and 2021 were retrieved from the Scopus database. The findings show a significant increase in the number of publications, with strong global collaboration (e.g., USA and UK). We also found that, while most countries/territories without affiliations were centered in developing countries, the outstanding performance of Asian countries and the volume of their collaborations should be emulated.

Keywords: organizational culture, strategic leadership, organizational resilience, performance

Procedia PDF Downloads 58
24726 An Approximation of Daily Rainfall by Using a Pixel Value Data Approach

Authors: Sarisa Pinkham, Kanyarat Bussaban

Abstract:

The research aims to approximate the amount of daily rainfall by using a pixel value data approach. The daily rainfall maps from the Thailand Meteorological Department for the period from January to December 2013 were the data used in this study. The results showed that this approach can approximate the amount of daily rainfall with an RMSE of 3.343.
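
A minimal sketch of one plausible pixel-value approach, assuming the rainfall map encodes amounts through a colour legend; the legend colours, class amounts, and gauge values below are hypothetical, not the study's:

```python
# Match each map pixel to its nearest legend colour, read off that class's
# rainfall amount, and score the estimates against gauge observations (RMSE).
import numpy as np

legend_rgb = np.array([[210, 235, 250], [120, 180, 230], [30, 90, 180]])  # light -> heavy
legend_mm = np.array([2.5, 15.0, 45.0])          # rainfall (mm) per legend class

def rainfall_from_pixels(pixels_rgb):
    # distance from each pixel to each legend colour; pick the closest class
    d = np.linalg.norm(pixels_rgb[:, None, :] - legend_rgb[None, :, :], axis=2)
    return legend_mm[d.argmin(axis=1)]

pixels = np.array([[205, 232, 248], [118, 178, 228], [35, 95, 175]])  # sampled at gauges
gauges = np.array([3.1, 17.2, 41.0])             # observed daily rainfall (mm)
est = rainfall_from_pixels(pixels)
rmse = np.sqrt(np.mean((est - gauges) ** 2))
print(f"RMSE = {rmse:.3f} mm")
```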

Keywords: daily rainfall, image processing, approximation, pixel value data

Procedia PDF Downloads 372
24725 Reliability Analysis of a Life Support System in a Public Aquarium

Authors: Mehmet Savsar

Abstract:

Complex Life Support Systems (LSS) are used in all large commercial and public aquariums in order to keep the fish alive. The reliability of individual equipment, as well as of the complete system, is extremely important and critical, since the life and safety of important fish depend on these life support systems. Failure of a critical device or piece of equipment that has no redundancy results in negative consequences and affects life support as a whole. In this paper, we have considered a life support system in a large public aquarium in the Kuwait Scientific Center and presented a procedure and analysis to show how the reliability of such systems can be estimated using appropriate tools and collected data. We have also proposed possible improvements to system reliability. In particular, the addition of parallel components and spare parts is considered, and the number of spare parts needed for each component to achieve a required reliability during a specified lead time is calculated. The results show that significant improvements in system reliability can be achieved by operating some LSS components in parallel and by keeping certain numbers of spares in the spare parts inventories. The procedures and results presented in this paper are expected to be useful for aquarium engineers and maintenance managers dealing with LSS.
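
A minimal sketch of the spare-parts sizing step, under the standard assumption that failures arrive as a Poisson process; the failure rate, lead time, and reliability target below are illustrative, not the aquarium's data:

```python
# With failure rate lam (failures/hour) and lead time T (hours), demand
# during the lead time is Poisson(lam * T); stock the smallest s whose
# probability of covering demand meets the reliability target.
from math import exp, factorial

def spares_needed(lam, lead_time, target):
    m = lam * lead_time                      # expected demand during lead time
    s, cdf = 0, exp(-m)                      # P(demand = 0)
    while cdf < target:
        s += 1
        cdf += exp(-m) * m ** s / factorial(s)
    return s

# e.g. a pump failing once per 4000 h on average, 720 h (30-day) lead time:
print(spares_needed(lam=1 / 4000, lead_time=720, target=0.99))   # -> 2 spares
```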

Keywords: life support systems, aquariums, reliability, failures, availability, spare parts

Procedia PDF Downloads 267
24724 Complementing Assessment Processes with Standardized Tests: A Work in Progress

Authors: Amparo Camacho

Abstract:

ABET-accredited programs must assess the development of student learning outcomes (SOs) in engineering programs. Different institutions implement different strategies for this assessment, usually designed "in house." This paper presents a proposal for including standardized tests to complement the ABET assessment model in an engineering college made up of six distinct engineering programs. The engineering college formulated a model of quality assurance in education, implemented throughout the six engineering programs, to regularly assess and evaluate the achievement of SOs in each program offered. The model uses diverse techniques and sources of data to assess student performance and to implement improvement actions based on the results of this assessment. The model is called the "Assessment Process Model" and it includes SOs A through K, as defined by ABET. SOs can be divided into two categories: "hard skills" and "professional skills" (soft skills). The first includes abilities such as applying knowledge of mathematics, science, and engineering, and designing and conducting experiments, as well as analyzing and interpreting data. The second category, "professional skills", includes communicating effectively and understanding professional and ethical responsibility. Within the Assessment Process Model, various tools were used to assess SOs related to both "hard" and "soft" skills. The assessment tools designed included rubrics, surveys, questionnaires, and portfolios. In addition to these instruments, the engineering college decided to use tools that systematically gather consistent quantitative data. For this reason, an in-house exam was designed and implemented, based on the curriculum of each program. Even though this exam has been administered during various academic periods, it is not currently considered standardized. In 2017, the engineering college added three standardized tests: one to assess mathematical and scientific reasoning and two more to assess reading and writing abilities. With these exams, the college hopes to obtain complementary information that can help better measure the development of both the hard and soft skills of students in the different engineering programs. In the first semester of 2017, the three exams were given to three sample groups of students from the six engineering programs, drawn from the first-, fifth-, and tenth-semester cohorts. At the time of submission of this paper, the engineering college has descriptive statistical data and is working with statisticians on a more in-depth and detailed analysis of the sample groups' achievement on the three exams. The overall objective of including standardized exams in the assessment model is to identify more precisely the least developed SOs, in order to define and implement the educational strategies necessary for students to achieve them in each engineering program.

Keywords: assessment, hard skills, soft skills, standardized tests

Procedia PDF Downloads 268
24723 Selecting the Contractor Using Multi-Criteria Decision Making in the National Gas Company of Lorestan Province, Iran

Authors: Fatemeh Jaferi, Moslem Parsa, Heshmatolah Shams Khorramabadi

Abstract:

In this modern, fluctuating world, organizations need to outsource some parts of their activities (projects) to providers in order to respond quickly to their changing requirements. In fact, many companies and institutes have contractors execute their projects and apply specific criteria in contractor selection. Therefore, a set of scientific tools is needed to select the best contractors to execute the project according to appropriate criteria. Multi-criteria decision making (MCDM) is employed in the present study as a powerful tool for ranking and selecting the appropriate contractor. The case considered is the award of a second-source (civil) project to contractors in the National Gas Company of Lorestan Province (Iran), for which 5 civil companies were evaluated. Evaluation criteria include executive experience, qualification of technical staff, track record and company rating, technical interview, affordability, and equipment and machinery. Criteria weights are derived from experts' opinions using AHP, and the contractors are ranked using TOPSIS and AHP. The ranking of the contractors differs between the MCDM methods as the formula changes. In the next phase, the number of criteria and their weights are subjected to a sensitivity analysis using AHP: adding a criterion changed the contractors' ranking, and, similarly, changing the weights resulted in a change in the ranking. Adopting the stated strategy shows not only that an appropriate scientific method is available to select the most qualified contractors to execute gas projects, but also that great attention must be paid to choosing the criteria used for selecting contractors. Consequently, such projects are executed by the most qualified contractors, resulting in optimal use of limited resources, accelerated project implementation, increased quality and, finally, improved organizational efficiency.
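
A minimal TOPSIS sketch of the ranking step; the decision matrix and AHP-derived weights are hypothetical, and all seven criteria are treated as benefit criteria (higher is better) for simplicity:

```python
# TOPSIS: normalise and weight the decision matrix, measure each
# alternative's distance to the ideal and anti-ideal solutions, and rank
# by relative closeness.
import numpy as np

scores = np.array([              # rows: 5 contractors, cols: 7 criteria
    [7, 8, 6, 7, 5, 6, 7],
    [6, 7, 8, 6, 7, 7, 6],
    [8, 6, 7, 8, 6, 5, 7],
    [5, 7, 6, 6, 8, 7, 6],
    [7, 7, 7, 7, 6, 6, 8],
], dtype=float)
weights = np.array([0.20, 0.18, 0.12, 0.15, 0.15, 0.10, 0.10])  # from AHP

v = scores / np.linalg.norm(scores, axis=0) * weights   # weighted normalised matrix
ideal, anti = v.max(axis=0), v.min(axis=0)              # ideal / anti-ideal solutions
d_pos = np.linalg.norm(v - ideal, axis=1)
d_neg = np.linalg.norm(v - anti, axis=1)
closeness = d_neg / (d_pos + d_neg)                     # higher = better
print("ranking (best first):", np.argsort(-closeness) + 1)
```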

Keywords: multi-criteria decision making, project, management, contractor selection, gas company

Procedia PDF Downloads 382
24722 The Effect of Measurement Distribution on System Identification and Detection of Behavior of Nonlinearities of Data

Authors: Mohammad Javad Mollakazemi, Farhad Asadi, Aref Ghafouri

Abstract:

In this paper, we consider and apply parametric modeling to experimental data from a dynamical system. We investigate different distributions of output measurements from several dynamical systems. By variance processing of the experimental data, we obtain the region of nonlinearity in the data, and identification of the output section is then applied under different situations and data distributions. Finally, the effect of the measurement span (such as the variance) on identification, and the limitations of this approach, are explained.

Keywords: Gaussian process, nonlinearity distribution, particle filter, system identification

Procedia PDF Downloads 495
24721 Improving Efficiency of Organizational Performance: The Role of Human Resources in Supply Chains and Job Rotation Practice

Authors: Moh'd Anwer Al-Shboul

Abstract:

Jordan Customs (JC) was established to achieve objectives consistent with the guidance of the country's leadership and its aspirations for tomorrow. It has therefore developed the tools needed to provide a distinguished service, simplifying work procedures and using modern technologies. A supply chain (SC) consists of all parties that are involved, directly or indirectly, in fulfilling a customer request, including manufacturers, suppliers, shippers, retailers and even customer brokers. Within each firm, the SC includes all functions involved in receiving and filling a customer's request; one of the main functions is customer service. JC and global SCs are evolving into dynamic environments, which require flexibility, effective communication, and team management. Thus, human resources (HR) insight in these areas is critical for the effective development of global process networks. The importance of HR has increased significantly because the contribution of employees depends on their knowledge, competencies, abilities, skills, and motivation. Strategic planning in JC began at the end of the 1990s, including an operational strategy for Human Resource Management and Development (HRM&D). However, a huge transformation in human resources happened at the end of 2006: a new employees' regulation for customs was prepared, approved and applied at the end of 2007. As a result, many employees lost their positions, while others were selected through a professional recruitment and selection process ("new blood"). One of several policies applied by the HR department at JC is job rotation. From the researcher's point of view, it was not placed on a scientific footing that would achieve its goals and objectives, which ultimately has a significant negative impact on organizational performance (OP) and results in a weak job rotation approach. The purpose of this study is to call attention to the need to re-review the process and procedure of job rotation that the HRM directorate currently applies at JC. Furthermore, it presents an overview of managing the HR in the SC network that affects their success. The research methodology employed in this study is qualitative: a few interviews were conducted with managers, internal employees and external clients, and the related literature was reviewed to collect qualitative data from secondary sources. The results of this study show that a frequent and unstructured job rotation policy (i.e., monthly rotation) has a significant negative impact on JC performance as a whole, affecting three main elements: (1) internal employees' performance; (2) external clients dealing with customs services; and (3) JC performance overall. In order to implement a successful job rotation technique at JC in a scientific way and to achieve its goals and objectives, JC should take into consideration the solutions and recommendations presented in this study.

Keywords: efficiency, supply chain, human resources, job rotation, organizational performance, Jordan customs

Procedia PDF Downloads 198
24720 Building a Scalable Telemetry Based Multiclass Predictive Maintenance Model in R

Authors: Jaya Mathew

Abstract:

Many organizations are faced with the challenge of how to analyze and build machine learning models using their sensitive telemetry data. In this paper, we discuss how users can leverage the power of R without having to move their big data around, as well as a cloud-based solution for organizations willing to host their data in the cloud. By using ScaleR technology to benefit from parallelization and remote computing, or R Services on premises or in the cloud, users can leverage the power of R at scale without having to move their data.

Keywords: predictive maintenance, machine learning, big data, cloud based, on premise solution, R

Procedia PDF Downloads 358
24719 Correlation between Funding and Publications: A Pre-Step towards Future Research Prediction

Authors: Ning Kang, Marius Doornenbal

Abstract:

Funding is a very important, if not crucial, resource for research projects. Usually, funding organizations publish a description of the funded research to describe the scope of the funding award. Logically, we would expect research outcomes to align with this funding award. For that reason, we might be able to predict future research topics based on present funding award data. That said, it remains to be shown if and how future research topics can be predicted from funding information. In this paper, we extract funding project information and the abstracts of the papers the projects generated from the Gateway to Research database as one group, and use papers from the same domains and publication years in the Scopus database as a baseline comparison group. We annotate both the project awards and the papers resulting from the funded projects with linguistic features (noun phrases), and then calculate tf-idf vectors and the cosine similarity between these two sets of features. We show that the cosine similarity within the project/generated-papers group is higher than within the project/baseline group, and that these two groups of similarities are significantly different. Based on this result, we conclude that funding information correlates with the content of the funded project's future research output at the topical level. How funding really changes the course of science or of scientific careers remains an elusive question.
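
A minimal sketch of the similarity computation, using scikit-learn in place of the authors' pipeline; noun-phrase extraction is simplified to raw tokens, and the texts are invented stand-ins for award descriptions and paper abstracts:

```python
# tf-idf vectors over a shared vocabulary, then cosine similarity between
# the award text and each paper; the funded output should score higher
# than the unrelated baseline.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

award = ["data mining methods for predicting research topics from funding data"]
papers = [
    "predicting emerging research topics with funding award text mining",  # funded output
    "thermal performance of building facades in hot climates",             # baseline paper
]

vec = TfidfVectorizer(stop_words="english")
m = vec.fit_transform(award + papers)
sims = cosine_similarity(m[0], m[1:]).ravel()
print("award vs funded paper:", round(sims[0], 3))
print("award vs baseline:   ", round(sims[1], 3))   # expected to be lower
```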

Keywords: natural language processing, noun phrase, tf-idf, cosine similarity

Procedia PDF Downloads 229
24718 Wireless Transmission of Big Data Using Novel Secure Algorithm

Authors: K. Thiagarajan, K. Saranya, A. Veeraiah, B. Sudha

Abstract:

This paper presents a novel algorithm for secure, reliable and flexible transmission of big data in two-hop wireless networks using a cooperative jamming scheme. Two-hop wireless networks consist of source, relay and destination nodes. Big data has to be transmitted from source to relay and from relay to destination with security deployed at the physical layer. The cooperative jamming scheme makes the transmission of big data more secure by protecting it from eavesdroppers and malicious nodes of unknown location. The novel algorithm, which ensures secure and energy-balanced transmission of big data, includes selection of the data transmission region, segmentation of the selected region, determination of a probability ratio for each node (capture node, non-capture node and eavesdropper node) in every segment, and evaluation of the probability using a binary-based evaluation. If the transmission is deemed secure, the two-hop transmission of big data resumes; otherwise, the attackers are countered by the cooperative jamming scheme and the data is then sent over the two-hop transmission.

Keywords: big data, two-hop transmission, physical layer wireless security, cooperative jamming, energy balance

Procedia PDF Downloads 468
24717 One Step Further: Pull-Process-Push Data Processing

Authors: Romeo Botes, Imelda Smit

Abstract:

In today's modern age of technology, vast amounts of data need to be processed in real time to keep users satisfied. This data comes from various sources and in many formats, including electronic and mobile devices such as GPRS modems and GPS devices. These make use of different protocols, including TCP, UDP, and HTTP/S, for data communication to web servers and eventually to users. The data obtained from these devices may provide valuable information to users, but it is mostly in an unreadable format that needs to be processed to yield information and business intelligence. This data is not always current; it is mostly historical data, and it is not subject to the consistency and redundancy measures that most other data usually is. Most important to the users is that the data be pre-processed into a readable format when it is entered into the database. To accomplish this, programmers build processing programs and scripts to decode and process the information stored in databases. Programmers use various techniques in such programs, but sometimes neglect the effect some of these techniques may have on database performance. One technique generally used is to pull data from the database server, process it and push it back to the database server in one single step. Since the processing of the data usually takes some time, it keeps the database busy and locked for the period that the processing takes place, which decreases the overall performance of the database server and therefore of the system. This paper follows on a paper discussing the performance increase that may be achieved by utilizing array lists along with a pull-process-push data processing technique split into three steps. The purpose of this paper is to expand the number of clients when comparing the two techniques, to establish the impact this may have on the performance of the CPU, storage and processing time.
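
A minimal sketch of the three-step pull-process-push technique, with sqlite3 standing in for the production database server and a hypothetical telemetry table; the point is that the database is only touched (and locked) in steps 1 and 3:

```python
# Pull rows into an in-memory list, release the database, process offline,
# then push the results back in one short batched transaction.
import sqlite3

def pull_process_push(db_path):
    # Step 1: PULL -- read raw telemetry rows into an array list, disconnect.
    con = sqlite3.connect(db_path)
    rows = con.execute("SELECT id, raw_payload FROM telemetry").fetchall()
    con.close()                                  # database is free while we work

    # Step 2: PROCESS -- decode in memory; no locks are held during this step.
    decoded = [(payload.strip().upper(), row_id) for row_id, payload in rows]

    # Step 3: PUSH -- write everything back in a single short transaction.
    con = sqlite3.connect(db_path)
    with con:
        con.executemany("UPDATE telemetry SET decoded = ? WHERE id = ?", decoded)
    con.close()
```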

Keywords: performance measures, algorithm techniques, data processing, push data, process data, array list

Procedia PDF Downloads 228
24716 Linkages between Postponement Strategies and Flexibility in Organizations

Authors: Polycarpe Feussi

Abstract:

Globalization and increasing technological and customer changes, amongst other drivers, result in higher levels of uncertainty and unpredictability for organizations. In order to cope with this uncertain and fast-changing economic and business environment, organizations need to innovate in order to achieve flexibility. In simple terms, organizations must develop strategies leading to the ability to provide horizontal information connections across the supply chain, so as to create and deliver products that meet customer needs by synchronizing customer demand with product creation. The information generated will create efficiency and effectiveness throughout the whole supply chain regarding production, storage, and distribution, as well as eliminating redundant activities and reducing response time. In an integrated supply chain, spanning activities include coordination with distributors and suppliers. This paper explains how flexibility can be achieved in an organization through postponement strategies. To achieve the above, a thorough literature review was conducted by searching online sources containing material from scientific journal databases, articles, and textbooks on the subject of postponement and flexibility. The findings of the research are presented in the last part of the paper; the first part introduces the concept of postponement and its importance in supply chain management, and the second part describes the methodology used in writing the paper.

Keywords: postponement strategies, supply chain management, flexibility, logistics

Procedia PDF Downloads 181
24715 Effect of Serine/Threonine Kinases on Autophagy Mechanism

Authors: Ozlem Oral, Seval Kilic, Ozlem Yedier, Serap Dokmeci, Devrim Gozuacik

Abstract:

Autophagy is a degradation pathway activated under stress conditions. It digests macromolecules, such as abnormal proteins, and long-lived organelles by engulfing them and subsequently delivering the cargo to lysosomes. Members of the phospholipid-dependent serine/threonine kinases are involved in many signaling pathways that are necessary for the regulation of cellular metabolic activation. Previous studies indicate that serine/threonine kinases have crucial roles in the mechanisms of many diseases, depending on the signaling pathway that is activated and/or inactivated. The data indicate that the signaling pathways activated by serine/threonine kinases are also involved in the activation of the autophagy mechanism. However, information about the effect of serine/threonine kinases on the autophagy mechanism, and the roles of these effects in disease formation, is limited. In this study, we investigated the effect of activated serine/threonine kinases on the autophagic pathway. We performed a commonly used autophagy technique, GFP-LC3 dot formation, and, using microscopy analyses, evaluated the promotion and/or inhibition of autophagy in serine/threonine kinase-overexpressing fibroblasts as well as cancer cells. In addition, we carried out confocal microscopy analyses and examined autophagic flux by utilizing the differential pH sensitivities of RFP and GFP in the mRFP-GFP-LC3 probe. Based on an shRNA-library-based screening, we identified autophagy-related proteins affected by serine/threonine kinases. We further studied the involvement of serine/threonine kinases in the molecular mechanism of the newly identified autophagy proteins and found that the autophagic pathway is indirectly controlled by serine/threonine kinases via specific autophagic proteins. Our data indicate a molecular connection between two critical cellular mechanisms that have important roles in the formation of many disease pathologies, particularly cancer. This project is supported by the TUBITAK-1001 Scientific and Technological Research Projects Funding Program, Project No: 114Z836.

Keywords: autophagy, GFP-LC3 dot formation assay, serine/threonine kinases, shRNA-library screening

Procedia PDF Downloads 274
24714 Third Eye: A Hybrid Portrayal of Visuospatial Attention through Eye Tracking Research and Modular Arithmetic

Authors: Shareefa Abdullah Al-Maqtari, Ruzaika Omar Basaree, Rafeah Legino

Abstract:

A pictorial representation of hybrid forms in science-art collaboration has become a crucial issue in the course of developing a new painting technique. This is directly related to the reception of an invisible-recognition phenomenology. In the hybrid pictorial representation of invisible-recognition phenomenology, the challenging issue is how to depict the pictorial features of indescribable objects from their mental source, modality and transparency. This paper proposes the hybrid painting technique Demonstrate, Resemble, and Synthesize (DRS) through a combination of the hybrid aspect-recognition representation of understanding pictures, the demonstrative mode, number theory, patterns in the modular arithmetic system, and the coherence theory of visual attention in dynamic scene representation. Multi-method digital gaze data analyses, a pattern-modular table operation design, and a rotation parameter were used for the visualization. In the scientific process, an eye-tracking study based on video sections was conducted using Tobii T60 remote eye-tracking hardware and Tobii Studio analysis software to collect and analyze the eye movements of ten participants watching the video clip of Alexander Paulikevitch's performance "Tajwal". Results: we found that fixation count in section one was positively and moderately correlated with section two, Pearson's r = .10, p < .05 (2-tailed), as was fixation duration, Pearson's r = .10, p < .05 (2-tailed). However, a paired-samples t-test indicates that scores were significantly higher for section one (M = 2.2, SD = .6) than for section two (M = 1.93, SD = .6), t(9) = 2.44, p < .05, d = 0.87. In the visual process, the exported gaze-count data N resembled the hybrid forms of visuospatial attention using the table-mod-analysis operation. The explored hybrid guideline was simple to apply, and it could serve as an alternative approach to the sustainability of contemporary visual arts.
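
A minimal sketch of the reported statistics (Pearson correlation and a paired-samples t-test) using SciPy on invented per-participant fixation counts for the ten viewers; the resulting numbers will differ from the paper's:

```python
# Pearson correlation between the two video sections, a paired-samples
# t-test, and Cohen's d for the paired difference.
import numpy as np
from scipy import stats

section1 = np.array([2.9, 2.1, 1.8, 2.5, 2.3, 1.6, 2.0, 2.7, 2.2, 1.9])
section2 = np.array([2.4, 1.9, 1.7, 2.1, 2.0, 1.5, 1.8, 2.3, 2.0, 1.6])

r, p_r = stats.pearsonr(section1, section2)      # correlation between sections
t, p_t = stats.ttest_rel(section1, section2)     # paired-samples t-test
d = (section1 - section2).mean() / (section1 - section2).std(ddof=1)  # Cohen's d
print(f"r = {r:.2f} (p = {p_r:.3f}); t(9) = {t:.2f} (p = {p_t:.3f}); d = {d:.2f}")
```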

Keywords: science-art collaboration, hybrid forms, pictorial representation, visuospatial attention, modular arithmetic

Procedia PDF Downloads 349
24713 Green Accounting and Firm Performance: A Bibliometric Literature Review

Authors: Francesca di Donato, Sara Trucco

Abstract:

Green accounting is a topic of growing interest. Indeed, nowadays most firms affect the environment; therefore, companies are seeking the best way to disclose environmental information. Furthermore, companies are increasingly committed to improving the environment, and the topic is gaining importance with the public, governments, and policymakers. Green accounting is a type of accounting that considers environmental costs and their impact on the financial performance of firms. The motivation of the current research is therefore to investigate the state of the art of the literature on the relationship between green accounting and firm performance since the birth of the topic, and to identify gaps in the literature that represent fruitful terrain for future research. In doing so, this study provides a bibliometric literature review of existing evidence on the link between green accounting and firm performance since 2000. The search, based on the most relevant databases for scientific journals (Scopus, Emerald, Web of Science, Google Scholar, and Econlit), returned 1917 scientific articles. The articles were manually reviewed to identify only the relevant studies in the field, excluding articles whose titles and abstracts were out of scope; the final sample comprised 107 articles. A content analysis was carried out on the final sample, and a classification system is proposed. Findings show the most relevant environmental costs and issues considered in previous studies and how green accounting may be linked to the financial and non-financial performance of a firm. The study also offers suggestions for future research in this domain. This study has several practical implications. The topic of green accounting applies to different sectors and different types of companies; therefore, this study may help managers better understand which environmental information is most relevant to disclose and how environmental issues may be managed to improve firm performance. Moreover, the bibliometric literature review may be of interest to stakeholders interested in the historical evolution of the topic.

Keywords: bibliometric literature review, firm performance, green accounting, literature review

Procedia PDF Downloads 46
24712 Extreme Temperature Forecast in Mbonge, Cameroon Through Return Level Analysis of the Generalized Extreme Value (GEV) Distribution

Authors: Nkongho Ayuketang Arreyndip, Ebobenow Joseph

Abstract:

In this paper, temperature extremes are forecast by employing the block maxima method of the generalized extreme value (GEV) distribution to analyse temperature data from the Cameroon Development Corporation (CDC). Considering two data sets (raw data and simulated data) and two models (stationary and non-stationary) of the GEV distribution, a return level analysis is carried out. It was found that in the stationary model the return values are constant over time for the raw data, while for the simulated data the return values show an increasing trend with an upper bound. In the non-stationary model, the return levels of both the raw and the simulated data show an increasing trend with an upper bound. This clearly shows that although temperatures in the tropics show signs of increasing in the future, there is a maximum temperature that is never exceeded. The results of this paper are very important for agricultural and environmental research.
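
A minimal sketch of the stationary block-maxima analysis on a synthetic series; note that SciPy's shape parameter c equals -xi in the usual GEV(mu, sigma, xi) parameterisation, so c > 0 corresponds to the Weibull type with return levels bounded above, as the abstract describes:

```python
# Fit a GEV to annual maximum temperatures and read off return levels
# via the quantile function: z_T = F^{-1}(1 - 1/T).
import numpy as np
from scipy.stats import genextreme

# synthetic annual maxima (deg C); real data would be CDC block maxima
annual_max = 33 + genextreme.rvs(c=0.2, scale=1.2, size=40, random_state=42)

c, loc, scale = genextreme.fit(annual_max)
for T in (5, 10, 50, 100):                       # return periods in years
    level = genextreme.ppf(1 - 1 / T, c, loc=loc, scale=scale)
    print(f"{T:>3}-year return level: {level:.2f} deg C")
# With c > 0 the fitted distribution has a finite upper endpoint, so the
# return levels approach a maximum temperature that is never exceeded.
```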

Keywords: forecasting, generalized extreme value (GEV), meteorology, return level

Procedia PDF Downloads 453
24711 The Family Resemblance in the Handwriting of Painters: Jacek and Rafał Malczewski’s Case

Authors: Olivia Rybak-Karkosz

Abstract:

This paper presents the results of scientific research on family resemblance in the handwriting of painters. The problem is known in handwriting analysis, but it has never been a research subject in the scope of painters' signatures on works of art. For this research, the author chose Jacek and Rafał Malczewski (father and son), as many of their paintings are in museums and most of them are signed. The aim was to create a catalogue of similar traits in the handwriting of both artists. Such data could be helpful for an expert's opinion in the decision-making process of establishing whether a signature is authentic and, if so, whether it belongs to the artist whose signature is being analysed and not to another family member. There are known examples of relatives of artists signing their works, many of whom were artists themselves. For instance, Andrzej Wróblewski's mother, Krystyna, was a printmaker, and to preserve his legacy, she signed many of her son's works after his death using his name. The research methodology consisted of compiling representative samples of the signatures of both artists, collected in selected Polish museums, and then creating a catalogue of traits using a forensic handwriting graphic-comparative method (graphic method). The paper concludes that this could be one of the elements of research in an expert's analysis of the authenticity of signatures on paintings.

Keywords: artist’s signatures, authenticity of an artwork, forensic handwriting analysis, graphic-comparative method

Procedia PDF Downloads 69
24710 Impact of Stack Caches: Locality Awareness and Cost Effectiveness

Authors: Abdulrahman K. Alshegaifi, Chun-Hsi Huang

Abstract:

Treating data based on its location in memory has received much attention in recent years due to its distinct properties, which offer important opportunities for cache utilization. Stack data and non-stack data may interfere with each other's locality in the data cache. One of the important properties of stack data is its high spatial and temporal locality. In this work, we simulate a non-unified cache design that splits the data cache into stack and non-stack caches in order to keep stack data and non-stack data separate. We observe that the overall hit rate of the non-unified cache design is sensitive to the size of the non-stack cache. We then investigate the appropriate size and associativity for the stack cache to achieve a high hit ratio, especially since over 99% of accesses are directed to the stack cache. The results show that, on average, a stack cache hit rate of more than 99% is achieved using 2KB of capacity and 1-way associativity. Further, we analyze the improvement in hit rate when adding a small, fixed-size stack cache at level 1 to a unified cache architecture. The results show that the overall hit rate of the unified cache design with an added 1KB stack cache improves by approximately 3.9% on average for the Rijndael benchmark. The stack cache is simulated using the SimpleScalar toolset.
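
A minimal sketch of a hit-rate measurement for a 2KB, 1-way (direct-mapped) stack cache, driven by a synthetic trace that only mimics the high locality of stack accesses; the study itself used the SimpleScalar toolset with benchmark traces such as Rijndael:

```python
# Direct-mapped 2KB cache with 32-byte lines; the trace stays close to a
# moving stack pointer, so locality (and the hit rate) is very high.
import random

LINE, SETS = 32, 2048 // 32           # 2KB capacity, 64 sets, 1-way
tags = [None] * SETS
hits = total = 0

sp = 0x7FFF_F000                      # synthetic stack pointer
random.seed(0)
for _ in range(100_000):
    sp += random.choice((-16, -8, 0, 8, 16))   # pushes/pops stay near the top
    addr = sp + random.randrange(0, 64)        # locals near the stack pointer
    line = addr // LINE
    idx, tag = line % SETS, line // SETS
    total += 1
    if tags[idx] == tag:
        hits += 1
    else:
        tags[idx] = tag               # fill the line on a miss
print(f"stack-cache hit rate: {hits / total:.1%}")
```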

Keywords: hit rate, locality of program, stack cache, stack data

Procedia PDF Downloads 287
24709 Application of a Generalized Additive Model to Reveal the Relations between the Density of Zooplankton with Other Variables in the West Daya Bay, China

Authors: Weiwen Li, Hao Huang, Chengmao You, Jianji Liao, Lei Wang, Lina An

Abstract:

Zooplankton are a central topic in ecology, as they make a great contribution to maintaining the balance of an ecosystem and are critical in promoting the material cycle and energy flow within ecosystems. A generalized additive model (GAM) was applied to analyze the relationships between the density (individuals per m³) of zooplankton and other variables in West Daya Bay. All data used in this analysis (the survey month, the survey station (longitude and latitude), the depth of the water column, the superficial concentration of chlorophyll a, the benthonic concentration of chlorophyll a, the number of zooplankton species and the number of zooplankton individuals) were collected through monthly scientific surveys from January to December 2016. A generalized linear model (GLM) was used to choose the variables with a significant impact on the density of zooplankton, and the GAM was then employed to analyze the relationship between the density of zooplankton and these significant variables. The results showed that the density of zooplankton increased with an increase in the benthonic concentration of chlorophyll a, but decreased with a decrease in the depth of the water column. Both a high number of zooplankton species and a high total number of zooplankton individuals led to a higher density of zooplankton.
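
A minimal sketch of the GAM step using the pygam library (assumed here; the paper does not name its software) on synthetic stand-ins for two of the predictors the abstract highlights:

```python
# Fit a GAM with one smooth term per predictor and inspect each term's
# partial effect on zooplankton density.
import numpy as np
from pygam import LinearGAM, s

rng = np.random.default_rng(2)
chl_benthic = rng.uniform(0.5, 8.0, 120)          # mg/m^3 (synthetic)
depth = rng.uniform(5, 40, 120)                   # m (synthetic)
density = 200 + 35 * chl_benthic - 4 * depth + rng.normal(0, 25, 120)

X = np.column_stack([chl_benthic, depth])
gam = LinearGAM(s(0) + s(1)).fit(X, density)      # smooth term per predictor
gam.summary()                                      # effective dof, p-values per term

# Partial dependence shows the direction of each effect:
for i, name in enumerate(["benthic chl-a", "depth"]):
    XX = gam.generate_X_grid(term=i)
    pd = gam.partial_dependence(term=i, X=XX)
    print(name, "effect from low to high:", pd[0].round(1), "->", pd[-1].round(1))
```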

Keywords: density, generalized linear model, generalized additive model, the West Daya Bay, zooplankton

Procedia PDF Downloads 133
24708 Autonomic Threat Avoidance and Self-Healing in Database Management System

Authors: Wajahat Munir, Muhammad Haseeb, Adeel Anjum, Basit Raza, Ahmad Kamran Malik

Abstract:

Databases are key components of software systems. Due to the exponential growth of data, there is a concern that data be accurate and available. The data in databases is vulnerable to internal and external threats, especially when it contains sensitive data, as in medical or military applications. Whenever the data is changed with malicious intent, data analysis results may lead to disastrous decisions. Autonomic self-healing in computer systems is modeled on the autonomic system of the human body. In order to guarantee the accuracy and availability of data, we propose a technique that, on a priority basis, tries to prevent any malicious transaction from executing and, in case a malicious transaction does affect the system, heals the system in an isolated mode in such a way that the availability of the system is not compromised. Using this autonomic system, the management cost and time of DBAs can be minimized. In the end, we test our model and present the findings.

Keywords: autonomic computing, self-healing, threat avoidance, security

Procedia PDF Downloads 490
24707 Information Extraction Based on Search Engine Results

Authors: Mohammed R. Elkobaisi, Abdelsalam Maatuk

Abstract:

Search engines are large-scale information retrieval tools for the Web that are currently freely available to all. This paper explains how to convert the raw result counts returned by search engines into useful information; this represents a new method of data gathering compared with traditional methods. Submitting a query for multiple keywords takes a long time and much effort; hence, we developed a user interface program that searches automatically, taking multiple keywords at the same time, and is left to collect the wanted data automatically. The collected raw data is processed using mathematical and statistical theories to eliminate unwanted data and convert it into usable data.

Keywords: search engines, information extraction, agent system

Procedia PDF Downloads 412
24706 Responsibility to Protect in Practice: Libya and Syria

Authors: Guram Esakia, Giorgi Goguadze

Abstract:

The following paper is written to give an overview of the concept of R2P, a new dimension in the field of International Relations. The paper contains a general description of the concept and its advantages and disadvantages. We also compare R2P with "humanitarian intervention", trying to draw a clear division between these two approaches to conflict resolution. R2P in real action is also discussed: successful in Libya, and as yet failed in Syria. The essay does not claim to be part of a scientific chain and is based on personal judgment as well as on information gathered from various scholars and UN resolutions.

Keywords: the concept of R2P, humanitarian intervention, Libya, Syria

Procedia PDF Downloads 267
24705 Implementation and Performance Analysis of Data Encryption Standard and RSA Algorithm with Image Steganography and Audio Steganography

Authors: S. C. Sharma, Ankit Gambhir, Rajeev Arya

Abstract:

In today's era, data security is an important concern and one of the most demanding issues, because it is essential for people using online banking, e-shopping, reservations, etc. The two major techniques used for secure communication are cryptography and steganography. Cryptographic algorithms scramble data so that an intruder will not be able to retrieve it; steganography, however, hides the data in some cover file so that the very presence of communication is hidden. This paper presents the implementation of the Rivest-Shamir-Adleman (RSA) algorithm with image and audio steganography, and of the Data Encryption Standard (DES) algorithm with image and audio steganography. The coding for both algorithms has been done using MATLAB, and it is observed that these combined techniques performed better than the individual techniques. The risk of unauthorized access is alleviated up to a certain extent by using these techniques, which could be used in banks, intelligence agencies (e.g., RAW), etc., where highly confidential data is transferred. Finally, comparisons of the two techniques are given in tabular form.
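
A minimal sketch of the image-steganography half of such a pipeline: least-significant-bit (LSB) embedding of an already-encrypted byte string into a grayscale image. The paper's implementation is in MATLAB; this NumPy version is illustrative only and omits the DES/RSA encryption stage:

```python
# Embed the payload's bits into the least-significant bits of the cover
# image's pixels, then read them back out.
import numpy as np

def embed_lsb(cover, payload: bytes):
    bits = np.unpackbits(np.frombuffer(payload, dtype=np.uint8))
    flat = cover.flatten()                       # copy; cover stays intact
    assert bits.size <= flat.size, "cover image too small for payload"
    flat[: bits.size] = (flat[: bits.size] & 0xFE) | bits   # overwrite LSBs
    return flat.reshape(cover.shape)

def extract_lsb(stego, n_bytes):
    bits = stego.flatten()[: n_bytes * 8] & 1
    return np.packbits(bits).tobytes()

cover = np.random.default_rng(3).integers(0, 256, (64, 64), dtype=np.uint8)
secret = b"ciphertext from the DES/RSA stage"
stego = embed_lsb(cover, secret)
assert extract_lsb(stego, len(secret)) == secret   # recovered intact
```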

Keywords: audio steganography, data security, DES, image steganography, intruder, RSA, steganography

Procedia PDF Downloads 272
24704 Data Monetisation by E-commerce Companies: A Need for a Regulatory Framework in India

Authors: Anushtha Saxena

Abstract:

This paper examines the process of data monetisation by e-commerce companies operating in India. Data monetisation is the collecting, storing, and analysing of consumers' data in order to put the data that is generated to further use for profits, revenue, etc. Data monetisation enables e-commerce companies to obtain better business opportunities, innovative products and services, and a competitive edge over others, and to generate millions in revenue. This paper analyses the issues and challenges that arise from the process of data monetisation. Some of the issues highlighted in the paper pertain to the right to privacy and the protection of the data of e-commerce consumers. At the same time, data monetisation cannot be prohibited, but it can be regulated and monitored by stringent laws and regulations. The right to privacy is a fundamental right guaranteed to the citizens of India through Article 21 of the Constitution of India; the Supreme Court of India recognized the right to privacy as a fundamental right in the landmark judgment of Justice K.S. Puttaswamy (Retd.) and Another v. Union of India. This paper highlights the legal issue of how e-commerce businesses violate individuals' right to privacy by using the data they collect and store for economic gain and monetisation, and addresses the protection of that data. The researcher has mainly focused on e-commerce companies, such as online shopping websites, to analyse the legal issue of data monetisation. In the age of the Internet of Things and the digital era, people have shifted to online shopping as it is convenient, easy, flexible, comfortable, time-saving, etc. But at the same time, e-commerce companies store the data of their consumers and use it by selling it to third parties or by generating more data from the data stored with them. This violates individuals' right to privacy, because consumers do not know what happens to the data they provide online; many times, data is collected without the consent of individuals. Data can be structured, unstructured, etc., and is used by analytics for monetisation. Indian legislation such as the Information Technology Act, 2000, does not effectively protect e-consumers with respect to their data and how it is used by e-commerce businesses to monetise and generate revenue. The paper also examines the draft Data Protection Bill, 2021, pending in the Parliament of India, and how this Bill can make a huge impact on data monetisation. Finally, the paper studies the European Union's General Data Protection Regulation and how this legislation can be helpful in the Indian scenario concerning e-commerce businesses with respect to data monetisation.

Keywords: data monetization, e-commerce companies, regulatory framework, GDPR

Procedia PDF Downloads 101
24703 Experiments on Weakly-Supervised Learning on Imperfect Data

Authors: Yan Cheng, Yijun Shao, James Rudolph, Charlene R. Weir, Beth Sahlmann, Qing Zeng-Treitler

Abstract:

Supervised predictive models require labeled data for training. Complete and accurate labeled data, i.e., a "gold standard", is not always available, and imperfectly labeled data may need to serve as an alternative. An important question is whether the accuracy of the labeled data creates a performance ceiling for the trained model. In this study, we trained several models to recognize the presence of delirium in clinical documents using data with annotations that are not completely accurate (i.e., weakly-supervised learning). In the external evaluation, the support vector machine model with a linear kernel performed best, achieving an area under the curve of 89.3% and an accuracy of 88%, surpassing the 80% accuracy of the training sample. We then generated a set of simulated data and carried out a series of experiments which demonstrated that models trained on imperfect data can (but do not always) outperform the accuracy of the training data; e.g., the area under the curve for some models is higher than 80% when trained on data with an error rate of 40%. Our experiments also showed that the error resistance of linear modeling is associated with larger sample size, error type, and linearity of the data (all p-values < 0.001). In conclusion, this study sheds light on the usefulness of imperfect data in clinical research via weakly-supervised learning.
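
A minimal sketch of this kind of simulation experiment: a linear-kernel SVM trained on labels corrupted at a 40% error rate and scored against the clean test labels. Data are synthetic, so the numbers will differ from the paper's:

```python
# Flip 40% of the training labels at random, train a linear-kernel SVM,
# and evaluate against uncorrupted test labels; the model can outperform
# the accuracy of its own training labels.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC
from sklearn.metrics import roc_auc_score

X, y = make_classification(n_samples=4000, n_features=20, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.5, random_state=0)

rng = np.random.default_rng(0)
flip = rng.random(y_tr.size) < 0.40          # 40% label error in the training set
y_noisy = np.where(flip, 1 - y_tr, y_tr)

clf = SVC(kernel="linear").fit(X_tr, y_noisy)
auc = roc_auc_score(y_te, clf.decision_function(X_te))
print(f"AUC on clean test labels: {auc:.3f}")   # typically well above 0.60
```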

Keywords: weakly-supervised learning, support vector machine, prediction, delirium, simulation

Procedia PDF Downloads 176
24702 Machine Learning for Exoplanetary Habitability Assessment

Authors: King Kumire, Amos Kubeka

Abstract:

The synergy of machine learning and advances in astronomical technology is giving rise to a new space age, marked by better habitability assessments. To frame this discussion, it should be recorded, for definition purposes, that the symbiotic relationship between astronomy and improved computing has been code-named the Cis-Astro gateway concept. The phrase is unashamedly borrowed from the cis-lunar gateway template and its associated Lagrange points, which act as an orbital bridge from our planet Earth to the Moon. For this study, however, the scientific audience is invited to bridge toward the discovery of new habitable planets. It is imperative to state that cosmic probes of this magnitude can be utilized as the starting nodes of the astrobiological search for galactic life. This research can also assist future space telescope launches by acting as a navigation system through the delimitation of target exoplanets. The findings and the associated platforms can be harnessed as building blocks for modeling climate change on planet Earth. The possibility that the human genus exhausts the resources of planet Earth, or that some catastrophe makes the Earth uninhabitable for humans, explains the need to find an alternative planet to inhabit. Through the interdisciplinary discussions of the International Astronautical Federation, the scientific community so far holds the common position that engineers can reduce space mission costs by constructing a stable cis-lunar orbit infrastructure for refuelling and other associated in-orbit servicing activities. Similarly, the Cis-Astro gateway can be envisaged as a budget optimization technique that models extra-solar bodies and can facilitate the scoping of future mission rendezvous. It should be registered as well that this broad and voluminous catalog of exoplanets will be narrowed along the way using machine learning filters. The gist of this topic revolves around the indirect economic rationale of establishing a habitability scoping platform.

Keywords: machine-learning, habitability, exoplanets, supercomputing

Procedia PDF Downloads 74