Search results for: panel data method
35692 Development of an Optimization Method for Myoelectric Signal Processing by Active Matrix Sensing in Robot Rehabilitation
Authors: Noriyoshi Yamauchi, Etsuo Horikawa, Takunori Tsuji
Abstract:
Training with exoskeleton robots is drawing attention as a rehabilitation method for the body paralysis seen in many cases, and many such systems provide assistance based on the myoelectric signal generated by exercise commands from the brain. Rehabilitation requires frequent training, but the technical expertise needed to identify the myoelectric potential derivation site and to attach the device is one of the factors preventing wider adoption of this technology. In this research, we focus on improving the efficiency of gait training with exoskeleton-type robots, improving myoelectric signal acquisition and analysis using an active matrix sensing method, and improving walking rehabilitation through optimization of robot control.
Keywords: active matrix sensing, brain machine interface (BMI), central pattern generator (CPG), myoelectric signal processing, robot rehabilitation
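Assistive control of this kind typically starts from an amplitude envelope of the raw myoelectric signal. The sketch below is illustrative only (not the authors' pipeline); the synthetic signal and the window length are assumptions. It shows the common rectify-and-smooth step in Python:

```python
import math

def emg_envelope(signal, window=5):
    """Full-wave rectify an EMG signal, then smooth with a moving average."""
    rectified = [abs(s) for s in signal]
    half = window // 2
    env = []
    for i in range(len(rectified)):
        lo, hi = max(0, i - half), min(len(rectified), i + half + 1)
        env.append(sum(rectified[lo:hi]) / (hi - lo))
    return env

# Synthetic "EMG": a 50 Hz oscillation whose amplitude grows with muscle effort.
t = [i / 1000.0 for i in range(1000)]
raw = [(0.1 + ti) * math.sin(2 * math.pi * 50 * ti) for ti in t]
env = emg_envelope(raw, window=51)

# The envelope rises along with the underlying effort.
print(env[-1] > env[0])  # True
```

A controller would then map this envelope to an assistance torque; the mapping itself is application-specific.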
Procedia PDF Downloads 385
35691 Bounded Solution Method for Geometric Programming Problem with Varying Parameters
Authors: Abdullah Ali H. Ahmadini, Firoz Ahmad, Intekhab Alam
Abstract:
The geometric programming problem (GPP) is a well-known non-linear optimization problem with a wide range of applications in engineering. The structure of GPP is quite flexible and fits easily into various decision-making processes. The aim of this paper is to present a bounded solution method for GPP with special reference to variation among the right-hand-side parameters. The paper takes advantage of two-level mathematical programming and determines the value of the objective function within a specified interval defined by lower and upper bounds. The strength of the proposed bounded solution method is that it does not require sensitivity analysis of the obtained optimal solution: the value of the objective function is calculated directly under varying parameters. To show the validity and applicability of the proposed method, a numerical example is presented. A system reliability optimization problem is also illustrated, and the value of its objective function is found to lie between the lower and upper bounds. Finally, conclusions and directions for future research are given.
Keywords: varying parameters, geometric programming problem, bounded solution method, system reliability optimization
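The bounded-solution idea can be illustrated on a toy geometric program whose optimum is known in closed form (an illustrative example, not the paper's formulation). For min over x > 0 of x + b/x the optimal value is 2*sqrt(b), so as the right-hand-side parameter b varies over [b_lo, b_hi] the optimal objective is bracketed by the values at the endpoints, with no sensitivity analysis required:

```python
import math

def toy_gp_optimum(b):
    """Closed-form optimum of the posynomial program: minimize x + b/x over x > 0.
    Setting the derivative 1 - b/x^2 to zero gives x* = sqrt(b), value 2*sqrt(b)."""
    return 2.0 * math.sqrt(b)

def objective_bounds(b_lo, b_hi):
    """Lower/upper bounds on the optimal value as b varies in [b_lo, b_hi].
    The optimal value is increasing in b here, so the endpoint values bracket it."""
    return toy_gp_optimum(b_lo), toy_gp_optimum(b_hi)

lo, hi = objective_bounds(1.0, 4.0)
print(lo, hi)  # 2.0 4.0
# Any b in [1, 4] yields an optimal value inside [2, 4]:
assert lo <= toy_gp_optimum(2.5) <= hi
```

Real GPPs lack closed forms, which is exactly why a bounding method that avoids repeated re-solving is attractive.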
Procedia PDF Downloads 133
35690 Transforming Data into Knowledge: Mathematical and Statistical Innovations in Data Analytics
Authors: Zahid Ullah, Atlas Khan
Abstract:
The rapid growth of data in various domains has created a pressing need for effective methods to transform this data into meaningful knowledge. In this era of big data, mathematical and statistical innovations play a crucial role in unlocking insights and facilitating informed decision-making in data analytics. This abstract aims to explore the transformative potential of these innovations and their impact on converting raw data into actionable knowledge. Drawing upon a comprehensive review of existing literature, this research investigates the cutting-edge mathematical and statistical techniques that enable the conversion of data into knowledge. By evaluating their underlying principles, strengths, and limitations, we aim to identify the most promising innovations in data analytics. To demonstrate the practical applications of these innovations, real-world datasets will be utilized through case studies or simulations. This empirical approach will showcase how mathematical and statistical innovations can extract patterns, trends, and insights from complex data, enabling evidence-based decision-making across diverse domains. Furthermore, a comparative analysis will be conducted to assess the performance, scalability, interpretability, and adaptability of different innovations. By benchmarking against established techniques, we aim to validate the effectiveness and superiority of the proposed mathematical and statistical innovations in data analytics. Ethical considerations surrounding data analytics, such as privacy, security, bias, and fairness, will be addressed throughout the research. Guidelines and best practices will be developed to ensure the responsible and ethical use of mathematical and statistical innovations in data analytics. 
The expected contributions of this research include advancements in mathematical and statistical sciences, improved data analysis techniques, enhanced decision-making processes, and practical implications for industries and policymakers. The outcomes will guide the adoption and implementation of mathematical and statistical innovations, empowering stakeholders to transform data into actionable knowledge and drive meaningful outcomes.
Keywords: data analytics, mathematical innovations, knowledge extraction, decision-making
Procedia PDF Downloads 75
35689 A Design Framework for an Open Market Platform of Enriched Card-Based Transactional Data for Big Data Analytics and Open Banking
Authors: Trevor Toy, Josef Langerman
Abstract:
Around a quarter of the world’s data is generated by the financial industry, with global non-cash transactions estimated at 708.5 billion. And with Open Banking still a rapidly developing concept within the financial industry, there is an opportunity to create a secure mechanism for connecting its stakeholders to openly, legitimately and consensually share the data required to enable it. Integration and sharing of anonymised transactional data still operate in silos, centralised among the large corporate entities in the ecosystem that have the resources to do so. Smaller fintechs generating data and businesses looking to consume data are largely excluded from the process. There is therefore a growing demand for accessible transactional data, both for analytical purposes and to support the rapid global adoption of Open Banking. This research provides a solution framework that aims to offer a secure, decentralised marketplace for 1.) data providers to list their transactional data, 2.) data consumers to find and access that data, and 3.) data subjects (the individuals making the transactions that generate the data) to manage and sell the data that relates to themselves. The platform also provides an integrated system for downstream transactional-related data from merchants, enriching the data product available to build a comprehensive view of a data subject’s spending habits. A robust and sustainable data market can be developed by providing a more accessible mechanism for data producers to monetise their data investments and by encouraging data subjects to share their data through the same financial incentives. At the centre of the platform is the market mechanism that connects the data providers and their data subjects to the data consumers.
This core component of the platform is developed as a decentralised blockchain contract with a market layer that manages transaction, user, pricing, payment, tagging, contract, control, and lineage features pertaining to user interactions on the platform. One of the platform’s key features is enabling the participation and management of personal data by the individuals from whom the data is generated. The framework was demonstrated with a proof-of-concept on the Ethereum blockchain, in which an individual can securely manage access to their own personal data and to their identifiable relationship to the card-based transaction data provided by financial institutions. This gives data consumers access to a complete view of transactional spending behaviour in correlation with key demographic information. This platform solution can ultimately support the growth, prosperity, and development of economies, businesses, communities, and individuals by providing accessible and relevant transactional data for big data analytics and open banking.
Keywords: big data markets, open banking, blockchain, personal data management
Procedia PDF Downloads 73
35688 Corporate Governance Development in Mongolia: The Role of Professional Accountants
Authors: Ernest Nweke
Abstract:
The work of professional accountants and corporate governance are closely intertwined and cannot be divorced from each other. Organizations, profit and non-profit alike, cannot implement sound corporate governance practices without inputs from professional accountants. In today’s dynamic corporate world, good corporate governance practice is a sine qua non. Moreover, following the corporate failures of past decades, such as Enron and WorldCom, governments around the world, including Mongolia, are becoming more proactive in ensuring sound corporate governance mechanisms. In the past fifteen years, the Mongolian government has taken several measures to establish and strengthen internal corporate governance structures in firms. This paper highlights the role that professional accountants and auditors play in ensuring that good corporate governance mechanisms are entrenched in listed companies in Mongolia. Both primary and secondary data are utilized in this research. In the collection of primary data, the Delphi method was used, securing responses only from knowledgeable senior employees, top managers, and some CEOs. Using this method, a total of 107 top-level company employees and executives, randomly selected from 22 companies (a maximum of five and a minimum of four from each), were surveyed. These companies cut across several sectors. It was concluded that professional accountants play key roles in setting and maintaining firm governance. They do this by ensuring full compliance with all the requirements of good and sound corporate governance; establishing reporting, monitoring, and evaluation standards; and assisting in setting up proper controls, efficient and effective audit systems, sound fraud risk management, and an overall vision for the enterprise. Companies with effective corporate governance mechanisms are usually strong and fraud-resilient.
It was also discovered that companies audited by Big 4 firms tend to have better governance structures in Mongolia.
Keywords: accountants, corporate disclosure, corporate failure, corporate governance
Procedia PDF Downloads 279
35687 Optimizing of Machining Parameters of Plastic Material Using Taguchi Method
Authors: Jumazulhisham Abdul Shukor, Mohd. Sazali Said, Roshanizah Harun, Shuib Husin, Ahmad Razlee Ab Kadir
Abstract:
This paper applies the Taguchi optimization method to determine the best machining parameters for a pocket milling process on polypropylene (PP) using a CNC milling machine, where surface roughness is the response of interest and carbide insert cutting tools are used. Three machining parameters (speed, feed rate, and depth of cut) are investigated at three levels each (low, medium, and high) using a Taguchi orthogonal array. The machining parameter settings were determined using the Taguchi method, and the signal-to-noise (S/N) ratio is assessed to define the optimal levels and to predict the effect on surface roughness of the parameters assigned in the L9 array. The final experimental outcomes are presented to verify that the optimization parameters recommended by the manufacturer are accurate.
Keywords: inserts, milling process, signal-to-noise (S/N) ratio, surface roughness, Taguchi Optimization Method
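The S/N analysis described above can be sketched as follows; the L9 factor assignment and the roughness readings below are invented for illustration, not the paper's measurements. For a smaller-the-better response such as surface roughness, S/N = -10*log10(mean(y^2)), and the optimal level of each factor is the one with the highest mean S/N:

```python
import math

def sn_smaller_is_better(values):
    """Taguchi smaller-the-better signal-to-noise ratio: -10*log10(mean(y^2)).
    Higher S/N means lower (better) surface roughness."""
    return -10.0 * math.log10(sum(v * v for v in values) / len(values))

# Hypothetical L9 orthogonal array: (speed, feed, depth) level indices 0..2
# and the measured surface roughness Ra (um) for each of the 9 runs.
runs = [
    ((0, 0, 0), [1.8]), ((0, 1, 1), [2.4]), ((0, 2, 2), [3.1]),
    ((1, 0, 1), [1.5]), ((1, 1, 2), [2.2]), ((1, 2, 0), [2.6]),
    ((2, 0, 2), [1.2]), ((2, 1, 0), [1.7]), ((2, 2, 1), [2.3]),
]

# Average S/N per factor level, then pick the level with the highest mean S/N.
best_levels = []
for factor in range(3):
    level_sn = {0: [], 1: [], 2: []}
    for levels, ra in runs:
        level_sn[levels[factor]].append(sn_smaller_is_better(ra))
    best_levels.append(max(level_sn, key=lambda k: sum(level_sn[k]) / len(level_sn[k])))

print(best_levels)  # [2, 0, 0] for this mock data: high speed, low feed, low depth
```

With repeated runs per cell, each Ra list would hold several readings and the same S/N formula applies unchanged.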
Procedia PDF Downloads 637
35686 Experimental Evaluation of Succinct Ternary Tree
Authors: Dmitriy Kuptsov
Abstract:
Tree data structures, such as binary or, in general, k-ary trees, are essential in computer science. The applications of these data structures range from data search and retrieval to sorting and ranking algorithms. Naive implementations of these data structures can consume prohibitively large volumes of random access memory, limiting their applicability in certain solutions. In these cases, a more advanced representation of the data structure is essential. In this paper we present the design of a compact version of the ternary tree data structure and demonstrate results from an experimental evaluation using the static dictionary problem. We compare these results with those for binary and regular ternary trees. The evaluation shows that our design, in the best case, consumes up to 12 times less memory (for the dictionary used in our experimental evaluation) than a regular ternary tree, and in certain configurations shows performance comparable to regular ternary trees. We have evaluated the performance of the algorithms on both 32- and 64-bit operating systems.
Keywords: algorithms, data structures, succinct ternary tree, performance evaluation
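For reference, a plain pointer-based (non-succinct) ternary search tree for the static dictionary problem can be sketched as below; the paper's compact design additionally replaces this pointer structure with a space-efficient encoding, which the sketch does not attempt:

```python
class TSTNode:
    __slots__ = ("ch", "lo", "eq", "hi", "is_word")
    def __init__(self, ch):
        self.ch, self.lo, self.eq, self.hi, self.is_word = ch, None, None, None, False

def insert(node, word, i=0):
    """Insert word[i:] into the ternary search tree rooted at node."""
    ch = word[i]
    if node is None:
        node = TSTNode(ch)
    if ch < node.ch:
        node.lo = insert(node.lo, word, i)
    elif ch > node.ch:
        node.hi = insert(node.hi, word, i)
    elif i + 1 < len(word):
        node.eq = insert(node.eq, word, i + 1)
    else:
        node.is_word = True
    return node

def contains(node, word, i=0):
    """Return True iff word was inserted into the tree."""
    if node is None:
        return False
    ch = word[i]
    if ch < node.ch:
        return contains(node.lo, word, i)
    if ch > node.ch:
        return contains(node.hi, word, i)
    if i + 1 < len(word):
        return contains(node.eq, word, i + 1)
    return node.is_word

root = None
for w in ["cat", "cap", "car", "dog"]:
    root = insert(root, w)
print(contains(root, "cap"), contains(root, "ca"))  # True False
```

Each node here costs three pointers plus a character and a flag, which is the per-node overhead a succinct representation targets.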
Procedia PDF Downloads 160
35685 4-Chlorophenol Degradation in Water Using TiO₂-X%ZnS Synthesized by One-Step Sol-Gel Method
Authors: M. E. Velásquez Torres, F. Tzompantzi, J. C. Castillo-Rodríguez, A. G. Romero Villegas, S. Mendéz-Salazar, C. E. Santolalla-Vargas, J. Cardoso-Martínez
Abstract:
Photocatalytic degradation, as an advanced oxidation technology, is a promising method for degrading organic pollutants. Chlorophenols in particular should be removed from water because they are highly toxic. TiO₂ - X%ZnS photocatalysts, where X represents the molar percentage of ZnS (3%, 5%, 10%, and 15%), were synthesized by a one-step sol-gel method for the degradation of 4-chlorophenol: they were refluxed for 36 hours, dried at 80°C, and calcined at 400°C. Band gaps were calculated using a Cary 100 UV-Visible spectrometer with an integrating sphere accessory; the values were 2.7 eV for TiO₂, 2.8 eV for TiO₂ - 3%ZnS and TiO₂ - 5%ZnS, 2.9 eV for TiO₂ - 10%ZnS, and 2.6 eV for TiO₂ - 15%ZnS. In a batch-type reactor, under the irradiation of a mercury lamp (λ = 254 nm, Pen-Ray), degradations of 55 ppm 4-chlorophenol at 360 minutes were obtained with the synthesized photocatalysts: 60% (3% ZnS), 66% (5% ZnS), 74% (10% ZnS), and 58% (15% ZnS). The best photocatalyst was therefore TiO₂ - 10%ZnS, with a degradation percentage of 74%.
Keywords: 4-chlorophenol, photocatalysis, water pollutant, sol-gel
Procedia PDF Downloads 131
35684 Chain Networks on Internationalization of SMEs: Co-Opetition Strategies in Agrifood Sector
Authors: Emilio Galdeano-Gómez, Juan C. Pérez-Mesa, Laura Piedra-Muñoz, María C. García-Barranco, Jesús Hernández-Rubio
Abstract:
The situation in which firms engage in simultaneous cooperation and competition with each other is a phenomenon known as co-opetition. This scenario has received increasing attention in business economics and management analyses. In the domain of supply chain networks, and for small and medium-sized enterprises (SMEs), these strategies are of particular relevance given the complex environment of globalization and competition in open markets. These firms face greater challenges regarding technology and access to specific resources due to their limited capabilities and limited market presence. Consequently, alliances and collaborations with both buyers and suppliers prove to be key elements in overcoming these constraints. However, rivalry and competition are also regarded as major factors in successful internationalization processes, as they drive firms to attain a greater degree of specialization and to improve efficiency, for example, enabling them to allocate scarce resources optimally and providing incentives for innovation and entrepreneurship. The present work aims to contribute to the literature on SMEs’ internationalization strategies. The sample consists of panel data on marketing firms from the Andalusian food sector, and a multivariate regression analysis is developed, measuring variables of co-opetition and international activity. The hierarchical regression equations method was followed, resulting in three estimated models: the first excludes the variables indicative of channel type, while the latter two include the international retail chain and wholesaler variables. The findings show that the combination of several factors leads to a complex scenario of inter-organizational relationships of cooperation and competition. In supply chain management analyses, these relationships tend to be classified as either buyer-supplier (vertical) or supplier-supplier (horizontal) relationships.
Multiple buyers and suppliers tend to participate in supply chain networks in which the form of governance (hierarchical and non-hierarchical) influences cooperation and competition strategies. For instance, due to their market power and/or their closeness to the end consumer, some buyers (e.g. large retailers in food markets) can exert an influence on the selection and interaction of several of their intermediate suppliers, thus endowing certain networks in the supply chain with greater stability. This hierarchical influence may in turn allow these suppliers to develop their capabilities (e.g. specialization) to a greater extent. On the other hand, for those suppliers that are outside these networks, this environment of hierarchy, characterized by a “hub firm” or “channel master”, may provide an incentive for developing their co-opetition relationships. The results show that the analyzed firms have experienced considerable growth in sales to new foreign markets, mainly in Europe, dealing with large retail chains and wholesalers as their main buyers. This supply industry is predominantly made up of numerous SMEs, which has implied a certain disadvantage when dealing with buyers, as negotiations have traditionally been held on an individual basis and in the face of high competition among suppliers. Over recent years, however, cooperation among these marketing firms has become more common, for example regarding R&D, promotion, and the scheduling of production and sales.
Keywords: co-opetition networks, international supply chain, marketing agrifood firms, SMEs strategies
Procedia PDF Downloads 79
35683 Nanoparticle Exposure Levels in Indoor and Outdoor Demolition Sites
Authors: Aniruddha Mitra, Abbas Rashidi, Shane Lewis, Jefferson Doehling, Alexis Pawlak, Jacob Schwartz, Imaobong Ekpo, Atin Adhikari
Abstract:
Working or living close to demolition sites can increase the risk of dust-related health problems. Demolition of concrete buildings may produce crystalline silica dust, which is associated with a broad range of respiratory diseases including silicosis and lung cancers. Previous studies demonstrated significant associations between demolition dust exposure and an increased incidence of mesothelioma or asbestos cancer. Dust is a generic term for minute solid particles, typically <500 µm in diameter. Dust particles in demolition sites vary over a wide range of sizes: larger particles tend to settle out of the air, while smaller and lighter solid particles remain dispersed in the air for long periods and pose sustained exposure risks. Submicron ultrafine particles and nanoparticles are respirable deeper into the alveoli, beyond the body’s natural respiratory cleaning mechanisms such as cilia and mucous membranes, and are likely to be retained in the lower airways. To our knowledge, how various demolition tasks release nanoparticles is largely unknown, and previous studies mostly focused on coarse dust, PM2.5, and PM10. The general belief is that the dust generated during demolition tasks consists mostly of large particles formed through crushing, grinding, or sawing of concrete and wooden structures. Therefore, little consideration has been given to the generated submicron ultrafine and nanoparticles and their exposure levels. These data are, however, critically important because recent laboratory studies have demonstrated the cytotoxicity of nanoparticles on lung epithelial cells. The above-described knowledge gaps were addressed in this study with a newly developed nanoparticle monitor, which was used for nanoparticle monitoring at two adjacent indoor and outdoor building demolition sites in southern Georgia.
Nanoparticle levels were measured (n = 10) by a TSI NanoScan SMPS Model 3910 at four different distances (5, 10, 15, and 30 m) from the work location as well as in control sites. Temperature and relative humidity levels were recorded. Indoor demolition work included acetylene torch cutting, masonry drilling, ceiling panel removal, and other miscellaneous tasks, whereas outdoor demolition work included acetylene torch cutting and skid-steer loader use to remove an HVAC system. Concentration ranges of nanoparticles of 13 particle sizes at the indoor demolition site were: 11.5 nm: 63 – 1054/cm³; 15.4 nm: 170 – 1690/cm³; 20.5 nm: 321 – 730/cm³; 27.4 nm: 740 – 3255/cm³; 36.5 nm: 1,220 – 17,828/cm³; 48.7 nm: 1,993 – 40,465/cm³; 64.9 nm: 2,848 – 58,910/cm³; 86.6 nm: 3,722 – 62,040/cm³; 115.5 nm: 3,732 – 46,786/cm³; 154 nm: 3,022 – 21,506/cm³; 205.4 nm: 12 – 15,482/cm³; 273.8 nm: …
35682 Assessing the Incapacity of Indonesian Aviators Medical Conditions in 2016 – 2017
Authors: Ferdi Afian, Inne Yuliawati
Abstract:
Background: The shift in causes of death from infectious to non-communicable diseases is also occurring in the aviation community in Indonesia. Non-communicable diseases are influenced by several internal risk factors, such as age, lifestyle changes, and the presence of other diseases. These risk factors increase the incidence of heart disease, resulting in the medical incapacity of Indonesian aviators, which disrupts flight safety. Method: The study was conducted by collecting secondary data from medical records at the Indonesian Aviation Health Center in 2016-2017. The subjects in this study were all cases of medical incapacity among Indonesian aviators. Results: There were 15 cases of aviators in Indonesia who experienced medical incapacity related to heart and lung diseases in 2016-2017. Based on the flight medical records at the Aviation Health Center, several factors were found to be related to aviator incapacity and the resulting inability to carry out flight duties. Conclusion: Medical incapacity among Indonesian aviators is most strongly associated with a high Body Mass Index (86%) and less strongly with high blood uric acid (26%) and hyperglycemia (26%).
Keywords: incapacity, aviators, flight, Indonesia
Procedia PDF Downloads 134
35681 Furniture Embodied Carbon Calculator for Interior Design Projects
Authors: Javkhlan Nyamjav, Simona Fischer, Lauren Garner, Veronica McCracken
Abstract:
Current whole-building life cycle assessments (LCA) primarily focus on structural and major architectural elements when measuring building embodied carbon. Most interior finishes and fixtures are available in digital tools (such as Tally); however, furniture is still left unaccounted for. Because furniture is refreshed repeatedly and is complex to assess, its embodied carbon can accumulate over time, becoming comparable to structure and envelope numbers. This paper presents a method to calculate the Global Warming Potential (GWP) of furniture elements in commercial buildings. The calculator uses the quantity takeoff method with GWP averages gathered from environmental product declarations (EPD). The data were collected from EPD databases and furniture manufacturers from North America to Europe. A total of 48 GWP numbers were collected, with 16 coming from alternative EPD. The finalized calculator shows the average GWP of typical commercial furniture and supports decision-making to reduce embodied carbon. The calculator was tested on MSR Design projects and showed that furniture can account for more than half of interior embodied carbon. The calculator highlights the importance of adding furniture to the overall conversation. However, the data collection process showed that a) acquiring furniture EPD is not as straightforward as for other building materials; b) there are very few furniture EPD, which can be explained from many perspectives, including the price of producing an EPD; and c) the EPD themselves vary in terms of units, LCA scopes, and timeframes, which makes it hard to compare products. Even though there are current limitations, the emerging focus on interior embodied carbon will create more demand for furniture EPD and will allow manufacturers to represent their efforts to reduce embodied carbon.
In addition, the study concludes with recommendations on how designers can reduce furniture embodied carbon through reuse and closed-loop systems.
Keywords: furniture, embodied carbon, calculator, tenant improvement, interior design
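The quantity-takeoff calculation described above reduces to multiplying furniture counts by average per-unit GWP values. The item names and GWP figures below are placeholders, not values from the study's EPD dataset:

```python
# Quantity-takeoff furniture embodied-carbon sketch. The per-unit GWP
# averages below are hypothetical, NOT values from the study's EPD data.
GWP_AVERAGES_KGCO2E = {  # kg CO2e per unit
    "task_chair": 72.0,
    "workstation_desk": 130.0,
    "conference_table": 310.0,
}

def furniture_gwp(takeoff):
    """Sum quantity x average GWP over the furniture takeoff (kg CO2e)."""
    return sum(qty * GWP_AVERAGES_KGCO2E[item] for item, qty in takeoff.items())

project = {"task_chair": 120, "workstation_desk": 100, "conference_table": 4}
total = furniture_gwp(project)
print(f"{total:.0f} kg CO2e")  # 22880 kg CO2e
```

A reuse scenario would simply assign reused items a much lower (or zero) GWP average, making the benefit of closed-loop sourcing directly visible in the total.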
Procedia PDF Downloads 217
35680 Deradicalization for Former Terrorists through Entrepreneurship Program
Authors: Jamal Wiwoho, Pujiyono, Triyanto
Abstract:
Terrorism is a real enemy for all countries, including Indonesia. Bomb attacks in some parts of Indonesia are proof that the country has serious problems with terrorism. Perpetrators of terror are arrested and imprisoned, and some of them have been executed. However, this approach has not succeeded in stopping terrorist attacks, and former terrorists continue to carry out bombings. Therefore, this paper proposes a program for the deradicalization of former terrorists through entrepreneurship. This is necessary because it is impossible to change their radical ideology directly. The program is also motivated by the understanding that terrorists generally come from poor families. It aims to occupy their time with business activities so that there is no time to plan and carry out bomb attacks. This research is an empirical legal study. Data were collected by literature study, observation, and in-depth interviews, and analyzed with the Miles and Huberman interactive model. The results show that the entrepreneurship program is effective in preventing terrorist attacks: former terrorists are busy with their businesses and therefore have no time to carry out bombings.
Keywords: deradicalization, terrorism, terrorists, entrepreneurship
Procedia PDF Downloads 271
35679 A Phenomenological Method Based on Professional Descriptions of Community-of-Practice Members to Scientifically Determine the Level of Child Psycho-Social-Emotional Development
Authors: Gianni Jacucci
Abstract:
Alfred Schutz (1932), at the very turning of phenomenology's attention towards the social sciences, stated that successful communication of meanings requires the sharing of “sedimentations” of previous meanings. Börje Langefors (1966), at the very beginning of the social studies of information systems, stated that a common professional basis is required for a correct sharing of meanings, e.g., “standardised accounting data among accountants”. Harold Garfinkel (1967), at the very beginning of ethnomethodology, stated that the accounting of social events must be carried out in the same language used by the actors of those events in managing their practice. Building on these foundations, we advocate professional descriptions by community-of-practice members to scientifically determine the level of child psycho-social-emotional development. Our approach consists of an application to the human sciences of Husserl’s phenomenological philosophy, using a method reminiscent of Giorgi’s DPM in psychology. Husserl’s requirement of "Epoché", which involves eliminating prejudices from the minds of observers, is met through "concept cleaning", achieved by consistently sharing disciplinary concepts within the community of practice. Meanwhile, the absence of subjective bias is ensured by the meticulous attention to detail in the members' professional expertise. Our approach shows promise in accurately assessing many other properties through detailed professional descriptions by community-of-practice members.
Keywords: scientific rigour, descriptive phenomenological method, sedimentation of meanings, community of practice
Procedia PDF Downloads 57
35678 Predicting Data Center Resource Usage Using Quantile Regression to Conserve Energy While Fulfilling the Service Level Agreement
Authors: Ahmed I. Alutabi, Naghmeh Dezhabad, Sudhakar Ganti
Abstract:
Data centers have been growing in size and demand continuously over the last two decades. Planning for the deployment of resources has been shallow and has always resorted to over-provisioning: data center operators try to maximize the availability of their services by allocating multiples of the needed resources. One resource that has been wasted with little thought is energy. In recent years, programmable resource allocation has paved the way for more efficient and robust data centers. In this work, we examine the predictability of resource usage in a data center environment. We use a number of models that cover a wide spectrum of machine learning categories, and we establish a framework to guarantee the client service level agreement (SLA). Our results show that using prediction can cut energy loss by up to 55%.
Keywords: machine learning, artificial intelligence, prediction, data center, resource allocation, green computing
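One way a quantile-based prediction can balance energy against the SLA can be sketched as follows (a simplified illustration, not the paper's models): quantile regression minimizes the pinball loss, and provisioning at, say, the 0.8 quantile of observed usage covers most demand without allocating for the rare peak:

```python
def pinball_loss(y_true, y_pred, q):
    """Pinball (quantile) loss, the objective minimized by quantile regression
    at quantile q: asymmetric penalty for under- vs over-prediction."""
    total = 0.0
    for yt, yp in zip(y_true, y_pred):
        diff = yt - yp
        total += q * diff if diff >= 0 else (q - 1) * diff
    return total / len(y_true)

def empirical_quantile(samples, q):
    """Provision at the q-th empirical quantile of observed usage: enough
    headroom to meet the SLA roughly q of the time, without sizing for the peak."""
    s = sorted(samples)
    idx = min(len(s) - 1, int(q * len(s)))
    return s[idx]

usage = [22, 25, 31, 28, 90, 27, 26, 30, 29, 24]  # hypothetical CPU-% samples
cap = empirical_quantile(usage, 0.8)
print(cap)  # 31: covers 80% of observations, vs always provisioning for the 90 peak
```

Note that a constant prediction of 31 has a lower 0.8-pinball loss on this trace than always predicting the peak of 90, which is exactly the trade-off between energy savings and SLA risk.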
Procedia PDF Downloads 108
35677 Interface Problems in Construction Projects
Authors: Puti F. Marzuki, Adrianto Oktavianus, Almerinda Regina
Abstract:
Interface problems among interacting parties in Indonesian construction projects have most often led to low productivity and completion delays. In the midst of the country's need to accelerate the construction of public infrastructure that provides connectivity among regions and supports economic growth and better quality of life, project delays have to be seriously addressed. This paper identifies potential cause factors of interface problems experienced by construction projects in Indonesia. Data were collected through a survey involving the main actors of six important public infrastructure construction projects, including railway, LRT, sports stadium, apartment, and education building construction projects. Five of these projects adopt the design-build project delivery method and one applies the design-bid-build scheme. Potential causes of interface problems are categorized into contract, management, technical experience, coordination, financial, and environmental factors. Research results reveal that, especially in the railway and LRT projects, potential causes of interface problems are mainly technical and managerial in nature, relating to complex construction execution in highly congested areas. Meanwhile, coordination cause factors are mainly found in the education building construction project financed by a loan from a foreign donor. All six projects have to resolve interface problems caused by incomplete or low-quality contract documents. This research also shows that the design-bid-build delivery method, which involves more parties in construction projects, tends to induce more interface problem cause factors than the design-build scheme.
Keywords: cause factors, construction delays, project delivery method, contract documents
Procedia PDF Downloads 255
35676 Fairness in Recommendations Ranking: From Pairwise Approach to Listwise Approach
Authors: Patik Joslin Kenfack, Polyakov Vladimir Mikhailovich
Abstract:
Machine learning (ML) systems are trained using human-generated data that can be biased, implicitly containing racist, sexist, or otherwise discriminating patterns. ML models learn those biases or even amplify them. Recent research has begun to consider issues of fairness, and the concept has been extended to recommendation: a recommender system is considered fair if it does not under-rank items of a protected group (gender, race, demographics, ...). Several metrics for evaluating fairness concerns in recommendation systems have been proposed which take pairs of items as ‘instances’ in fairness evaluation. These do not take into account the fact that fairness should be evaluated across a list of items. The paper explores a probabilistic approach that generalizes the pairwise metric by using a list of k items (listwise) as the ‘instance’ in fairness evaluation, parametrized by k. We also explore a new regularization method based on this metric to improve fairness of the ranking during model training.
Keywords: fairness, recommender system, ranking, listwise approach
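A minimal illustration of why listwise evaluation matters (a simplified exposure-based metric, not the paper's probabilistic formulation): position bias means items ranked lower receive far less exposure, which a raw pairwise count alone does not capture:

```python
import math

def topk_exposure_ratio(ranking, protected, k):
    """Listwise fairness check over the top-k of a ranking.
    Each rank r gets exposure 1/log2(r+1); returns the protected-group share
    of total top-k exposure (0.5 = parity for equal-sized groups)."""
    exposure = {True: 0.0, False: 0.0}
    for r, item in enumerate(ranking[:k], start=1):
        exposure[item in protected] += 1.0 / math.log2(r + 1)
    total = exposure[True] + exposure[False]
    return exposure[True] / total if total else 0.0

ranking = ["a", "b", "c", "d", "e", "f"]
protected = {"d", "e", "f"}   # always ranked below the unprotected items
share = topk_exposure_ratio(ranking, protected, k=4)
print(round(share, 2))  # 0.17, far below the 0.5 parity point
```

A regularizer built on such a listwise quantity can penalize the gap to parity during training, which a pairwise term computed on isolated item pairs would underweight.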
Procedia PDF Downloads 148
35675 Prosperous Digital Image Watermarking Approach by Using DCT-DWT
Authors: Prabhakar C. Dhavale, Meenakshi M. Pawar
Abstract:
Every day, tons of data are embedded in digital media or distributed over the internet. The data are distributed in such a way that they can easily be replicated without error, putting the rights of their owners at risk. Even when encrypted for distribution, data can easily be decrypted and copied. One way to discourage illegal duplication is to insert information, known as a watermark, into potentially valuable data in such a way that it is impossible to separate the watermark from the data. These challenges have motivated researchers to carry out intense research in the field of watermarking. A watermark is a form, image, or text impressed onto paper that provides evidence of its authenticity; digital watermarking is an extension of the same concept. There are two types of watermarks: visible and invisible. In this project, we have concentrated on embedding watermarks in images. The main consideration for any watermarking scheme is its robustness to various attacks.
Keywords: watermarking, digital, DCT-DWT, security
Procedia PDF Downloads 422
35674 The Influence of E-Learning on Teachers and Students Educational Interactions in Tehran City
Authors: Hadi Manjiri, Mahdyeh Bakhshi, Ali Jafari, Maryam Salati
Abstract:
This study investigates the influence of e-learning on teacher-student instructional interactions through the mediating role of computer literacy among elementary school teachers in Tehran. The research method is a survey conducted among elementary school teachers in Tehran. A sample size of 338 was determined based on Morgan's table, and a stratified random sampling method was used to select 228 women and 110 men for the study. Bagherpour et al.'s computer literacy questionnaire, Elahi et al.'s e-learning questionnaire, and Lourdusamy and Khine's questionnaire on teacher-student instructional interactions were used to measure the variables. The data were analyzed using SPSS and LISREL software. It was found that e-learning affects teacher-student instructional interactions, mediated by teachers' computer literacy. In addition, the results suggest that e-learning predicts a 0.66 change in teacher-student instructional interactions, while computer literacy predicts a 0.56 change in instructional interactions between teachers and students.
Keywords: e-learning, instructional interactions, computer literacy, students
Procedia PDF Downloads 119
35673 Machine Learning Data Architecture
Authors: Neerav Kumar, Naumaan Nayyar, Sharath Kashyap
Abstract:
Most companies see an increase in the adoption of machine learning (ML) applications across internal and external-facing use cases. ML applications vend output in either batch or real-time patterns. A complete batch ML pipeline architecture comprises data sourcing, feature engineering, model training, model deployment, and model output vending into a data store for downstream applications. Due to unclear role expectations, we have observed that scientists specializing in building and optimizing models invest significant effort into building the other components of the architecture, which we do not believe is the best use of scientists' bandwidth. We propose a system architecture created using AWS services that brings industry best practices to managing the workflow and simplifies the process of model deployment and end-to-end data integration for an ML application. This narrows the scope of scientists' work to model building and refinement, while specialized data engineers take over deployment, pipeline orchestration, data quality, the data permission system, etc. The pipeline infrastructure is built and deployed as code (using Terraform, CDK, CloudFormation, etc.), which makes it easy to replicate and/or extend the architecture to other models used in an organization.
Keywords: data pipeline, machine learning, AWS, architecture, batch machine learning
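The stage separation described above can be sketched as plain functions wired together by an orchestrator; in a real deployment each stub below would be an AWS-backed component (a data lake reader, a training job, a vending write to a store), and all names and data here are hypothetical:

```python
def source_data():
    """Stub for the data-sourcing stage (would read from a lake or warehouse)."""
    return [{"x": 1.0, "y": 0}, {"x": 2.0, "y": 1}]

def engineer_features(rows):
    """Stub feature engineering: derive a squared term from the raw input."""
    return [{**row, "x_sq": row["x"] ** 2} for row in rows]

def train_model(rows):
    """Stub training: a threshold 'model' picked from the data (illustrative)."""
    threshold = sum(r["x"] for r in rows) / len(rows)
    return {"threshold": threshold}

def vend_output(model, rows):
    """Stub vending: write scored rows to a 'store' (here, just a list)."""
    return [{"pred": int(r["x"] > model["threshold"]), **r} for r in rows]

# Orchestrated end-to-end, as a workflow DAG or pipeline framework would run it:
data = engineer_features(source_data())
model = train_model(data)
store = vend_output(model, data)
```

Keeping each stage behind a narrow interface like this is what lets scientists own only `train_model` while data engineers own the rest.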
Procedia PDF Downloads 64
35672 Developing University EFL Students’ Communicative Competence by Using Communicative Approach
Authors: Mutwakel Abdalla Ali Garalzain
Abstract:
The aim of this study is to develop university EFL students' communicative competence. A descriptive, analytical method was used. To collect the data, the researcher designed two questionnaires, one for university EFL students and the other for English language teachers. The respondents of the study were eighty-eight: 76 university EFL students and 12 English language teachers. The data obtained were analyzed using the Statistical Package for the Social Sciences (SPSS). The findings reveal that most of the university EFL students are unable to express their ideas properly, although they have an abundance of vocabulary. The findings also show that most of the university EFL students have positive attitudes towards communicative competence. The results further identify the best strategies that can be used to enhance university EFL students' communicative competence in English language teaching. The study recommends that English language textbooks be compatible with the requirements of the student-centered approach and that English language teachers adopt the communicative approach's strategies in the EFL classroom.
Keywords: applied linguistics, communicative competence, English language teaching, university EFL students
Procedia PDF Downloads 198
35671 The Role of Organizational Culture, Organizational Commitment, and Styles of Transformational Leadership towards Employee Performance
Authors: Ahmad Badawi Saluy, Novawiguna Kemalasari
Abstract:
This study aims to examine and analyze the influence of organizational culture, organizational commitment, and transformational leadership style on employee performance. The study used a descriptive survey method with a quantitative approach, with questionnaires as the basic data collection tool. The sampling technique used is proportionate stratified random sampling; there were 70 respondents in total. The analytical method used is multiple linear regression. The coefficient of determination of 52.3% indicates that organizational culture, organizational commitment, and transformational leadership style simultaneously have a significant influence on employee performance, while the remaining 47.7% is explained by factors outside the research variables. Partially, organizational culture has a strong and positive influence on employee performance, organizational commitment has a moderate and positive effect, and transformational leadership style has a strong and positive influence and is also the variable with the most impact on employee performance.
Keywords: organizational culture, organizational commitment, transformational leadership style, employee performance
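The coefficient of determination cited above can be computed directly from a fitted regression's predictions; a minimal sketch with made-up numbers (not the study's data):

```python
def r_squared(y_true, y_pred):
    """Coefficient of determination: the share of variance in the outcome
    explained by the model (1 - residual sum of squares / total sum of squares)."""
    mean_y = sum(y_true) / len(y_true)
    ss_tot = sum((y - mean_y) ** 2 for y in y_true)
    ss_res = sum((y - p) ** 2 for y, p in zip(y_true, y_pred))
    return 1 - ss_res / ss_tot

y_true = [1, 2, 3, 4]           # hypothetical observed performance scores
y_pred = [1.1, 1.9, 3.2, 3.8]   # hypothetical regression predictions
r2 = r_squared(y_true, y_pred)  # close to 1: predictors explain most variance
```

A value of 0.523, as reported above, would mean the three predictors jointly explain 52.3% of the variance in employee performance.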
Procedia PDF Downloads 227
35670 A Psycholinguistic Analysis of John Nash’s Hallucinations as Represented in the Film “A Beautiful Mind”
Authors: Rizkia Shafarini
Abstract:
This study explores hallucination in the film A Beautiful Mind. A Beautiful Mind depicts the tale of John Nash, a university student who dislikes studying in class and prefers to study alone. Throughout his life, John Nash experiences hallucinations, a symptom of schizophrenia, as depicted in the film. The goal of this study was to establish what hallucinations are, what causes them, and how John Nash manages them. In general, this study examines the link between language and mind, that is, the linguistic relationship portrayed in John Nash's speech as evidenced by his conduct. The study takes a psycholinguistic approach to data analysis, employing qualitative methodologies. Data sources include dialogues and scenes from the film. First, the study finds that John Nash's hallucinations in the film take the forms of hearing, seeing, and feeling. Second, the sources of his hallucinations are dreams, aspirations, and sickness. Third, John Nash's way of managing his hallucinations is to see a doctor, without medical or distracting assistance.
Keywords: A Beautiful Mind, hallucination, psycholinguistic, John Nash
Procedia PDF Downloads 172
35669 A Runge Kutta Discontinuous Galerkin Method for Lagrangian Compressible Euler Equations in Two-Dimensions
Authors: Xijun Yu, Zhenzhen Li, Zupeng Jia
Abstract:
This paper presents a new cell-centered Lagrangian scheme for two-dimensional compressible flow. The new scheme uses a semi-Lagrangian form of the Euler equations. The system of equations is discretized by the Discontinuous Galerkin (DG) method using the Taylor basis in Eulerian space. The vertex velocities and the numerical fluxes through the cell interfaces are computed consistently by a nodal solver. The mesh moves with the fluid flow. The time marching is implemented by a class of Runge-Kutta (RK) methods. A WENO reconstruction is used as a limiter for the RKDG method. The scheme is conservative for mass, momentum, and total energy, maintains second-order accuracy, and is free of parameters. Results of several numerical tests are presented to demonstrate the accuracy and robustness of the scheme.
Keywords: cell-centered Lagrangian scheme, compressible Euler equations, RKDG method
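As a toy illustration of the Runge-Kutta time marching applied to a semi-discrete system (here a single scalar ODE standing in for the DG-discretized Euler system, not the paper's solver), a second-order midpoint step looks like:

```python
def rk2_step(f, u, t, dt):
    """One step of the second-order Runge-Kutta (midpoint) method."""
    k1 = f(t, u)
    k2 = f(t + dt / 2.0, u + dt / 2.0 * k1)
    return u + dt * k2

# Toy semi-discrete system: du/dt = -u, whose exact solution is exp(-t).
u, t, dt = 1.0, 0.0, 0.01
for _ in range(100):
    u = rk2_step(lambda t, u: -u, u, t, dt)
    t += dt
# u now approximates exp(-1) with second-order accuracy in dt
```

In an RKDG scheme, `u` would be the vector of DG expansion coefficients and `f` the spatial residual, but the stepping structure is the same.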
Procedia PDF Downloads 546
35668 Blind Super-Resolution Reconstruction Based on PSF Estimation
Authors: Osama A. Omer, Amal Hamed
Abstract:
Successful blind image super-resolution algorithms require exact estimation of the Point Spread Function (PSF). In the absence of any prior information about the imaging system and the true image, this estimation is normally done by trial-and-error experimentation until an acceptable restored image quality is obtained. Multi-frame blind super-resolution algorithms often suffer from slow convergence and sensitivity to complex noise. This paper presents a super-resolution image reconstruction algorithm based on estimation of the PSF that yields the optimum restored image quality. The PSF is estimated by the knife-edge method, implemented by measuring the spreading of edges in the reproduced HR image itself during the reconstruction process. The proposed reconstruction approach uses L1 norm minimization and robust regularization based on a bilateral prior to deal with different data and noise models. A series of experimental results shows that the proposed method can outperform previous work robustly and efficiently.
Keywords: blind, PSF, super-resolution, knife-edge, blurring, bilateral, L1 norm
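The knife-edge idea can be sketched in a few lines: differentiating the intensity profile measured across a blurred edge yields the line-spread function, whose width characterizes the blur. The profile and the crude width measure below are illustrative, not the paper's data or exact estimator:

```python
def line_spread(profile):
    """Differentiate an edge-spread profile (knife-edge method) to obtain
    the line-spread function; its width characterizes the PSF."""
    return [profile[i + 1] - profile[i] for i in range(len(profile) - 1)]

def fwhm_samples(lsf):
    """Count samples at or above half the LSF maximum: a crude blur width."""
    half = max(lsf) / 2.0
    return sum(1 for v in lsf if v >= half)

edge = [0, 0, 1, 4, 6, 7, 7]  # intensity measured across a blurred knife edge
lsf = line_spread(edge)       # peaks where the edge transitions fastest
width = fwhm_samples(lsf)     # wider LSF means a broader PSF
```

A sharper image would concentrate the LSF into fewer samples, which is how edge spreading in the reconstructed HR image can drive the PSF estimate.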
Procedia PDF Downloads 365
35667 A Hybrid Image Fusion Model for Generating High Spatial-Temporal-Spectral Resolution Data Using OLI-MODIS-Hyperion Satellite Imagery
Authors: Yongquan Zhao, Bo Huang
Abstract:
Spatial, temporal, and spectral resolution (STSR) are three key characteristics of Earth observation satellite sensors; however, no single satellite sensor can provide Earth observations with high STSR simultaneously because of hardware technology limitations. At the same time, the demand for high STSR has been growing as remote sensing applications develop. Although image fusion technology provides a feasible means to overcome the limitations of current Earth observation data, current fusion technologies cannot enhance all three resolutions simultaneously or provide a sufficient level of resolution improvement. This study proposes a Hybrid Spatial-Temporal-Spectral image Fusion Model (HSTSFM) to generate synthetic satellite data with high STSR simultaneously, which blends the high spatial resolution of the panchromatic image of the Landsat-8 Operational Land Imager (OLI), the high temporal resolution of the multi-spectral image of the Moderate Resolution Imaging Spectroradiometer (MODIS), and the high spectral resolution of the hyper-spectral image of Hyperion to produce high-STSR images. The proposed HSTSFM contains three fusion modules: (1) spatial-spectral image fusion; (2) spatial-temporal image fusion; (3) temporal-spectral image fusion. A set of test data with both phenological and land cover type changes in a suburban area of Beijing, China is adopted to demonstrate the performance of the proposed method. The experimental results indicate that HSTSFM can produce fused images with good spatial and spectral fidelity to the reference image, which means it has the potential to generate synthetic data to support studies that require high-STSR satellite imagery.
Keywords: hybrid spatial-temporal-spectral fusion, high resolution synthetic imagery, least square regression, sparse representation, spectral transformation
Procedia PDF Downloads 235
35666 Contractor Selection by Using Analytical Network Process
Authors: Badr A. Al-Jehani
Abstract:
Nowadays, contractor selection is a critical activity for the project owner. Selecting the right contractor is essential to the success of the project, and this can happen by using the proper selection method. Traditionally, the contractor is selected based on the offered bid price. This approach focuses only on the price factor and neglects other factors essential to the success of the project. In this paper, the Analytic Network Process (ANP) method is used as a decision tool to select the most appropriate contractor. This decision-making method can help clients in the construction industry identify contractors who are capable of delivering satisfactory outcomes. Moreover, the paper provides a case study of selecting the proper contractor among three contractors using the ANP method. The case study identifies and computes the relative weights of eight criteria and eleven sub-criteria using a questionnaire.
Keywords: contractor selection, project management, decision-making, bidding
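At the core of ANP (like its special case, AHP) is deriving priority weights from a pairwise-comparison matrix, typically as its principal eigenvector; ANP then assembles such vectors into a supermatrix of interdependent criteria. A minimal power-iteration sketch, with a hypothetical comparison of three contractors on a single criterion:

```python
def priority_vector(matrix, iterations=50):
    """Approximate the principal eigenvector of a pairwise-comparison matrix
    by power iteration, normalized to sum to 1 (AHP-style priorities)."""
    n = len(matrix)
    w = [1.0 / n] * n
    for _ in range(iterations):
        w_next = [sum(matrix[i][j] * w[j] for j in range(n)) for i in range(n)]
        total = sum(w_next)
        w = [x / total for x in w_next]
    return w

# Hypothetical judgments: contractor 0 is twice as preferred as contractor 1
# and four times as preferred as contractor 2 on this criterion.
m = [[1,   2,   4],
     [1/2, 1,   2],
     [1/4, 1/2, 1]]
weights = priority_vector(m)  # roughly (4/7, 2/7, 1/7)
```

Repeating this for each criterion and sub-criterion, then combining the vectors through the network structure, yields the overall contractor ranking.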
Procedia PDF Downloads 88
35665 Comparison of Existing Predictor and Development of Computational Method for S- Palmitoylation Site Identification in Arabidopsis Thaliana
Authors: Ayesha Sanjana Kawser Parsha
Abstract:
S-acylation is a reversible modification in which cysteine residues are linked, via a thioester linkage, to the fatty acids palmitate (74%) or stearate (22%), near either the COOH or NH2 terminus. Several experimental methods can identify S-palmitoylation sites; however, since they require a lot of time, computational methods are becoming increasingly necessary. There are not many predictors, however, that can locate S-palmitoylation sites in Arabidopsis thaliana with sufficient accuracy, which motivates building a better prediction tool. To identify the type of machine learning algorithm that predicts this site most accurately for the experimental dataset, several prediction tools were examined in this research, including GPS-Palm 6.0, pCysMod, GPS-Lipid 1.0, CSS-Palm 4.0, and NBA-Palm. These analyses were conducted by constructing receiver operating characteristic (ROC) plots and computing the area under the curve (AUC) score. An AI-driven, deep-learning-based prediction tool was then developed utilizing this analysis and three kinds of sequence-based input data: amino acid composition, binary encoding profiles, and autocorrelation features. The model was developed using five layers, two activation functions, and associated parameters and hyperparameters. It was built with various combinations of features and, after training and validation on the experimental dataset with 8- and 10-fold cross-validation, performed best when all features were present. When tested on unseen and new data, such as the GPS-Palm 6.0 plant and pCysMod mouse datasets, the model performed well, with an AUC score near 1. Comparing the AUC score of the new model's 10-fold cross-validation with the established tools' AUC scores on their respective training sets demonstrates that this model outperforms the prior tools in predicting S-palmitoylation sites in the experimental dataset. The objective of this study is to develop a prediction tool for Arabidopsis thaliana that is more accurate than current tools, as measured by the AUC score. Predicting S-palmitoylation sites with this method can support both plant food production and the management of immunological treatment targets.
Keywords: S-palmitoylation, ROC plot, area under the curve, cross-validation score
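Of the three sequence-based inputs, the binary encoding profile is the simplest to sketch: each residue in a sequence window becomes a 20-dimensional one-hot vector. This is illustrative code, not the tool's actual feature extractor:

```python
AMINO_ACIDS = "ACDEFGHIKLMNPQRSTVWY"  # the 20 standard residues

def binary_encode(sequence):
    """One-hot ("binary encoding") profile of a peptide window: each residue
    maps to a 20-dimensional 0/1 vector, one position per standard residue."""
    encoding = []
    for residue in sequence.upper():
        encoding.append([1 if residue == aa else 0 for aa in AMINO_ACIDS])
    return encoding

window = binary_encode("CKL")  # a 3-residue window centred on a cysteine
```

Stacking such windows, together with composition and autocorrelation features, gives the numeric input a deep model can train on.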
Procedia PDF Downloads 77
35664 Preparation of Nanophotonics LiNbO3 Thin Films and Studying Their Morphological and Structural Properties by Sol-Gel Method for Waveguide Applications
Authors: A. Fakhri Makram, Marwa S. Alwazni, Al-Douri Yarub, Evan T. Salim, Hashim Uda, Chin C. Woei
Abstract:
Lithium niobate (LiNbO3) nanostructures are prepared on quartz substrates by the sol-gel method. They were deposited at different molarity concentrations and annealed at 500°C. The samples were characterized and analyzed by X-ray diffraction (XRD), scanning electron microscopy (SEM), and atomic force microscopy (AFM). The results show that, with increasing molarity concentration, the structure becomes crystalline, regular, and homogeneous, with well-distributed crystals, making it more suitable for optical waveguide applications.
Keywords: lithium niobate, morphological properties, thin film, Pechini method, XRD
Procedia PDF Downloads 447
35663 A Comparison of Image Data Representations for Local Stereo Matching
Authors: André Smith, Amr Abdel-Dayem
Abstract:
The stereo matching problem, though it has been studied for several decades, continues to be an active area of research. The goal is to find correspondences between elements found in a set of stereoscopic images. With these pairings, it is possible to infer the distance of objects within a scene relative to the observer. Advancements in this field have led to experimentation with various techniques, from graph-cut energy minimization to artificial neural networks. At the basis of these techniques is a cost function, which is used to evaluate the likelihood of a particular match between points in each image. While the cost is, at its core, based on comparing image pixel data, there is a general lack of consistency as to which image data representation to use. This paper presents an experimental analysis comparing the effectiveness of the more common image data representations. The goal is to determine how well these representations reduce the cost of the correct correspondence relative to other possible matches.
Keywords: colour data, local stereo matching, stereo correspondence, disparity map
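A common such cost is the sum of absolute differences (SAD) over a local window, sketched here for grayscale images stored as row lists; the toy images are hypothetical, and real implementations must also handle image borders and alternative representations (colour channels, gradients, census transforms):

```python
def sad_cost(left, right, row, col, disparity, radius=1):
    """Sum of absolute differences between a (2*radius+1)^2 window centred at
    (row, col) in the left image and the window shifted left by `disparity`
    in the right image; lower cost means a more likely match."""
    cost = 0
    for dr in range(-radius, radius + 1):
        for dc in range(-radius, radius + 1):
            cost += abs(left[row + dr][col + dc]
                        - right[row + dr][col + dc - disparity])
    return cost

# Toy pair: the right view is the left view shifted by one pixel.
left = [[10 * c for c in range(6)] for _ in range(5)]
right = [[10 * (c + 1) for c in range(6)] for _ in range(5)]
best = sad_cost(left, right, row=2, col=2, disparity=1)   # correct shift
worse = sad_cost(left, right, row=2, col=2, disparity=0)  # every pixel off by 10
```

Scanning `disparity` over a range and keeping the minimum-cost value per pixel is exactly how a local method builds a disparity map, so how well a representation separates `best` from `worse` is what the paper's comparison measures.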
Procedia PDF Downloads 370