Search results for: data exchange
24066 Challenges and Opportunities of Utilization of Social Media by Business Education Students in Nigeria Universities
Authors: Titus Amodu Umoru
Abstract:
The global economy today is highly sophisticated. All over the world, business and marketing practices are undergoing an unprecedented transformation. In realization of this fact, the federal government of Nigeria has put in place a robust transformation agenda in order to put Nigeria in a better position to be a competitive player and, in the process, transform all sectors of its economy. New technologies, especially the internet, are the driving force behind this transformation. However, technology has also changed the way business is done, thus necessitating the acquisition of new skills. In developing countries like Nigeria, citizens are still struggling with the effective application of these technologies. Obviously, students of business education need to acquire relevant business knowledge to be able to transition into the world of work on graduation from school and compete favourably in the labour market. Therefore, effective utilization of social media by both teachers and students can help extensively in empowering students with the needed skills. Social media, described as a group of internet-based applications that build on the ideological foundations of Web 2.0 and allow the creation and exchange of user-generated content, may, if incorporated into the classroom experience, be the needed answer to unemployment and poverty in Nigeria, as beneficiaries can easily connect with existing and potential enterprises and customers, engage with them and reinforce mutual business benefits. Challenges and benefits of social media use in education in Nigerian universities were revealed in this study.
Keywords: business education, challenges, opportunities, utilization, social media
Procedia PDF Downloads 419
24065 Re-Stating the Origin of Tetrapod Using Measures of Phylogenetic Support for Phylogenomic Data
Authors: Yunfeng Shan, Xiaoliang Wang, Youjun Zhou
Abstract:
Whole-genome data from two lungfish species, along with other species, present a valuable opportunity to re-investigate the longstanding debate regarding the evolutionary relationships among tetrapods, lungfishes, and coelacanths. However, the use of bootstrap support has become outdated for large-scale phylogenomic data. Without robust phylogenetic support, the phylogenetic trees become meaningless. Therefore, it is necessary to re-evaluate the phylogenies of tetrapods, lungfishes, and coelacanths using novel measures of phylogenetic support specifically designed for phylogenomic data, as the previous phylogenies were based on 100% bootstrap support. Our findings consistently provide strong evidence favoring lungfish as the closest living relative of tetrapods. This conclusion is based on high internode certainty, relative gene support, and high gene concordance factor. The evidence stems from five previous datasets derived from lungfish transcriptomes. These results yield fresh insights into the three hypotheses regarding the phylogenies of tetrapods, lungfishes, and coelacanths. Importantly, these hypotheses are not mere conjectures but are substantiated by a significant number of genes. Analyzing real biological data further demonstrates that the inclusion of additional taxa leads to more diverse tree topologies. Consequently, gene trees and species trees may not be identical even when whole-genome sequencing data is utilized. However, it is worth noting that many gene trees can accurately reflect the species tree if an appropriate number of taxa, typically ranging from six to ten, are sampled. Therefore, it is crucial to carefully select the number of taxa and an appropriate outgroup, such as slow-evolving species, while excluding fast-evolving taxa as outgroups to mitigate the adverse effects of long-branch attraction and achieve an accurate reconstruction of the species tree. 
This is particularly important as more whole-genome sequencing data becomes available.
Keywords: novel measures of phylogenetic support for phylogenomic data, gene concordance factor confidence, relative gene support, internode certainty, origin of tetrapods
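As a rough illustration of one of the support measures named above, the gene concordance factor for an internal branch of the species tree can be read as the fraction of gene trees that recover the same bipartition of taxa. The sketch below is not the authors' code; the taxa and toy gene trees are hypothetical.

```python
# Illustrative sketch: the gene concordance factor (gCF) for a branch is the
# fraction of gene trees containing the same taxon bipartition as that branch.

def bipartition(clade, taxa):
    """Represent a branch as an unordered split of the taxon set."""
    clade = frozenset(clade)
    return frozenset({clade, frozenset(taxa) - clade})

def gene_concordance_factor(species_split, gene_tree_splits):
    """Fraction of gene trees whose splits include the species-tree split."""
    n = sum(1 for splits in gene_tree_splits if species_split in splits)
    return n / len(gene_tree_splits)

taxa = {"tetrapod", "lungfish", "coelacanth", "ray-finned fish"}

# Hypothesis: lungfish is the closest living relative of tetrapods,
# i.e. the split {tetrapod, lungfish} versus the rest.
lungfish_sister = bipartition({"tetrapod", "lungfish"}, taxa)

# Toy gene trees, each reduced to its set of splits: 3 of 4 are concordant.
gene_trees = [
    {bipartition({"tetrapod", "lungfish"}, taxa)},
    {bipartition({"tetrapod", "lungfish"}, taxa)},
    {bipartition({"tetrapod", "coelacanth"}, taxa)},
    {bipartition({"tetrapod", "lungfish"}, taxa)},
]

gcf = gene_concordance_factor(lungfish_sister, gene_trees)
print(gcf)  # 0.75
```

A high gCF, as reported in the abstract for the lungfish-sister hypothesis, means most individual gene trees agree with the species-tree branch rather than merely the bootstrap replicates.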
Procedia PDF Downloads 66
24064 Predicting Daily Patient Hospital Visits Using Machine Learning
Authors: Shreya Goyal
Abstract:
The study aims to build user-friendly software to understand patient arrival patterns and compute the number of potential patients who will visit a particular health facility for a given period by using a machine learning algorithm. The underlying machine learning algorithm used in this study is the Support Vector Machine (SVM). Accurate prediction of patient arrival allows hospitals to operate more effectively, providing timely and efficient care while optimizing resources and improving patient experience. It allows for better allocation of staff, equipment, and other resources. If there's a projected surge in patients, additional staff or resources can be allocated to handle the influx, preventing bottlenecks or delays in care. Understanding patient arrival patterns can also help streamline processes to minimize waiting times for patients and ensure timely access to care for patients in need. Another big advantage of using this software is adhering to strict data protection regulations such as the Health Insurance Portability and Accountability Act (HIPAA) in the United States, as the hospital will not have to share the data with any third party or upload it to the cloud, because the software can read data locally from the machine. The data needs to be arranged in a particular format, and the software will be able to read the data and provide meaningful output. Using software that operates locally can facilitate compliance with these regulations by minimizing data exposure. Keeping patient data within the hospital's local systems reduces the risk of unauthorized access or breaches associated with transmitting data over networks or storing it in external servers. This can help maintain the confidentiality and integrity of sensitive patient information. Historical patient data is used in this study. The input variables used to train the model include patient age, time of day, day of the week, seasonal variations, and local events.
The algorithm uses a supervised learning method to optimize the objective function and find the global minimum. The algorithm stores the values of the local minima after each iteration and, at the end, compares all the local minima to find the global minimum. The strength of this study is the transfer function used to calculate the number of patients. The model has an output accuracy of >95%. The method proposed in this study could be used for better management planning of personnel and medical resources.
Keywords: machine learning, SVM, HIPAA, data
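A minimal sketch of the core idea, assuming scikit-learn as the SVM implementation: encode calendar features like those the abstract names (day of week, seasonality) and fit a Support Vector Machine regressor to daily visit counts. The data here is synthetic and the model settings are illustrative, not the authors' software; note everything runs on the local machine, so no patient data leaves the hospital.

```python
# Hedged sketch: SVR on synthetic daily hospital-visit counts with calendar features.
import numpy as np
from sklearn.svm import SVR

rng = np.random.default_rng(0)
days = np.arange(365)

# Features: one-hot day of week plus a yearly seasonal term.
dow = np.eye(7)[days % 7]
season = np.sin(2 * np.pi * days / 365.0)[:, None]
X = np.hstack([dow, season])

# Synthetic target: weekday-dependent baseline + seasonal surge + noise.
y = 100 + 15 * (days % 7 < 5) + 20 * season.ravel() + rng.normal(0, 3, 365)

model = SVR(kernel="rbf", C=100.0, epsilon=1.0).fit(X, y)
pred = model.predict(X)
print(pred.shape)  # one predicted visit count per day
```

In practice the hospital's historical arrivals, plus local-event indicators, would replace the synthetic `y`, and predictions for a future window would drive staffing decisions.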
Procedia PDF Downloads 68
24063 Analyzing Keyword Networks for the Identification of Correlated Research Topics
Authors: Thiago M. R. Dias, Patrícia M. Dias, Gray F. Moita
Abstract:
The production and publication of scientific works have increased significantly in recent years, with the Internet being the main factor in the access and distribution of these works. Faced with this, there is a growing interest in understanding how scientific research has evolved, in order to explore this knowledge to encourage research groups to become more productive. Therefore, the objective of this work is to explore repositories containing data from scientific publications and to characterize the keyword networks of these publications, in order to identify the most relevant keywords and to highlight those that have the greatest impact on the network. To do this, each article in the study repository has its keywords extracted, and in this way the network is characterized, after which several social network analysis metrics are applied to identify the highlighted keywords.
Keywords: bibliometrics, data analysis, extraction and data integration, scientometrics
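The pipeline described above can be sketched in a few lines: build a keyword co-occurrence network from article keyword lists and rank keywords by degree, one simple social-network-analysis metric among those such a study might apply. The keyword lists below are hypothetical.

```python
# Illustrative sketch: keyword co-occurrence network and degree ranking.
from itertools import combinations
from collections import defaultdict

articles = [  # hypothetical keyword lists extracted from a repository
    {"bibliometrics", "data analysis", "scientometrics"},
    {"data analysis", "machine learning", "bibliometrics"},
    {"bibliometrics", "scientometrics", "citation analysis"},
]

edges = defaultdict(int)  # undirected edge -> co-occurrence count
for kws in articles:
    for a, b in combinations(sorted(kws), 2):
        edges[(a, b)] += 1

degree = defaultdict(int)  # number of distinct neighbours per keyword
for (a, b), w in edges.items():
    degree[a] += 1
    degree[b] += 1

ranking = sorted(degree, key=degree.get, reverse=True)
print(ranking[0])  # bibliometrics
```

In a full study, weighted edges and richer centrality measures (betweenness, eigenvector) would refine which keywords count as having the greatest impact on the network.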
Procedia PDF Downloads 264
24062 A New Approach towards the Development of Next Generation CNC
Authors: Yusri Yusof, Kamran Latif
Abstract:
The Computer Numerical Control (CNC) machine has been widely used in industry since its inception. Currently, CNC technology is used for various operations like milling, drilling, packing and welding, etc. With the rapid growth of the manufacturing world, the demand for flexibility in CNC machines has rapidly increased. Previously, commercial CNCs failed to provide flexibility because their closed structure did not provide access to the inner features of the CNC. Also, the CNC's operating ISO data interface model was found to be limited. Therefore, to overcome these problems, Open Architecture Control (OAC) technology and the STEP-NC data interface model were introduced. At present, the Personal Computer (PC) has been the best platform for the development of open-CNC systems. In this paper, ISO data interface model interpretation, its verification and its execution are highlighted with the introduction of new techniques. The proposed system is composed of ISO data interpretation, 3D simulation and machine motion control modules. The system was tested on an old 3-axis CNC milling machine. The results were found to be satisfactory in performance. This implementation has successfully enabled a sustainable manufacturing environment.
Keywords: CNC, ISO 6983, ISO 14649, LabVIEW, open architecture control, reconfigurable manufacturing systems, sustainable manufacturing, Soft-CNC
Procedia PDF Downloads 519
24061 A Study on the Establishment of a 4-Joint Based Motion Capture System and Data Acquisition
Authors: Kyeong-Ri Ko, Seong Bong Bae, Jang Sik Choi, Sung Bum Pan
Abstract:
A simple method for testing the posture imbalance of the human body is to check for differences in the bilateral shoulder and pelvic height of the target. In this paper, to check for spinal disorders, the authors have studied ways to establish a motion capture system to obtain and express the motions of 4 joints, and to acquire data based on this system. The 4 sensors are attached to both shoulders and the pelvis. To verify the established system, the normal and abnormal postures of targets listening to a lecture were obtained using the established 4-joint based motion capture system. From the results, it was confirmed that the motions taken by the target were identical to the 3-dimensional simulation.
Keywords: inertial sensor, motion capture, motion data acquisition, posture imbalance
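The bilateral-height check mentioned at the start of the abstract can be sketched directly: compare the heights of the left and right shoulder markers and of the left and right pelvis markers against a tolerance. The joint positions, names and the 1.5 cm threshold below are hypothetical, not values from the study.

```python
# Minimal sketch (hypothetical threshold): flag posture imbalance when the
# bilateral height difference at the shoulders or pelvis exceeds a tolerance.
def imbalance(joints, tol_cm=1.5):
    """joints: dict of joint name -> (x, y, z) position in cm; z is height."""
    shoulder_diff = abs(joints["l_shoulder"][2] - joints["r_shoulder"][2])
    pelvis_diff = abs(joints["l_pelvis"][2] - joints["r_pelvis"][2])
    return shoulder_diff > tol_cm or pelvis_diff > tol_cm

normal = {"l_shoulder": (0, 20, 140.0), "r_shoulder": (0, -20, 139.8),
          "l_pelvis": (0, 15, 95.0), "r_pelvis": (0, -15, 95.2)}
slouched = {"l_shoulder": (0, 20, 141.5), "r_shoulder": (0, -20, 138.9),
            "l_pelvis": (0, 15, 95.0), "r_pelvis": (0, -15, 95.1)}
print(imbalance(normal), imbalance(slouched))  # False True
```

In the actual system the positions would come from the four inertial sensors over time, letting the check run continuously rather than on a single frame.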
Procedia PDF Downloads 519
24060 How to Improve the Environmental Performance in a HEI in Mexico, an EEA Adaptation
Authors: Stephanie Aguirre Moreno, Jesús Everardo Olguín Tiznado, Claudia Camargo Wilson, Juan Andrés López Barreras
Abstract:
This research work presents a proposal to evaluate the environmental performance of a Higher Education Institution (HEI) in Mexico in order to minimize its environmental impact. Given that public education has limited financial resources, it is necessary to conduct studies that support priorities in decision-making situations and thus obtain the best cost-benefit ratio for the continuous improvement programs that form part of the implemented environmental management system. The methodology employed, adapted from Environmental Effect Analysis (EEA), weighs the environmental aspects identified in the environmental diagnosis by two characteristics: first, environmental priority, through the perception of the stakeholders, compliance with legal requirements, and the environmental impact of operations; second, the possibility of improvement, which depends on factors such as the change to be made, the level of investment and its return time. The highest environmental priorities, or hot spots, identified in this evaluation were: electricity consumption, water consumption and recycling, and disposal of municipal solid waste. The possibility of improvement is highest for the disposal of municipal solid waste, followed by water consumption and recycling; although the latter has a possibility of improvement equal to that of energy consumption, its return time and cost-benefit ratio are much greater.
Keywords: environmental performance, environmental priority, possibility of improvement, continuous improvement programs
Procedia PDF Downloads 499
24059 Predictive Analytics in Oil and Gas Industry
Authors: Suchitra Chnadrashekhar
Abstract:
Information technology, earlier viewed as a support function within an organization, has now become a critical utility for managing daily operations. Organizations are processing huge amounts of data, unimaginable a few decades ago. This has opened the opportunity for the IT sector to help industries across domains handle data in the most intelligent manner. IT has given the Oil & Gas industry the leverage to store, manage and process data in the most efficient way possible, thus deriving economic value in its day-to-day operations. Proper synchronization between the operational data system and the information technology system is the need of the hour. Predictive analytics supports oil and gas companies by addressing the challenges of critical equipment performance, life cycle, integrity and security, and by increasing equipment utilization. Predictive analytics goes beyond early warning by providing insights into the roots of problems. To reach their full potential, oil and gas companies need to take a holistic, or systems, approach towards asset optimization and thus have the functional information at all levels of the organization in order to make the right decisions. This paper discusses how the use of predictive analysis in the oil and gas industry is redefining the dynamics of this sector. The paper is also supported by real-time data and an evaluation of the data for a given oil production asset on an application tool, SAS. The reason for using SAS as the application for our analysis is that SAS provides an analytics-based framework to improve uptimes, performance and availability of crucial assets while reducing the amount of unscheduled maintenance, thus minimizing maintenance-related costs and operational disruptions.
With state-of-the-art analytics and reporting, we can predict maintenance problems before they happen and determine root causes in order to update processes for future prevention.
Keywords: hydrocarbon, information technology, SAS, predictive analytics
Procedia PDF Downloads 367
24058 Urban Change Detection and Pattern Analysis Using Satellite Data
Authors: Shivani Jha, Klaus Baier, Rafiq Azzam, Ramakar Jha
Abstract:
In India, people generally migrate from rural areas to urban areas for better infrastructural facilities, a higher standard of living, good job opportunities and advanced transport/communication availability. In fact, unplanned urban development due to the migration of people causes serious damage to land use, water quality and available water resources. In the present work, an attempt has been made to use satellite data from different years for urban change detection of the Chennai metropolitan city, along with pattern analysis to generate future scenarios of urban development using buffer zoning in a GIS environment. In the analysis, SRTM (30 m) elevation data and IRS-1C satellite data for the years 1990, 2000 and 2014 are used. The flow accumulation, aspect, flow direction and slope maps developed using the SRTM 30 m data are very useful for finding suitable urban locations for industrial setups and urban settlements. The Normalized Difference Vegetation Index (NDVI) and Principal Component Analysis (PCA) have been used in ERDAS Imagine software for change detection in the land use of the Chennai metropolitan city. It has been observed that the urban area has increased exponentially in the Chennai metropolitan city, with a significant decrease in agricultural and barren lands. However, the water bodies located in the study region are protected and being used as freshwater for drinking purposes. Using buffer zone analysis in a GIS environment, it has been observed that development has taken place significantly in the south-west direction and will continue to do so in the future.
Keywords: urban change, satellite data, the Chennai metropolis, change detection
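The NDVI step used for change detection above is a simple band ratio, NDVI = (NIR − Red) / (NIR + Red): vegetated pixels give values near +1, while water and built-up areas sit near or below 0, so differencing NDVI between two acquisition years flags vegetation loss. The toy reflectance values and the −0.3 threshold below are illustrative, not from the study.

```python
# Illustrative sketch: NDVI differencing between two years as a change signal.
import numpy as np

def ndvi(nir, red):
    nir, red = np.asarray(nir, float), np.asarray(red, float)
    return (nir - red) / (nir + red + 1e-9)  # epsilon avoids divide-by-zero

red_1990 = np.array([[0.05, 0.30], [0.04, 0.06]])   # toy reflectances
nir_1990 = np.array([[0.45, 0.35], [0.50, 0.40]])
red_2014 = np.array([[0.25, 0.32], [0.04, 0.28]])
nir_2014 = np.array([[0.30, 0.36], [0.52, 0.33]])

change = ndvi(nir_2014, red_2014) - ndvi(nir_1990, red_1990)
urbanized = change < -0.3  # hypothetical vegetation-loss threshold
print(int(urbanized.sum()))  # 2 pixels flagged as losing vegetation
```

On real Landsat/IRS scenes the same computation runs per pixel over the whole study area, after which PCA or classification separates genuine land-use change from seasonal effects.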
Procedia PDF Downloads 413
24057 HelpMeBreathe: A Web-Based System for Asthma Management
Authors: Alia Al Rayssi, Mahra Al Marar, Alyazia Alkhaili, Reem Al Dhaheri, Shayma Alkobaisi, Hoda Amer
Abstract:
We present in this paper a web-based system called “HelpMeBreathe” for managing asthma. The proposed system provides analytical tools which allow a better understanding of the environmental triggers of asthma, and hence better support for data-driven decision making. The developed system provides warning messages to a specific asthma patient if the weather in his/her area might cause any difficulty in breathing or could trigger an asthma attack. HelpMeBreathe collects, stores, and analyzes individuals’ moving trajectories and health conditions as well as environmental data. It then processes and displays the patients’ data through an analytical tool that leads to effective decision making by physicians and other decision makers.
Keywords: asthma, environmental triggers, map interface, web-based systems
Procedia PDF Downloads 296
24056 The Social Process of Alternative Dispute Resolution and Collective Conciliation: Unveiling the Theoretical Framework
Authors: Adejoke Yemisi Ige
Abstract:
This study presents a conceptual analysis and investigation into the development of a systematic framework for a better understanding of the social process of Alternative Dispute Resolution (ADR) and collective conciliation. The critical examination presented in this study is significant because it draws on insights from the ADR, negotiation and collective bargaining literature and applies them to the advancement of a methodical outline of how the key actors' and other stakeholders' strategies and behaviours during dispute resolution influence the outcomes, which is novel. The study is qualitative and essentially inductive in nature. One finding of the study confirms the need to consider ADR and collective conciliation within the context of their characteristic conditions, which focus on the need for some agreement to be reached. Another finding shows the extent to which information-sharing, the willingness of the parties to negotiate and the making of concessions assist both parties in attaining resolution. This paper recommends that, in order to overcome deadlock and attain acceptable outcomes at the end of ADR and collective conciliation, the importance of information exchange and of sustaining the trade union and management relationship cannot be overstated. Trade union and management representatives need to achieve their expectations in order to build the confidence and assurance of their respective constituents. In conclusion, the analysis presented in this study points towards a set of factors that together can be called the social process of collective conciliation; nevertheless, it acknowledges that its application to collective conciliation is new.
Keywords: alternative dispute resolution, collective conciliation, social process, theoretical framework, unveiling
Procedia PDF Downloads 155
24055 The Potential of Tempo-Oxidized Cellulose Nanofibers to Replace Ethylene-Propylene-Diene Monomer Rubber
Authors: Sibel Dikmen Kucuk, Yusuf Guner
Abstract:
In recent years, the use of petroleum-based polymers has begun to be restricted in many countries due to their effects on human health and the environment. Thus, organic-based biodegradable materials have attracted much interest in the composite industry because of environmental concerns. As a result, it has been demanded that inorganic and petroleum-based materials be reduced and replaced with biodegradable materials. On this point, this study aims to investigate the potential of using TEMPO (2,2,6,6-tetramethylpiperidine 1-oxyl)-mediated oxidation nano-fibrillated cellulose instead of EPDM (ethylene-propylene-diene monomer) rubber, which is a petroleum-based material. Thus, the exchange of petroleum-based EPDM rubber for organic-based cellulose nanofibers, which are environmentally friendly (green) and biodegradable, will be realized. The effect of tempo-oxidized cellulose nanofibers (TCNF) used instead of EPDM rubber was analyzed by rheological, mechanical, chemical, thermal and aging analyses. The aged surfaces were visually scrutinized, and surface morphological changes were examined via scanning electron microscopy (SEM). The results obtained showed that TEMPO-oxidized nano-fibrillated cellulose can be used at amounts of 1.0 and 2.2 phr, with the resulting values staying within tolerance according to the customer standard and without any chemical degradation, cracking, colour change or staining.
Keywords: EPDM, lignin, green materials, biodegradable fillers
Procedia PDF Downloads 134
24054 Geographic Information Systems and Remotely Sensed Data for the Hydrological Modelling of Mazowe Dam
Authors: Ellen Nhedzi Gozo
Abstract:
The unavailability of adequate hydro-meteorological data has always limited the analysis and understanding of the hydrological behaviour of several dam catchments, including that of Mazowe Dam in Zimbabwe. The problem of insufficient data for the Mazowe Dam catchment analysis was solved by extracting catchment characteristics and areal hydro-meteorological data from ASTER, LANDSAT and Shuttle Radar Topography Mission (SRTM) remote sensing (RS) images using the ILWIS, ArcGIS and ERDAS Imagine geographic information systems (GIS) software. Available observed hydrological as well as meteorological data complemented the use of the remotely sensed information. Ground truth land cover was mapped using a Garmin eTrex global positioning system (GPS) receiver. This information was then used to validate the land cover classification detail that was obtained from the remote sensing images. A bathymetry survey was conducted using a SONAR system connected to the GPS. Hydrological modelling using the HBV model was then performed to simulate the hydrological processes of the catchment in an effort to verify the reliability of the derived parameters. The model output shows a high Nash-Sutcliffe coefficient that is close to 1, indicating that the parameters derived from remote sensing and GIS can be applied with confidence in the analysis of the Mazowe Dam catchment.
Keywords: geographic information systems, hydrological modelling, remote sensing, water resources management
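The Nash-Sutcliffe coefficient used above to judge the HBV simulation is NSE = 1 − Σ(obs − sim)² / Σ(obs − mean(obs))²; a value close to 1 means the simulated flows track the observations well, while 0 means the model is no better than the observed mean. The flow values below are toy numbers for illustration.

```python
# Minimal sketch: Nash-Sutcliffe efficiency for a simulated vs observed series.
import numpy as np

def nse(observed, simulated):
    observed = np.asarray(observed, float)
    simulated = np.asarray(simulated, float)
    return 1.0 - np.sum((observed - simulated) ** 2) / \
                 np.sum((observed - observed.mean()) ** 2)

obs = np.array([10.0, 14.0, 30.0, 22.0, 12.0])   # toy inflows (m^3/s)
sim = np.array([11.0, 13.0, 28.0, 23.0, 12.5])   # toy HBV-style simulation
print(round(nse(obs, sim), 3))  # 0.974
```

An NSE this close to 1 is the kind of result the abstract reports, supporting the use of the RS/GIS-derived parameters.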
Procedia PDF Downloads 342
24053 A Bayesian Model with Improved Prior in Extreme Value Problems
Authors: Eva L. Sanjuán, Jacinto Martín, M. Isabel Parra, Mario M. Pizarro
Abstract:
In Extreme Value Theory, inference on the parameters of the distribution is made employing only a small part of the observed values. When block maxima values are taken, many data are discarded. We developed a new Bayesian inference model to seize all the information provided by the data, introducing informative priors and using the relations between the baseline and limit parameters. Firstly, we studied the accuracy of the new model for three baseline distributions that lead to a Gumbel extreme distribution: Exponential, Normal and Gumbel. Secondly, we considered mixtures of Normal variables, to simulate practical situations where data do not adjust to pure distributions because of perturbations (noise).
Keywords: Bayesian inference, extreme value theory, Gumbel distribution, highly informative prior
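The baseline-to-limit relation the abstract exploits can be seen numerically: block maxima of Exponential data converge to a Gumbel distribution, and for Exp(1) blocks of size n the limiting location is log(n) with scale 1. The sketch below uses a simple method-of-moments fit as a stand-in; it is not the authors' Bayesian model.

```python
# Illustrative sketch: block maxima of Exp(1) data approach Gumbel(log n, 1).
import numpy as np

rng = np.random.default_rng(42)
block_size, n_blocks = 100, 2000
maxima = rng.exponential(1.0, (n_blocks, block_size)).max(axis=1)

# Method-of-moments Gumbel fit: scale = std * sqrt(6)/pi, loc = mean - gamma*scale.
gamma = 0.5772156649  # Euler-Mascheroni constant
scale = maxima.std(ddof=1) * np.sqrt(6.0) / np.pi
loc = maxima.mean() - gamma * scale
print(round(loc, 2), round(scale, 2))  # theory: loc ~ log(100) = 4.61, scale ~ 1
```

A Bayesian version would place an informative prior on the baseline rate and propagate it to (loc, scale) through exactly this kind of relation, instead of discarding the within-block information.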
Procedia PDF Downloads 201
24052 Quantitative, Preservative Methodology for Review of Interview Transcripts Using Natural Language Processing
Authors: Rowan P. Martnishn
Abstract:
During the execution of a National Endowment for the Arts grant, approximately 55 interviews were collected from professionals across various fields. These interviews were used to create deliverables – historical connections for creations that began as art and evolved entirely into computing technology. With dozens of hours’ worth of transcripts to be analyzed by qualitative coders, a quantitative methodology was created to sift through the documents. The initial step was to both clean and format all the data. First, a basic spelling and grammar check was applied, as well as a Python script for normalized formatting, which used an open-source grammatical formatter to make the data as coherent as possible. 10 documents were randomly selected for manual review, in which words often incorrectly translated during transcription were recorded and replaced throughout all other documents. Then, to remove all banter and side comments, the transcripts were spliced into paragraphs (separated by change in speaker), and all paragraphs with fewer than 300 characters were removed. Secondly, a keyword extractor, a form of natural language processing in which significant words in a document are selected, was run on each paragraph of every interview. Every proper noun was put into a data structure corresponding to its respective interview. From there, a Bidirectional and Auto-Regressive Transformer (B.A.R.T.) summary model was applied to each paragraph that included any of the proper nouns selected from the interview. At this stage, the information to review had been reduced from about 60 hours’ worth of data to 20. The data was further processed through light, manual observation – any summaries which proved to fit the criteria of the proposed deliverable were selected, along with their locations within the document. This narrowed the data down to about 5 hours’ worth of processing.
The qualitative researchers were then able to find 8 more connections in addition to the previous 4, exceeding the minimum quota of 3 needed to satisfy the grant. Major findings of the study and the subsequent curation of this methodology raised a conceptual finding crucial to working with qualitative data of this magnitude. In the use of artificial intelligence, there is a general trade-off in a model between breadth of knowledge and specificity. If the model has too much knowledge, the user risks leaving out important data (too general). If the tool is too specific, it has not seen enough data to be useful. Thus, this methodology proposes a solution to this trade-off. The data is never altered outside of grammatical and spelling checks. Instead, the important information is marked, creating an indicator of where the significant data is without compromising its purity. Secondly, the data is chunked into smaller paragraphs, giving specificity, and then cross-referenced with the keywords (allowing generalization over the whole document). This way, no data is harmed, and qualitative experts can go over the raw data instead of using highly manipulated results. Given the success in deliverable creation as well as the circumvention of this trade-off, this methodology should stand as a model for synthesizing qualitative data while maintaining its original form.
Keywords: B.A.R.T. model, keyword extractor, natural language processing, qualitative coding
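The filtering stage of the pipeline described above can be sketched compactly: split a transcript into speaker paragraphs, drop short banter (under 300 characters), and keep only paragraphs mentioning an extracted proper noun – the marking step that narrows the material without altering the raw text. The transcript snippet and proper-noun list below are hypothetical stand-ins.

```python
# Minimal sketch of the length + proper-noun paragraph filter.
transcript = [
    ("A", "Sure, go ahead."),  # banter, dropped by the length filter
    ("B", "The ENIAC team treated the display as an artistic object first; "
          "only later did that rendering idea migrate into general computing. "
          + "Detail sentence. " * 20),
    ("A", "Budgets were tight that year, nothing technical to add here. "
          + "Filler sentence. " * 20),  # long, but no extracted proper noun
]

proper_nouns = {"ENIAC"}  # stand-in for a keyword extractor's output

kept = [(speaker, text) for speaker, text in transcript
        if len(text) >= 300 and any(noun in text for noun in proper_nouns)]
print(len(kept))  # 1 paragraph survives for summarization
```

In the full methodology, each surviving paragraph would then be passed to the B.A.R.T. summarizer, while the raw transcript itself is left untouched for the qualitative coders.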
Procedia PDF Downloads 34
24051 A Study of Cavity Quantum States Induced by Cavity-Matter Coupling Using Negativity in the Wigner Distribution
Authors: Anneswa Paul, Upendra Harbola
Abstract:
The interaction between light and matter is the primary tool for probing matter at the microscopic level. In recent years, light-matter interaction in optical cavities has found interesting applications in manipulating chemical reactions and material properties by modifying matter states in the cavity. However, not much attention has been given to studying modifications in the cavity-field states, which is the focus of this work. The classical to non-classical transition in the field state due to interaction with the matter inside the cavity is discussed. The effect of the initial state of the matter on the cavity states, as well as the role of photon fluctuations, is explored by considering different initial states of the matter and the field. The results demonstrate that the initial states of the field and the matter play a significant role in generating non-classicality in the cavity-field state, as quantified in terms of negativity in the (Wigner) phase-space distribution of the cavity. It is found that the coherences induced between different photon-number states due to the interaction always contribute to enhancing the non-classicality, while populations may suppress or enhance it depending on the relative weight of the vacuum state over other states. An increased weight of the vacuum state diminishes the non-classicality. It is shown that energy exchange takes place between different photon-number states in the cavity field, while matter acts as the facilitating agent.
Keywords: cavity QED, light-matter interaction, phase space methods, quantum optics
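The Wigner negativity used here as the non-classicality measure has a simple closed form for photon-number states: for a Fock state |n⟩, W_n(x, p) = ((−1)^n/π) · exp(−r²) · L_n(2r²) with r² = x² + p², so any n ≥ 1 is negative at the origin while the vacuum stays non-negative everywhere. The sketch below just evaluates this textbook formula; it is not the authors' cavity simulation.

```python
# Illustrative sketch: Wigner functions of Fock states and their negativity.
import numpy as np
from scipy.special import eval_laguerre

def wigner_fock(n, x, p):
    """Closed-form Wigner function of the Fock state |n> (hbar = 1 units)."""
    r2 = x ** 2 + p ** 2
    return ((-1) ** n / np.pi) * np.exp(-r2) * eval_laguerre(n, 2 * r2)

grid = np.linspace(-4, 4, 201)
X, P = np.meshgrid(grid, grid)

w_vacuum = wigner_fock(0, X, P)  # Gaussian, nowhere negative (classical-like)
w_fock1 = wigner_fock(1, X, P)   # negative dip at the origin (non-classical)

print(w_vacuum.min() >= 0, w_fock1.min() < 0)  # True True
```

For the interacting cavity field, the state is a mixture over photon-number states with coherences, and the same phase-space evaluation quantifies how those coherences and populations build up or wash out the negativity.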
Procedia PDF Downloads 13
24050 Culture and Commodification: A Study of William Gibson's The Bridge Trilogy
Authors: Aruna Bhat
Abstract:
Culture can be placed within the social structure that embodies both the creation of social groups and the manner in which they interact with each other. As many critics have pointed out, culture in the postmodern context has often been considered a commodity, and indeed it shares many attributes with commercial products. Popular culture follows many patterns of behavior derived from economics, from the simple principle of supply and demand to the creation of marketable demographics which fit certain criteria. This trend is clearly visible in contemporary fiction, especially in contemporary science fiction, and in particular in Cyberpunk fiction, which is an offshoot of pure science fiction. William Gibson is one such author who portrays such a scenario in his works, and in his The Bridge Trilogy he adds another level of interpretation to this state of affairs by describing a world that is centered on industrialization of a new kind – one that focuses on data in cyberspace. In this new world, data has become the most important commodity, and man has become nothing but a nodal point in a vast ocean of raw data, resulting in the commodification of everything, including culture. This paper will attempt to study the presence of the above-mentioned elements in William Gibson's The Bridge Trilogy. The theories applied will be Postmodernism and Cultural Studies.
Keywords: culture, commodity, cyberpunk, data, postmodern
Procedia PDF Downloads 511
24049 Impact of Safety and Quality Considerations of Housing Clients on the Construction Firms’ Intention to Adopt Quality Function Deployment: A Case of Construction Sector
Authors: Saif Ul Haq
Abstract:
The current study examines the safety and quality considerations of the clients of housing projects and their impact on the adoption of Quality Function Deployment (QFD) by construction firms. A mixed-methods research technique was used to collect and analyze the data, wherein a survey was conducted to collect data from 220 clients of housing projects in Saudi Arabia. Then, telephone and Skype interviews were conducted to collect data from 15 professionals working in the top ten real estate companies of Saudi Arabia. The data were analyzed using partial least squares (PLS) and thematic analysis techniques. The findings reveal that today’s customers prioritize the safety and quality requirements of their houses and, as a result, construction firms adopt QFD to address the needs of customers. The findings are of great importance for the clients of housing projects as well as for construction firms, as they could apply QFD in housing projects to address the safety and quality concerns of their clients.
Keywords: construction industry, quality considerations, quality function deployment, safety considerations
Procedia PDF Downloads 127
24048 Customers’ Acceptability of Islamic Banking: Employees’ Perspective in Peshawar
Authors: Tahira Imtiaz, Karim Ullah
Abstract:
This paper aims to incorporate bank employees’ perspective on the acceptability of Islamic banking by the customers of Peshawar. A qualitative approach is adopted, for which six in-depth interviews with employees of Islamic banks were conducted. The employees were asked to share their experience regarding customers’ attitudes towards the acceptability of Islamic banking. Collected data were analyzed through the thematic analysis technique and synthesized with the current literature. Through data analysis, a theoretical framework is developed which highlights the factors that drive customers towards Islamic banking, as witnessed by the employees. The practical implication of the analysis is that a new model could be developed on the basis of four determinants of human preference, namely: inner satisfaction, time, faith, and market forces. Keywords: customers’ attraction, employees’ perspective, Islamic banking, Riba
Procedia PDF Downloads 336
24047 Customized Design of Amorphous Solids by Generative Deep Learning
Authors: Yinghui Shang, Ziqing Zhou, Rong Han, Hang Wang, Xiaodi Liu, Yong Yang
Abstract:
The design of advanced amorphous solids, such as metallic glasses, with targeted properties through artificial intelligence signifies a paradigmatic shift in physical metallurgy and materials technology. Here, we developed a machine-learning architecture that facilitates the generation of metallic glasses with targeted multifunctional properties. Our architecture integrates a state-of-the-art unsupervised generative adversarial network model with supervised models, allowing general prior knowledge derived from thousands of data points across a vast range of alloy compositions to be incorporated into the creation of data points for a specific type of composition, which overcomes the common issue of data scarcity typically encountered in the design of a given type of metallic glass. Using our generative model, we have successfully designed copper-based metallic glasses which display exceptionally high hardness or a remarkably low modulus. Notably, our architecture can not only explore uncharted regions of the targeted compositional space but also permit self-improvement after experimentally validated data points are added to the initial dataset for subsequent cycles of data generation, hence paving the way for the customized design of amorphous solids without human intervention. Keywords: metallic glass, artificial intelligence, mechanical property, automated generation
Procedia PDF Downloads 66
24046 R Data Science for Technology Management
Authors: Sunghae Jun
Abstract:
Technology management (TM) is an important issue for companies seeking to improve their competitiveness. Among the many activities of TM, technology analysis (TA) is a key factor, because most decisions in the management of technology are based on the results of TA. TA analyzes the development status of a target technology using statistics or the Delphi method. TA based on Delphi depends on experts’ domain knowledge; in comparison, TA by statistics and machine learning algorithms uses objective data such as patents or papers instead of experts’ knowledge. Many quantitative TA methods based on statistics and machine learning have been studied, and these have been used for technology forecasting, technological innovation, and the management of technology. They apply diverse computing tools and many analytical methods case by case, and it is not easy to select the suitable software and statistical method for a given TA task. So, in this paper, we propose a methodology for quantitative TA using the statistical computing software R and data science to construct a general framework for TA. Through a case study, we also show how our methodology is applied in a real-world setting. This research contributes to R&D planning and technology valuation in TM areas. Keywords: technology management, R system, R data science, statistics, machine learning
Procedia PDF Downloads 460
24045 Impact of Wastewater from Outfalls of River Ganga on Germination Percentage and Growth Parameters of Bitter Gourd (Momordica charantia L.) with Antioxidant Activity Study
Authors: Sayanti Kar, Amitava Ghosh, Pritam Aitch, Gupinath Bhandari
Abstract:
An extensive seasonal analysis of wastewater was carried out at outfalls of the river Ganga in the Howrah, Hooghly, and 24 PGS (N) districts, West Bengal, India, during 2017. The morphological parameters of bitter gourd (Momordica charantia L.) were estimated under wastewater treatment. The activity of low molecular weight peptides in the 3-0.5 kDa range was studied through their extraction and purification on an ion exchange resin column with cation and anion exchangers. HPLC analysis was done for both wastewater-treated and untreated plants. The antioxidant activity (by DPPH) and the germination percentage in control and treated plants were also determined in relation to the wastewater effect. The inhibition of growth and its parameters was greatest in pre-monsoon compared to the post-monsoon and monsoon seasons. The study also helped to explore the effect of wastewater on the peptidome of bitter gourd (Momordica charantia L.). Some of these low molecular weight peptides (3-0.5 kDa) were also inhibited during wastewater treatment. Expression of particular peptides, or the absence of some peptides in the chromatogram, indicated adverse effects on the plants, which may be an indication of a stressful condition. Pre-monsoon wastewater was found to have a greater impact than that of the other two seasons. Keywords: bitter gourd (Momordica charantia L.), low molecular weight peptide, river Ganga, waste water
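The DPPH assay mentioned above quantifies antioxidant (radical-scavenging) activity from absorbance readings. A minimal sketch of the standard percent-inhibition calculation follows; the absorbance values are illustrative, not the study's measurements:

```python
def dpph_inhibition(a_control, a_sample):
    """Percent DPPH radical-scavenging activity from 517 nm absorbances."""
    return (a_control - a_sample) / a_control * 100.0

# Illustrative readings: control (DPPH + solvent) vs. plant extract.
print(dpph_inhibition(0.80, 0.35))  # about 56.25 % inhibition
```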
Procedia PDF Downloads 129
24044 Closed Greenhouse Production Systems for Smart Plant Production in Urban Areas
Authors: U. Schmidt, D. Dannehl, I. Schuch, J. Suhl, T. Rocksch, R. Salazar-Moreno, E. Fitz-Rodrigues, A. Rojano Aquilar, I. Lopez Cruz, G. Navas Gomez, R. A. Abraham, L. C. Irineo, N. G. Gilberto
Abstract:
The integration of agricultural production systems into urban areas is a challenge for the coming decades. Because of increasing greenhouse gas emissions and rising resource consumption, as well as costs in animal husbandry, the dietary habits of people in the 21st century have to focus on herbal foods. Intensive plant cultivation systems in large cities and megacities require a smart coupling of information, material, and energy flows with the urban infrastructure in terms of Horticulture 4.0. In recent years, many puzzle pieces have been developed for these closed processes at the Humboldt University. To compile these for urban plant production, they have to be optimized and networked with urban infrastructure systems. In the field of heat energy production, it was shown that with closed greenhouse technology and patented heat exchange and storage technology, energy can be provided for heating and domestic hot water supply in the city. Closed water circuits can drastically reduce the water requirements of plant production in urban areas. Ion-sensitive sensors and new disinfection methods can help keep circulating nutrient solutions in the system for a longer time in urban plant production greenhouses. Keywords: semi closed, greenhouses, urban farming, solar heat collector, closed water cycles, aquaponics
Procedia PDF Downloads 335
24043 Mixture Statistical Modeling for Predicting Mortality in Human Immunodeficiency Virus (HIV) and Tuberculosis (TB) Infection Patients
Authors: Mohd Asrul Affendi Bi Abdullah, Nyi Nyi Naing
Abstract:
The purpose of this study was to compare the negative binomial death rate (NBDR) and zero-inflated negative binomial death rate (ZINBDR) models for patients who died with HIV+TB+ and HIV+TB−. HIV and TB are serious worldwide problems, especially in developing countries. Data were analyzed by applying both the NBDR and ZINBDR models to determine which model provides the more favorable fit. The ZINBDR model is able to account for the disproportionately large number of zeros within the data and is shown to be a consistently better fit than the NBDR model. Hence, the ZINBDR model is a superior fit to the data and provides additional information regarding the death mechanisms of HIV+TB patients. The ZINBDR model is shown to be a useful tool for analyzing death rates by age category. Keywords: zero inflated negative binomial death rate, HIV and TB, AIC and BIC, death rate
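The keywords name AIC and BIC, the usual criteria for this kind of model comparison. A minimal sketch, with hypothetical fitted log-likelihoods standing in for the study's actual models (the numbers and sample size are illustrative only):

```python
import math

def aic(log_lik, k):
    """Akaike information criterion: lower is better."""
    return 2 * k - 2 * log_lik

def bic(log_lik, k, n):
    """Bayesian information criterion: lower is better."""
    return k * math.log(n) - 2 * log_lik

# Hypothetical fitted log-likelihoods for n = 200 patients; the ZINB model
# adds one parameter (the zero-inflation probability).
n = 200
ll_nb, k_nb = -512.3, 3      # negative binomial
ll_zinb, k_zinb = -498.7, 4  # zero-inflated negative binomial

for name, ll, k in [("NBDR", ll_nb, k_nb), ("ZINBDR", ll_zinb, k_zinb)]:
    print(f"{name}: AIC={aic(ll, k):.1f}  BIC={bic(ll, k, n):.1f}")
```

With these illustrative numbers, the extra zero-inflation parameter is more than paid for by the improved likelihood, so ZINBDR wins on both criteria, mirroring the abstract's conclusion.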
Procedia PDF Downloads 437
24042 Efficient Reuse of Exome Sequencing Data for Copy Number Variation Callings
Authors: Chen Wang, Jared Evans, Yan Asmann
Abstract:
With the quick evolution of next-generation sequencing techniques, whole-exome or exome-panel data have become a cost-effective way to detect small exonic mutations, but there has been a growing desire to accurately detect copy number variations (CNVs) as well. In order to address these research and clinical needs, we developed a sequencing coverage pattern-based method for copy number detection, data integrity checks, CNV calling, and visualization reports. The developed methodology includes complete automation to increase usability, genome content-coverage bias correction, CNV segmentation, data quality reports, and publication-quality images. Poor-quality outlier samples are identified and removed automatically. Multiple experimental batches are routinely detected and reduced to a clean subset of samples before analysis. Algorithm improvements were also made to improve somatic CNV detection as well as germline CNV detection in trio families. Additionally, a set of utilities is included to help users produce CNV plots for focused genes of interest. We demonstrate the somatic CNV enhancements by accurately detecting CNVs in whole-exome data from The Cancer Genome Atlas cancer samples and a lymphoma case study with paired tumor and normal samples. We also show our efficient reuse of existing exome sequencing data for improved germline CNV calling in a trio family from the phase III study of the 1000 Genomes Project, detecting CNVs with various modes of inheritance. The performance of the developed method is evaluated by comparing CNV calling results with results from other orthogonal copy number platforms.
Through our case studies, reuse of exome sequencing data for calling CNVs offers several notable benefits, including better quality control for exome sequencing data, improved joint analysis with single nucleotide variant calls, and novel genomic discovery from under-utilized existing whole-exome and custom exome panel data. Keywords: bioinformatics, computational genetics, copy number variations, data reuse, exome sequencing, next generation sequencing
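Coverage pattern-based CNV calling of the kind described above ultimately compares normalized read depths between samples. A toy sketch with simulated per-exon depths follows; the depths, the location of the simulated gain, and the threshold are illustrative stand-ins, not the authors' pipeline:

```python
import numpy as np

# Simulated per-exon read depths for a tumor/normal pair (illustrative).
rng = np.random.default_rng(3)
n_exons = 60
normal = rng.poisson(500, n_exons).astype(float)
tumor = rng.poisson(500, n_exons).astype(float)
tumor[20:30] *= 2.0  # simulate a copy number gain over exons 20-29

# Normalize each sample for library size, then take per-exon log2 ratios.
lr = np.log2((tumor / tumor.sum()) / (normal / normal.sum()))

# A crude call: flag exons whose log2 ratio exceeds a threshold.
gain = lr > 0.3
print("exons flagged as gained:", np.flatnonzero(gain))
```

Real pipelines add the steps the abstract lists on top of this core ratio: GC/content bias correction, batch detection, segmentation across adjacent exons, and quality reports.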
Procedia PDF Downloads 260
24041 Effect of Biochar, Farmyard Manure, and Lime on Soil Properties, and on Growth and Nutrient Uptake of Wheat on Acidic Soils in Southern Ethiopia
Authors: Mekdes Lulu
Abstract:
This study assessed the effect of the interactions of biochar (BC), farmyard manure (FYM), and lime on soil chemical properties and on different wheat attributes in Southern Ethiopia. The experimental design was a randomized complete block with three replications. The site significantly (p ≤ 0.05) influenced soil and wheat attributes. Biochar showed a large significant effect (p ≤ 0.05) on soil organic carbon, cation exchange capacity, and exchangeable potassium (K), while lime showed a substantial significant (p ≤ 0.05) effect on exchangeable calcium (Ca) and acidity. Farmyard manure (10 tonnes ha⁻¹) had a significant effect on soil total nitrogen (TN). Biochar and lime showed a large significant effect on soil pH and available phosphorus (P), depending on the site. All amendments showed a significant (p ≤ 0.001) effect on most wheat attributes, but the largest effect was from BC. Biochar produced highly significant (p ≤ 0.001) effects on plant height, total number of tillers and productive tillers, number of seeds per spike, aboveground biomass, grain yield, and P and K content in wheat grain and straw. We attributed the greater effect of BC on wheat attributes to its influence on soil chemical properties. We recommend long-term studies on the impact of BC alone or in combination with FYM on acid soil types. Keywords: grain yield, soil amendments, soil nutrients, soil organic carbon, Triticum aestivum
Procedia PDF Downloads 39
24040 [Keynote]: No-Trust-Zone Architecture for Securing Supervisory Control and Data Acquisition
Authors: Michael Okeke, Andrew Blyth
Abstract:
Supervisory Control and Data Acquisition (SCADA) systems, as the state of the art in Industrial Control Systems (ICS), are used in many different critical infrastructures, from smart homes to energy systems and from locomotive train systems to planes. The security of SCADA systems is vital, since many lives depend on them for daily activities, and deviation from normal operation could be disastrous to the environment as well as to lives. This paper describes how a No-Trust-Zone (NTZ) architecture could be incorporated into SCADA systems in order to reduce the chances of malicious intent. The architecture is made up of two distinct parts. The first comprises the field devices, such as sensors, PLCs, pumps, and actuators. The second part is designed following the lambda architecture, which is made up of a detection algorithm based on Particle Swarm Optimization (PSO) and the Hadoop framework for data processing and storage. Apache Spark will be a part of the lambda architecture for real-time analysis of packets for anomaly detection. Keywords: industrial control system (ICS), no-trust-zone (NTZ), particle swarm optimisation (PSO), supervisory control and data acquisition (SCADA), swarm intelligence (SI)
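The abstract names Particle Swarm Optimization as the core of the detection algorithm but gives no details, so here is only a minimal generic PSO sketch on a toy fitness function; all parameter values and the fitness function are illustrative, not the paper's detector:

```python
import numpy as np

rng = np.random.default_rng(42)

def pso(f, dim=2, n_particles=20, iters=100, w=0.7, c1=1.5, c2=1.5):
    """Minimize f with a basic particle swarm: inertia w, cognitive c1, social c2."""
    pos = rng.uniform(-5, 5, (n_particles, dim))
    vel = np.zeros((n_particles, dim))
    pbest = pos.copy()                               # each particle's best position
    pbest_val = np.apply_along_axis(f, 1, pos)
    gbest = pbest[pbest_val.argmin()].copy()         # swarm's best position
    for _ in range(iters):
        r1, r2 = rng.random((2, n_particles, dim))
        vel = w * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (gbest - pos)
        pos = pos + vel
        vals = np.apply_along_axis(f, 1, pos)
        improved = vals < pbest_val
        pbest[improved], pbest_val[improved] = pos[improved], vals[improved]
        gbest = pbest[pbest_val.argmin()].copy()
    return gbest, pbest_val.min()

# Toy fitness: squared distance from a "normal traffic profile" at the origin.
best, best_val = pso(lambda x: float(np.sum(x ** 2)))
print(best_val)  # should approach 0
```

In an anomaly-detection setting, the fitness function would instead score candidate detector parameters against labeled or baseline packet data; that mapping is the part the abstract leaves unspecified.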
Procedia PDF Downloads 346
24039 A Study on the Correlation Analysis between the Pre-Sale Competition Rate and the Apartment Unit Plan Factor through Machine Learning
Authors: Seongjun Kim, Jinwooung Kim, Sung-Ah Kim
Abstract:
The development of information and communication technology also affects human cognition and thinking, and new techniques are being tried, especially in the field of design. In architecture, new design methodologies such as machine learning or data-driven design are being applied, particularly in analyzing the factors related to the value of real estate or analyzing feasibility in the early planning stage of apartment housing. However, since the value of apartment buildings is often determined by external factors such as location and traffic conditions rather than the interior elements of the buildings, data are rarely used in the design process. Therefore, although the technical conditions are in place, it is difficult to apply data-driven design to the internal elements of apartments during the design process. As a result, designers of apartment housing have been forced to rely on designer experience or modular design alternatives rather than data-driven design at the design stage, resulting in a uniform arrangement of space in apartment houses. The purpose of this study is to propose a methodology to help designers produce apartment unit plans with high consumer preference by deriving the correlation and importance of the floor plan elements preferred by consumers through machine learning, and reflecting this information in the early design process. Data on the pre-sale competition rate and the elements of the floor plan are collected, and the correlation between the pre-sale competition rate and the independent variables is analyzed through machine learning. This analytical model can be used to review the apartment unit plan produced by the designer and to assist the designer.
Therefore, a floor plan of apartment housing with high preference can be produced, because the trained model can provide feedback on the apartment unit plan when used in floor plan design. Keywords: apartment unit plan, data-driven design, design methodology, machine learning
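The correlation analysis described above can be sketched with synthetic data; the feature names and the data-generating process below are illustrative stand-ins, not the study's variables or dataset:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical dataset: rows = past apartment listings, columns = unit-plan
# features (all names are illustrative).
n = 120
living_area = rng.uniform(60, 130, n)      # m^2
bay_count = rng.integers(2, 5, n)          # number of bays facing outward
balcony_depth = rng.uniform(1.2, 2.4, n)   # m

# Synthetic pre-sale competition rate, driven mostly by area and bay count.
competition = 0.04 * living_area + 1.5 * bay_count + rng.normal(0, 1.0, n)

features = {"living_area": living_area,
            "bay_count": bay_count.astype(float),
            "balcony_depth": balcony_depth}

# Pearson correlation of each plan feature with the competition rate.
for name, col in features.items():
    r = np.corrcoef(col, competition)[0, 1]
    print(f"{name}: r = {r:+.2f}")
```

A real application would replace the synthetic columns with measured floor plan elements and could move from correlation to a trained model (e.g., feature importances) for the feedback step the abstract describes.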
Procedia PDF Downloads 269
24038 Nonparametric Truncated Spline Regression Model on the Data of Human Development Index in Indonesia
Authors: Kornelius Ronald Demu, Dewi Retno Sari Saputro, Purnami Widyaningsih
Abstract:
The Human Development Index (HDI) is a standard measurement of a country's human development. Several factors may influence it, such as life expectancy, gross domestic product (GDP) based on the province's annual expenditure, the number of poor people, and the percentage of illiterate people. The scatter plots between HDI and the influencing factors show that the data do not follow a specific pattern or form, so the HDI data for Indonesia can be modeled with nonparametric regression. The estimate of the regression curve in a nonparametric regression model is flexible because it follows the shape of the data pattern. One nonparametric regression method is the truncated spline, a nonparametric approach based on a modification of segmented polynomial functions. The estimator of a truncated spline regression model is affected by the selection of the optimal knot points, the focus points of the truncated spline functions. The optimal knot points were determined by the minimum value of generalized cross validation (GCV). In this article, a truncated spline nonparametric regression model was applied to Human Development Index data. The best truncated spline regression model for the HDI data of Indonesia was obtained with the combination of optimal knot points 5-5-5-4. Life expectancy and the percentage of illiterate people were the significant factors for HDI in Indonesia. The coefficient of determination is 94.54%, which means the regression model fits the HDI data for Indonesia well. Keywords: generalized cross validation (GCV), Human Development Index (HDI), knots point, nonparametric regression, truncated spline
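A truncated spline fit with GCV-based knot selection, as described above, can be sketched as follows; the data are simulated (the HDI dataset itself is not reproduced here), and a linear spline with a single knot is used for clarity:

```python
import numpy as np

def truncated_basis(x, knots, degree=1):
    """Design matrix for a truncated power spline of the given degree."""
    cols = [x ** d for d in range(degree + 1)]
    cols += [np.maximum(x - k, 0.0) ** degree for k in knots]
    return np.column_stack(cols)

def gcv(x, y, knots, degree=1):
    """Generalized cross-validation score of the spline fit; lower is better."""
    X = truncated_basis(x, knots, degree)
    H = X @ np.linalg.pinv(X)                  # hat matrix
    resid = y - H @ y
    n = len(y)
    return (resid @ resid / n) / (1 - np.trace(H) / n) ** 2

# Simulated data: a piecewise-linear signal with a slope change at x = 5.
rng = np.random.default_rng(1)
x = np.sort(rng.uniform(0, 10, 80))
y = np.where(x < 5, x, 5 + 3 * (x - 5)) + rng.normal(0, 0.3, 80)

# Choose the knot with the smallest GCV among candidates.
candidates = [2.0, 5.0, 8.0]
best = min(candidates, key=lambda k: gcv(x, y, [k]))
print("best knot:", best)  # the true breakpoint 5.0 should win
```

The study's 5-5-5-4 combination corresponds to the same search carried out over several predictors with multiple knots each; the GCV criterion is identical.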
Procedia PDF Downloads 348
24037 Impact of Protean Career Attitude on Career Success with the Mediating Effect of Career Insight
Authors: Prabhashini Wijewantha
Abstract:
This study looks at the impact of employees' protean career attitude on their career success, and then at the mediating effect of career insight on that relationship. Career success is defined as the accomplishment of desirable work-related outcomes at any point in a person's work experiences over time, and it comprises two sub-variables, namely career satisfaction and perceived employability. Protean career attitude was measured using the eight items from the Self-Directedness subscale of the Protean Career Attitude scale developed by Briscoe and Hall, whereas career satisfaction was measured by the three-item scale developed by Martine, Eddleston, and Veiga. Perceived employability was also evaluated using three items, and career insight was measured using fourteen items adapted and used by De Vos and Soens. Data were collected from a sample of 300 mid-career executives in Sri Lanka using a survey strategy, and were analyzed using SPSS and AMOS version 20.0. A preliminary analysis was first performed, in which the data were screened and reliability and validity were ensured. Next, a simple regression analysis was performed to test the direct impact of protean career attitude on career success, and the hypothesis was supported. Baron and Kenny's four-step, three-regression approach for mediator testing was used to estimate the mediating effect of career insight on the above relationship, and a partial mediation was supported by the data. Finally, theoretical and practical implications are discussed. Keywords: career success, career insight, mid career MBAs, protean career attitude
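The Baron and Kenny three-regression procedure referenced above can be sketched on simulated data; the variable names follow the study, but the coefficients and data-generating process below are illustrative, not the study's estimates:

```python
import numpy as np

def slope(x, y):
    """OLS slope of y on x (with intercept)."""
    X = np.column_stack([np.ones_like(x), x])
    return np.linalg.lstsq(X, y, rcond=None)[0][1]

def slope_controlling(x, m, y):
    """OLS slope of y on x, controlling for the mediator m."""
    X = np.column_stack([np.ones_like(x), x, m])
    return np.linalg.lstsq(X, y, rcond=None)[0][1]

# Simulated data with a partial-mediation structure (illustrative only).
rng = np.random.default_rng(7)
n = 300
attitude = rng.normal(0, 1, n)                  # protean career attitude (X)
insight = 0.6 * attitude + rng.normal(0, 1, n)  # career insight (mediator M)
success = 0.3 * attitude + 0.5 * insight + rng.normal(0, 1, n)  # career success (Y)

c = slope(attitude, success)                             # step 1: X -> Y
a = slope(attitude, insight)                             # step 2: X -> M
c_prime = slope_controlling(attitude, insight, success)  # step 3: X -> Y given M

# Partial mediation: c' shrinks relative to c but stays nonzero.
print(f"c = {c:.2f}, a = {a:.2f}, c' = {c_prime:.2f}")
```

Full mediation would drive c' to roughly zero; the nonzero residual direct effect here is what "partial mediation" means in the abstract's finding.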
Procedia PDF Downloads 362