Search results for: regional data
23755 India, Pakistan and the US in the Afghan Imbroglio: The Way Forward
Authors: Saroj Kumar Rath
Abstract:
When insurgency erupted in Kashmir in 1989, it was quickly backed by Pakistan. Kashmir witnessed terrorism for more than a decade until 2004, when Indian forces decimated the militancy. After US pressure in 1992, Pakistan's terrorist training camps shifted to Afghanistan, and after 1997 al Qaeda and the Taliban took over the training of Kashmiri militants there as part of their global jihad. The Indo-Pak rivalry over the Kashmir dispute took a new turn in the aftermath of 9/11. Islamabad viewed its Afghan policy through the prism of denying India any advantage in Kabul, and through the Taliban Pakistan succeeded in denying India a presence in Kabul for a decade. After the 9/11 attacks, the Inter-Services Intelligence (ISI) saw the Northern Alliance, supported by the Americans and all of Pakistan's regional rivals – India, Iran, and Russia – claiming victory in Kabul. For Pakistan's military regime this was a strategic disaster, and it prompted the ISI to give refuge to the escaping Taliban while denying full support to Hamid Karzai. The new development in Afghanistan prompted India to re-establish the foothold it had lost nearly a decade earlier: India established diplomatic contacts with Afghanistan, supported the Karzai government, and funded aid programs. Pakistan alleged that Indian agents were training Baloch and Sindhi dissidents in Pakistan through Afghanistan. Kabul had suddenly become the new Kashmir – the new battleground for the India-Pakistan rivalry.
Keywords: Afghan imbroglio, Kashmir conflict, Indo-Pak rivalry, US policy in South Asia
Procedia PDF Downloads 432
23754 Using Manipulating Urban Layouts to Enhance Ventilation and Thermal Comfort in Street Canyons
Authors: Su Ying-Ming
Abstract:
The high density of high-rise buildings in urban areas gradually worsens the urban heat island effect. This study focuses on the relationship between urban layout and ventilation comfort in street canyons. It takes the Songjiang Nanjing Rd. area of Taipei, Taiwan as an example and evaluates a wind-environment comfort index by field measurement and computational fluid dynamics (CFD) to improve both the quality and quantity of the environment. Different factors, including street block size, building width, street width ratio, and wind direction, were used to assess the potential for ventilation. The environmental wind field was measured with environmental testing equipment (Testo 480). Block size, building width, street width ratio, and wind direction were evaluated under the condition of constant floor area, with CFD simulation used to adjust the research methods for optimizing the regional wind environment. The results show that building width influences the efficiency of outdoor ventilation, and that a larger street width improves ventilation efficiency. The study also found that block width, H/D value, and PR value are closely related. Furthermore, it shows a significant relationship between the alteration of street-block geometry and outdoor comfort.
Keywords: urban ventilation path, ventilation efficiency indices, CFD, building layout
Procedia PDF Downloads 382
23753 Automatic Thresholding for Data Gap Detection for a Set of Sensors in Instrumented Buildings
Authors: Houda Najeh, Stéphane Ploix, Mahendra Pratap Singh, Karim Chabir, Mohamed Naceur Abdelkrim
Abstract:
Building systems are highly vulnerable to different kinds of faults and failures. In fact, various faults, failures, and human behaviors can affect building performance. This paper tackles the detection of unreliable sensors in buildings. Several literature surveys on diagnosis techniques for sensor grids in buildings have been published, but all of them treat only bias and outliers; occurrences of data gaps have not received adequate attention in academia. The proposed methodology comprises automatic thresholding for data-gap detection for a set of heterogeneous sensors in instrumented buildings. Sensor measurements are assumed to form regular time series; in reality, however, sensor values are not uniformly sampled. The issue to solve is therefore: beyond which delay should each sensor be considered faulty? Time-series analysis is required to detect abnormal delays. The efficiency of the method is evaluated on measurements obtained from a real platform: an office at Grenoble Institute of Technology equipped with 30 sensors.
Keywords: building system, time series, diagnosis, outliers, delay, data gap
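The abstract's thresholding idea can be illustrated with a minimal sketch: flag a data gap whenever the delay between consecutive samples greatly exceeds the sensor's usual sampling interval. The k-times-median rule below is an assumption chosen for illustration, not the paper's actual threshold.

```python
def gap_threshold(timestamps, k=3.0):
    """Automatic threshold for data-gap detection: a gap is an inter-sample
    delay much larger than the sensor's usual sampling interval. Here the
    threshold is k times the median delay (an illustrative stand-in for the
    paper's method, which the abstract does not detail)."""
    deltas = [t2 - t1 for t1, t2 in zip(timestamps, timestamps[1:])]
    median = sorted(deltas)[len(deltas) // 2]
    return k * median, deltas

def detect_gaps(timestamps, k=3.0):
    threshold, deltas = gap_threshold(timestamps, k)
    # Return the start times of intervals whose delay exceeds the threshold.
    return [timestamps[i] for i, d in enumerate(deltas) if d > threshold]

# Irregularly sampled sensor: nominal 10 s period with one long outage.
ts = [0, 10, 21, 30, 41, 50, 180, 190, 200]
print(detect_gaps(ts))  # → [50]
```

Because the threshold is derived per sensor from its own delays, the same code handles a heterogeneous set of sensors with different sampling rates.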
Procedia PDF Downloads 243
23752 Artificial Reproduction System and Imbalanced Dataset: A Mendelian Classification
Authors: Anita Kushwaha
Abstract:
We propose a new evolutionary computational model, the Artificial Reproduction System, based on the complex process of meiotic reproduction between the male and female cells of living organisms. The Artificial Reproduction System is an attempt at a new computational intelligence approach inspired by the theoretical reproduction mechanism and by observed reproduction functions, principles, and mechanisms. A reproductive organism is programmed by genes and can be viewed as an automaton, mapping and reducing so as to create copies of those genes in its offspring. In the Artificial Reproduction System, the binding mechanism between male and female cells is studied, parameters are chosen, a network is constructed, and a feedback system for self-regularization is established. The model then applies Mendel's law of inheritance and allele-allele associations, and can be used to analyze imbalanced, multivariate, multiclass, and big data. In an experimental study, the Artificial Reproduction System is compared with state-of-the-art classifiers such as SVM, radial basis function networks, neural networks, and k-nearest neighbors on several benchmark datasets, and the comparison results indicate good performance.
Keywords: bio-inspired computation, nature-inspired computation, natural computing, data mining
Procedia PDF Downloads 272
23751 Critical Evaluation and Analysis of Effects of Different Queuing Disciplines on Packets Delivery and Delay for Different Applications
Authors: Omojokun Gabriel Aju
Abstract:
A communication network is a process of exchanging data between two or more devices via some form of transmission medium using communication protocols. The data can be in the form of text, images, audio, video, or numbers, which can be grouped into FTP, email, HTTP, VoIP, or video applications. Such data exchange is effective only if the data are accurately delivered within a specified time. Some senders will not really mind when the data are actually received, as long as receipt is acknowledged by the receiver; for other senders, the time the data take to reach the receiver is critical, as any delay could cause serious problems or even, in some cases, render the data useless. Whether data remain valid after a delay therefore depends on the type of data (information). It is thus imperative for a network device (such as a router) to be able to differentiate between packets that are time-sensitive and those that are not when they pass through the same network. This is where queuing disciplines come into play: they manage network resources when the network is designed to service widely varying types of traffic, allocating the available resources according to the configured policies. As part of its resource-allocation mechanisms, a router must implement a queuing discipline that governs how packets are buffered while waiting to be transmitted. The queuing discipline controls the transmission of these packets by determining which packets get the highest priority, which get lower priority, and which packets are dropped; it thereby controls packet latency by determining how long a packet may wait before being transmitted or dropped.
The common queuing disciplines are first-in-first-out queuing (FIFO), priority queuing (PQ), and weighted-fair queuing (WFQ). This paper critically evaluates and analyses, using the Optimized Network Evaluation Tool (OPNET) Modeller, version 14.5, the effects of these three queuing disciplines on the performance of five different applications (FTP, HTTP, email, voice, and video) within specified parameters, using packets sent, packets received, and transmission delay as performance metrics. The paper finally suggests ways in which networks can be designed to provide better transmission performance while using these queuing disciplines.
Keywords: applications, first-in-first-out queuing (FIFO), optimised network evaluation tool (OPNET), packets, priority queuing (PQ), queuing discipline, weighted-fair queuing (WFQ)
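The difference between the disciplines can be sketched in a few lines: FIFO serves packets strictly in arrival order, while PQ always serves the highest-priority class first. The packet list and priority values below are illustrative, not taken from the OPNET scenarios.

```python
import heapq
from collections import deque

# Packets as (arrival_order, priority, label); lower value = higher priority,
# so delay-sensitive voice traffic is class 0 and bulk FTP/email is class 2.
packets = [(0, 2, "email"), (1, 0, "voice"), (2, 1, "video"),
           (3, 2, "ftp"), (4, 0, "voice")]

def fifo_order(pkts):
    # FIFO: transmit strictly in arrival order, regardless of priority.
    q = deque(pkts)
    return [label for _, _, label in q]

def pq_order(pkts):
    # PQ: always transmit the highest-priority packet next,
    # breaking ties by arrival order (as a PQ scheduler would).
    heap = [(prio, arr, label) for arr, prio, label in pkts]
    heapq.heapify(heap)
    return [heapq.heappop(heap)[2] for _ in range(len(heap))]

print(fifo_order(packets))  # → ['email', 'voice', 'video', 'ftp', 'voice']
print(pq_order(packets))    # → ['voice', 'voice', 'video', 'email', 'ftp']
```

The output illustrates why PQ reduces voice delay at the expense of bulk traffic, which under heavy load can be starved entirely; WFQ (not sketched here) avoids that starvation by giving each class a weighted share of the link.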
Procedia PDF Downloads 358
23750 Radiologic Assessment of Orbital Dimensions Among Omani Subjects: Computed Tomography Imaging-Based Study
Authors: Marwa Al-Subhi, Eiman Al-Ajmi, Mallak Al-Maamari, Humood Al-Dhuhli, Srinivasa Rao
Abstract:
The orbit and its contents are affected by various pathologies and craniofacial anomalies. Sound knowledge of normal orbital dimensions is clinically essential for successful surgical outcomes and also in the field of forensic anthropology. Racial, ethnic, and regional variations in orbital dimensions have been reported. This study sought to determine the orbital dimensions of Omani subjects who had been referred for computed tomography (CT) imaging at a tertiary care hospital. A total of 273 patients' CT images were evaluated retrospectively using an electronic medical records database. The orbital dimensions were recorded in both the axial and sagittal planes of the CT images. The mean orbital index (OI) was 83.25±4.83, and the prevalent orbital type was mesoseme. The mean OI was 83.34±5.05 in males and 83.16±4.57 in females, a difference that was not statistically significant (p=0.76). A statistically significant association was observed between the right and left orbits with regard to the horizontal distance (p<0.05) and vertical distance (p<0.01) of the orbit and the OI (p<0.05). No significant difference between OI and age group was observed in either males or females. The mean interorbital distance and interzygomatic distance were 19.45±1.52 mm and 95.59±4.08 mm, respectively; both parameters were significantly higher in males (p<0.05). The results of the present study provide reference values for orbital dimensions in Omani subjects. The prevalent orbital type of Omani subjects is mesoseme, a hallmark of the white race.
Keywords: orbit, orbital index, mesoseme, ethnicity, variation
Procedia PDF Downloads 147
23749 Data Confidentiality in Public Cloud: A Method for Inclusion of ID-PKC Schemes in OpenStack Cloud
Authors: N. Nalini, Bhanu Prakash Gopularam
Abstract:
The term data security refers to the degree of resistance or protection given to information against unintended or unauthorized access. The core principles of information security are confidentiality, integrity, and availability, also referred to as the CIA triad. Cloud computing services are classified as SaaS, IaaS, and PaaS services. With cloud adoption, confidential enterprise data are moved from organization premises to an untrusted public network, and the attack surface has consequently increased manifold. Several cloud computing platforms, such as OpenStack, Eucalyptus, and Amazon EC2, allow users to build and configure public, hybrid, and private clouds. While traditional encryption based on a PKI infrastructure still works in the cloud scenario, managing public-private key pairs and trust certificates is difficult. Identity-based public key cryptography (ID-PKC) overcomes this problem by using publicly identifiable information to generate keys, and it works well with decentralized systems: users can exchange information securely without having to manage any trust information. Another advantage is that access control information (such as a role-based access control policy) can be embedded into the data, unlike in PKI, where it is handled by a separate component or system. In the OpenStack cloud platform, the Keystone service acts as the identity service for authentication and authorization and supports public key infrastructure for other services. In this paper, we explain the OpenStack security architecture and evaluate its PKI infrastructure with respect to data confidentiality. We provide a method to integrate ID-PKC schemes for securing data in transit and at rest, and we explain the key measures for safeguarding data against security attacks.
The proposed approach uses the JPBC crypto library for key-pair generation based on the IEEE P1363.3 standard and for secure communication with other cloud services.
Keywords: data confidentiality, identity-based cryptography, secure communication, OpenStack Keystone, token scoping
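The core convenience of ID-PKC, deriving a user's key from a public identity string so that no certificate directory is needed, can be sketched as follows. Real ID-PKC (as in the pairing-based schemes implemented by JPBC) uses bilinear pairings; the HMAC-based derivation here is only an illustrative stand-in for the private key generator's Extract step, not the scheme the paper integrates.

```python
import hashlib
import hmac

# Held only by the trusted Private Key Generator (PKG); illustrative value.
MASTER_SECRET = b"pkg-master-secret"

def extract_key(identity: str) -> bytes:
    """The PKG derives a per-identity key from a publicly known identity
    string (e.g. an email address), so no certificate lookup is needed.
    HMAC stands in for the pairing-based Extract step of real ID-PKC."""
    return hmac.new(MASTER_SECRET, identity.encode(), hashlib.sha256).digest()

# The same identity always yields the same key; different identities differ.
alice_key = extract_key("alice@example.org")
assert alice_key == extract_key("alice@example.org")
assert alice_key != extract_key("bob@example.org")
print(len(alice_key))  # → 32 (a 256-bit key)
```

The point of the sketch is the trust model: anyone who knows the recipient's identity can address data to them, and only the PKG can issue the corresponding private key, which is exactly the property that removes certificate management from the cloud deployment.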
Procedia PDF Downloads 384
23748 Authenticity during Conflict Reporting: The China-India Border Clash in the Indian Press
Authors: Arjun Chatterjee
Abstract:
The India-China border clash in the Galwan valley in June 2020, the first deadly skirmish between the two Asian giants in the Himalayan border area in over four decades, highlighted the need to examine the notion of 'authenticity' in journalistic practices. Information emanating from such remotely located, sparsely populated, and poorly demarcated international land borders has limited sources, restricted to official ones, which have their own narrative. Geopolitical goals and ambitions embolden narratives of nationalism in the media, and these often challenge the notion and understanding of authenticity in journalism. The Indian press, in contrast to the state-owned Chinese press, is diverse and confrontational, and narratives of nationalism are differentially interpreted, embedded, and realised within it. This paper examines how authenticity has become a variable, rather than a constant, in conflict reporting of the Sino-Indian border clash and how authenticity is interpreted similarly or differently in conflict journalism. The paper reports a qualitative textual analysis of two leading English-language newspapers, The Times of India and The Hindu, and two mainstream regional-language newspapers, Amar Ujala (Hindi) and Ananda Bazar Patrika (Bengali), to evaluate the ways in which representations of information function in conflict reporting and recontextualize (and thus change or modify the meaning of) what they represent, and with what political and cultural implications.
Keywords: India-China, framing, conflict, media narratives, border dispute
Procedia PDF Downloads 91
23747 Investigating the Relationship between Job Satisfaction, Role Identity, and Turnover Intention for Nurses in Outpatient Department
Authors: Su Hui Tsai, Weir Sen Lin, Rhay Hung Weng
Abstract:
Hospital outpatient departments serve enormous numbers of outpatients. Although the work of outpatient nursing staff does not involve the life-threatening conditions seen in ward, emergency, and critical care units, it is cumbersome and requires facing and dealing with a large number of outpatients in a short period of time. Nursing staff therefore often do not feel satisfied with their work and cannot identify with their professional role, leading to intentions to leave their jobs. The main purpose of this study is thus to explore the correlation of job satisfaction and role identity with turnover intention. The research was conducted using a questionnaire, and the subjects were outpatient nursing staff in three regional hospitals in Southern Taiwan. A total of 175 questionnaires were distributed, and 166 valid questionnaires were returned. After the data were collected, the reliability and validity of the study variables were confirmed by confirmatory factor analysis. The influence of role identity and job satisfaction on nursing staff's turnover intention was analyzed by descriptive analysis, one-way ANOVA, Pearson correlation analysis, and multiple regression analysis. Results showed that 'role identity' differed significantly by marital status. Job satisfaction with 'grasp of environment' differed significantly by level of education, and job satisfaction with 'professional growth' and 'shifts and days off' differed significantly by marital status. 'Role identity' and 'job satisfaction' were each negatively correlated with turnover intention. Job satisfaction with 'salary and benefits' and 'grasp of environment' were significant predictors of role identity: the higher the satisfaction with these aspects, the higher the role identity.
Job satisfaction with 'patient and family interaction' was a significant predictor of turnover intention: the lower this satisfaction, the higher the turnover intention. The study found that outpatient nursing staff were least satisfied with the salary structure. It is recommended that bonuses, promotion opportunities, and other incentives be established to increase the role identity of outpatient nursing staff. Because higher satisfaction with 'salary and benefits' and 'grasp of environment' predicts higher role identity, it is recommended that regular evaluations be conducted to reward nursing staff with excellent service, and that nursing staff be invited to share their work experiences and thoughts, so as to enhance their expectations of and identification with their occupational role while instilling the concept of organizational service and organizational expectations of emotional display. Because lower satisfaction with 'patient and family interaction' predicts higher turnover intention, it is recommended that interpersonal communication and workplace-violence prevention training courses be organized to improve the communication and interaction of nursing staff with patients and their families.
Keywords: outpatient, job satisfaction, turnover, intention
Procedia PDF Downloads 145
23746 Improved Distance Estimation in Dynamic Environments through Multi-Sensor Fusion with Extended Kalman Filter
Authors: Iffat Ara Ebu, Fahmida Islam, Mohammad Abdus Shahid Rafi, Mahfuzur Rahman, Umar Iqbal, John Ball
Abstract:
The application of multi-sensor fusion for enhanced distance estimation accuracy in dynamic environments is crucial for advanced driver assistance systems (ADAS) and autonomous vehicles. Limitations of single sensors such as cameras or radar in adverse conditions motivate the use of combined camera and radar data to improve reliability, adaptability, and object recognition. A multi-sensor fusion approach using an extended Kalman filter (EKF) is proposed to combine sensor measurements with a dynamic system model, achieving robust and accurate distance estimation. The research utilizes the Mississippi State University Autonomous Vehicular Simulator (MAVS) to create a controlled environment for data collection. Data analysis is performed using MATLAB. Qualitative (visualization of fused data versus ground truth) and quantitative metrics (RMSE, MAE) are employed for performance assessment. Initial results with simulated data demonstrate accurate distance estimation compared to individual sensors. The optimal sensor measurement noise variance and plant noise variance parameters within the EKF are identified, and the algorithm is validated with real-world data from a Chevrolet Blazer. In summary, this research demonstrates that multi-sensor fusion with an EKF significantly improves distance estimation accuracy in dynamic environments. This is supported by comprehensive evaluation metrics, with validation transitioning from simulated to real-world data, paving the way for safer and more reliable autonomous vehicle control.
Keywords: sensor fusion, EKF, MATLAB, MAVS, autonomous vehicle, ADAS
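The fusion scheme can be sketched for a one-dimensional range target: a constant-velocity state model with sequential updates from a camera-like and a radar-like range sensor. The measurement model here is linear, so the EKF update reduces to the standard Kalman form; all noise figures and the target distance are illustrative, not the paper's tuned values.

```python
import numpy as np

# 1-D constant-velocity target; state x = [distance, closing speed].
dt = 0.1
F = np.array([[1, dt], [0, 1]])   # state transition
Q = np.diag([0.01, 0.01])         # plant (process) noise
H = np.array([[1.0, 0.0]])        # both sensors measure distance only
R_cam, R_radar = 4.0, 1.0         # measurement noise variances (radar is better)

def kf_step(x, P, z, R):
    # Predict through the motion model.
    x = F @ x
    P = F @ P @ F.T + Q
    # Update with one sensor's range measurement z.
    S = H @ P @ H.T + R
    K = P @ H.T / S
    x = x + (K * (z - H @ x)).flatten()
    P = (np.eye(2) - K @ H) @ P
    return x, P

x, P = np.array([50.0, 0.0]), np.eye(2)
true_d = 45.0
rng = np.random.default_rng(0)
for _ in range(50):
    # Fuse camera then radar each cycle (sequential-update fusion).
    x, P = kf_step(x, P, true_d + rng.normal(0, R_cam ** 0.5), R_cam)
    x, P = kf_step(x, P, true_d + rng.normal(0, R_radar ** 0.5), R_radar)
print(round(float(x[0]), 1))  # converges near the true 45.0 m
```

The fused estimate's steady-state variance is lower than either sensor's alone, which is the quantitative benefit the paper measures with RMSE and MAE.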
Procedia PDF Downloads 41
23745 Novel Practices in Research and Innovation Management
Authors: A. Ravinder Nath, D. Jaya Prakash, T. Venkateshwarlu, P. Raja Rao
Abstract:
The introduction of novel practices in research and innovation management at the university level is likely to make a real difference in improving quality of life and boosting global competitiveness for sustainable economic growth. Establishing a specific institutional structure at the university level provides professional management and administrative expertise to the university's research community by sourcing funding opportunities, guiding grant proposal preparation and submission, and assisting with post-award reporting and regulatory compliance. In addition, it can negotiate fair and equitable research contracts. Further, it can administer research governance to support and encourage collaborations across all disciplines of the university with industry, government, community-based organizations, foundations, and associations at the local, regional, national, and international levels. Partnerships in research and innovation are powerful and much-needed tools for a knowledge-based economy, in which universities can offer much-wanted human resources to promote, foster, and sustain excellence in research. The institutes provide the needed infrastructure and expertise to work with investigators, while industry generates the required financial resources in a coordinated manner. Together, they make it possible to carry out high-end applied research and to synergize the research capabilities and professional skills of students, faculty, scientists, and the industrial workforce.
Keywords: collaborations, competitiveness, contracts, governance
Procedia PDF Downloads 395
23744 A User Identification Technique to Access Big Data Using Cloud Services
Authors: A. R. Manu, V. K. Agrawal, K. N. Balasubramanya Murthy
Abstract:
Authentication is required in stored database systems so that only authorized users can access the data and the related cloud infrastructure. This paper proposes an authentication technique using a multi-factor, multi-dimensional authentication system with multi-level security. The proposed technique is likely to be more robust, as the probability of breaking the password is extremely low. The framework uses a multi-modal biometric approach and SMS to enforce additional security measures beyond the conventional login/password system. The robustness of the technique is demonstrated mathematically using statistical analysis. This work presents the authentication system along with the user authentication architecture diagram, activity diagrams, data flow diagrams, sequence diagrams, and algorithms.
Keywords: design, implementation algorithms, performance, biometric approach
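The "extremely low probability of breaking" claim rests on multiplying the per-factor compromise probabilities, assuming the factors fail independently. A minimal sketch with illustrative figures (not the paper's own analysis):

```python
def break_probability(factor_guess_probs):
    """Probability that an attacker defeats every factor in a single attempt,
    assuming the factors are compromised independently. The figures used
    below are illustrative, not taken from the paper."""
    p = 1.0
    for q in factor_guess_probs:
        p *= q
    return p

# Password guess (1e-6), SMS one-time code (1e-4), biometric false accept (1e-3).
p = break_probability([1e-6, 1e-4, 1e-3])
print(f"{p:.0e}")  # → 1e-13
```

Each added factor multiplies the attacker's difficulty, which is why the combined biometric + SMS + password scheme is far harder to break than any single factor.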
Procedia PDF Downloads 473
23743 Input Data Balancing in a Neural Network PM-10 Forecasting System
Authors: Suk-Hyun Yu, Heeyong Kwon
Abstract:
PM-10 has recently become a social and global issue. It is one of the major air pollutants affecting human health and therefore needs to be forecast rapidly and precisely. However, PM-10 comes from various emission sources, and its concentration depends heavily on the meteorological and geographical factors of the local and global region, so forecasting PM-10 concentration is very difficult. A neural network model can be used in this case, but cases of high PM-10 concentration are rare, which makes training the neural network model difficult. In this paper, we suggest a simple input-balancing method for when the data distribution is uneven, based on the probability of appearance of the data. Experimental results show that input balancing makes the neural networks' learning easier and improves the forecasting rates.
Keywords: artificial intelligence, air quality prediction, neural networks, pattern recognition, PM-10
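The balancing idea can be sketched as inverse-frequency oversampling: rare high-concentration cases are duplicated until each class appears roughly as often as the most common one. This is a simplified stand-in for the paper's probability-of-appearance method, shown with made-up readings.

```python
from collections import Counter

def balance(samples, labels):
    """Duplicate under-represented classes so each class appears roughly as
    often as the largest one -- a simple inverse-frequency scheme standing in
    for the paper's probability-of-appearance method."""
    counts = Counter(labels)
    target = max(counts.values())
    out = []
    for s, y in zip(samples, labels):
        # Each sample is repeated in inverse proportion to its class frequency.
        out.extend([(s, y)] * (target // counts[y]))
    return out

# Many 'normal' PM-10 readings, few 'high' ones (illustrative values, µg/m³).
data = [30, 35, 32, 150, 31, 160]
labels = ["normal", "normal", "normal", "high", "normal", "high"]
balanced = balance(data, labels)
print(Counter(y for _, y in balanced))  # both classes now equally represented
```

With the balanced set, the network sees high-concentration patterns as often as normal ones during training, which is what makes its learning of the rare cases feasible.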
Procedia PDF Downloads 229
23742 Metabolic Predictive Model for PMV Control Based on Deep Learning
Authors: Eunji Choi, Borang Park, Youngjae Choi, Jinwoo Moon
Abstract:
In this study, a predictive model for estimating the metabolic rate (MET) of the human body was developed for optimal control of the indoor thermal environment. Images of human bodies engaged in indoor activities and the coordinates of human body joints were collected as the data sets used in the predictive model. A deep learning algorithm was used in the initial model, and its numbers of hidden layers and hidden neurons were optimized. Finally, the model's prediction performance was analyzed after it was trained on the collected data. In conclusion, the feasibility of MET prediction was confirmed, and developing more varied data and further refining the predictive model were proposed as directions for future study.
Keywords: deep learning, indoor quality, metabolism, predictive model
Procedia PDF Downloads 255
23741 Analysis of Brownfield Soil Contamination Using Local Government Planning Data
Authors: Emma E. Hellawell, Susan J. Hughes
Abstract:
Brownfield sites are currently being redeveloped for residential use. Information on soil contamination on these former industrial sites is collected as part of the planning process by local government. This research project analyses this untapped resource of environmental data, using site investigation data submitted to a local Borough Council in Surrey, UK. Over 150 site investigation reports were collected and interrogated to extract the relevant information. The study involved three phases. Phase 1 was the development of a database for soil contamination information from local government reports; this database contained information on the source, history, and quality of the data, together with the chemical information on the soil that was sampled. Phase 2 involved obtaining site investigation reports for developments within the study area and extracting the required information for the database. Phase 3 was the analysis and interpretation of key contaminants to evaluate their typical levels and distribution within the study area, and to relate these results to current guideline levels of risk for future site users. Preliminary results for a pilot study using a sample of the dataset have been obtained. The pilot study showed some inconsistency in the quality of the reports and the measured data, so careful interpretation of the data is required. Analysis of the information found high levels of lead in shallow soil samples, with mean and median levels exceeding the current guidance for residential use. The data also showed elevated (but below-guidance) levels of potentially carcinogenic polycyclic aromatic hydrocarbons. Of particular concern was the high detection rate for asbestos fibres, which were found at low concentrations in 25% of the soil samples tested (although the sample set was small). Contamination levels of the remaining chemicals tested were all below the guidance levels for residential site use.
These preliminary pilot study results will be expanded, and results for the whole local government area will be presented at the conference. The pilot study has demonstrated the potential for this extensive dataset to provide greater information on local contamination levels, which can inform regulators and developers and lead to more targeted site investigations, improved risk assessments, and better brownfield development.
Keywords: brownfield development, contaminated land, local government planning data, site investigation
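The kind of summary reported above (mean and median against a guideline value, plus an exceedance rate) can be sketched as follows; the lead concentrations and the guideline figure are hypothetical, not the study's data.

```python
from statistics import mean, median

# Hypothetical lead (Pb) concentrations (mg/kg) from shallow soil samples;
# the guideline value is illustrative, not the UK residential screening level.
pb_samples = [120, 310, 95, 560, 220, 480, 150, 700]
GUIDELINE = 200

print(f"mean   = {mean(pb_samples):.0f} mg/kg")
print(f"median = {median(pb_samples):.0f} mg/kg")
exceed = sum(c > GUIDELINE for c in pb_samples) / len(pb_samples)
print(f"exceedance rate = {exceed:.1%}")
```

Comparing both the mean and the median to the guideline, as the study does, guards against a few extreme hotspots dominating the picture, while the exceedance rate shows how widespread the problem is across samples.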
Procedia PDF Downloads 136
23740 Carbon Footprint Assessment Initiative and Trees: Role in Reducing Emissions
Authors: Omar Alelweet
Abstract:
Carbon emissions are quantified in terms of carbon dioxide equivalents generated through a specific activity or accumulated over the life stages of a product or service. Given the growing concern about climate change and the role of carbon dioxide emissions in global warming, this initiative aims to create awareness and understanding of the impact of human activities and to identify potential areas for improvement in managing the carbon footprint on campus. Because trees play a vital role in reducing carbon emissions by absorbing CO₂ during photosynthesis, this paper evaluated the contribution of each tree to reducing those emissions. Collecting data over an extended period is essential for monitoring carbon dioxide levels: it captures changes at different times and reveals patterns or trends in the data. By linking the data to specific activities, events, or environmental factors, it is possible to identify sources of emissions and areas where carbon dioxide levels are rising. Analyzing the collected data can provide valuable insights into ways to reduce emissions and mitigate the impact of climate change.
Keywords: sustainability, green building, environmental impact, CO₂
Procedia PDF Downloads 68
23739 Detection of Change Points in Earthquakes Data: A Bayesian Approach
Authors: F. A. Al-Awadhi, D. Al-Hulail
Abstract:
In this study, we applied a Bayesian hierarchical model to detect single and multiple change points in daily earthquake body-wave magnitude. Change point analysis is used in both backward (off-line) and forward (on-line) statistical research; in this study, it is used with the backward approach. Different types of change parameters are considered (mean, variance, or both). The posterior model and the conditional distributions for single and multiple change points are derived and implemented using the BUGS software, and the model is applicable to any set of data. The sensitivity of the model is tested using different prior and likelihood functions. Using Mb data, we concluded that between January 2002 and December 2003, three changes occurred in the mean magnitude of Mb in Kuwait and its vicinity.
Keywords: multiple change points, Markov chain Monte Carlo, earthquake magnitude, hierarchical Bayesian model
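For the single-change-point case with known variance and a flat prior, the posterior over the change location has a closed form, sketched below on synthetic magnitudes. This is a simplified stand-in for the BUGS/MCMC implementation used in the study; the data, shift size, and variance are all invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(1)
# Synthetic daily body-wave magnitudes with a shift in mean at day 40.
mb = np.concatenate([rng.normal(4.0, 0.3, 40), rng.normal(4.6, 0.3, 30)])

def changepoint_posterior(x, sigma=0.3):
    """Posterior over a single change point tau, with known variance and a
    flat prior: p(tau | x) is proportional to the profile likelihood of a
    mean shift at tau (each segment evaluated at its own sample mean)."""
    n = len(x)
    loglik = np.full(n, -np.inf)
    for tau in range(1, n - 1):
        left, right = x[:tau], x[tau:]
        ss = np.sum((left - left.mean()) ** 2) + np.sum((right - right.mean()) ** 2)
        loglik[tau] = -ss / (2 * sigma ** 2)
    post = np.exp(loglik - loglik.max())
    return post / post.sum()

post = changepoint_posterior(mb)
print(int(post.argmax()))  # most probable change day, near 40
```

Multiple change points, unknown variance, or a change in variance rather than mean require the hierarchical model and MCMC machinery described in the abstract; this closed form only covers the simplest case.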
Procedia PDF Downloads 455
23738 Empowerment Model: A Strategy for Supporting Creative Economy through Traditional Weaving in Anajiaka Village
Authors: Sita Yuliastuti Amijaya, Wiyatiningsih Wiyatiningsih, Paulus Bawole
Abstract:
Weaving was not originally a way for the traditional people of Sumba Island to earn money; it was a leisure activity carried out between farming and caring for the family, which is why the weavers are, quite understandably, women. Today, weaving crafts have become a unique potential inherent in the area, so the weaver women also have the potential to drive economic activity in the regional tourism sector. This study aims to measure the sustainability of traditional weaving business activities in Anajiaka Village, Umbu Ratu Nggay Barat, Central Sumba Regency, which support the creative economy. The analysis was performed using qualitative descriptive methods, comparing the smart living and smart economy criteria from smart city studies. The study found that business sustainability is better maintained when the weavers are bound by a joint commitment, for example by forming a craftsmen's group. Other challenges, besides the commitment of group members, are support from local government and related agencies in the form of guidance, funding, and promotion. In addition, meeting fabric order targets while maintaining the balance of family and community life is recognized as an obstacle for the craftsmen, and the group has not yet mastered modern marketing, so assistance is needed for future development.
Keywords: agriculture, craftsmen, creativepreneur, smart economy, smart living
Procedia PDF Downloads 16423737 Productivity and Structural Design of Manufacturing Systems
Authors: Ryspek Usubamatov, Tan San Chin, Sarken Kapaeva
Abstract:
Productivity of manufacturing systems depends on technological processes, the technical data of machines, and the structure of the system. Technology is represented by the machining mode and data; technical data comprise reliability parameters and auxiliary time for discrete production processes. The structure of a manufacturing system refers to the number of serial and parallel production machines and the links between them, and depends on the complexity of the technological processes. Mathematical models of the productivity rate of manufacturing systems are important attributes that make it possible to define the best structure by the criterion of productivity rate. These models are an important tool in evaluating the economic efficiency of production systems.Keywords: productivity, structure, manufacturing systems, structural design
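As an illustration of how a productivity-rate criterion can discriminate between structures, the following sketch compares serial and parallel arrangements under a constant-work assumption. The formula and the numbers are illustrative simplifications, not the authors' exact model:

```python
# Illustrative sketch (not the authors' exact model) of a productivity-
# rate criterion for structural design: q serial stations split the
# machining time, p parallel lines multiply output, and an availability
# factor derates both for reliability.

def productivity_rate(t_machining, t_auxiliary, q_serial=1, p_parallel=1,
                      availability=1.0):
    """Parts per unit time for a structure with q serial stations
    (each doing t_machining / q of the work) and p parallel lines,
    derated by an availability factor in [0, 1]."""
    cycle_time = t_machining / q_serial + t_auxiliary
    return p_parallel * availability / cycle_time

# Compare candidate structures for t_machining = 6 min, t_auxiliary = 1 min:
single = productivity_rate(6.0, 1.0)                    # one machine: 1/7
serial3 = productivity_rate(6.0, 1.0, q_serial=3)       # 3-station line: 1/3
parallel2 = productivity_rate(6.0, 1.0, p_parallel=2)   # 2 machines: 2/7
print(single, serial3, parallel2)
```

Comparing such rates across candidate structures, at equal cost, is one way to select a structure by the productivity criterion the abstract describes.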
Procedia PDF Downloads 58023736 The Effect of Tacit Knowledge for Intelligence Cycle
Authors: Bahadir Aydin
Abstract:
It is difficult to access accurate knowledge because of the mass of data, which makes the environment more and more chaotic. Data are the main pillar of intelligence. The affiliation between intelligence and knowledge is quite significant for understanding underlying truths. The data gathered from different sources can be modified, interpreted, and classified through the intelligence cycle process. This process is applied in order to progress towards wisdom as well as intelligence. Within this process, the effect of tacit knowledge is crucial. Knowledge, classified as explicit and tacit, is the key element for any purpose. Tacit knowledge can be seen as "the tip of the iceberg": it accounts for much more than we guess throughout the intelligence cycle. If the concept of the intelligence cycle is scrutinized, it can be seen to contain risks and threats as well as successes. The main purpose of all organizations is to be successful by eliminating risks and threats. Therefore, there is a need to connect or fuse existing information with the processes that can be used to develop it. Thanks to this process, decision-makers can be presented with a clear, holistic understanding as early as possible in the decision-making process. Shifting from the current traditional reactive approach to a proactive intelligence cycle approach would reduce extensive duplication of work in the organization. By applying a new result-oriented cycle and tacit knowledge, intelligence can be procured and utilized more effectively and in a timely manner.Keywords: information, intelligence cycle, knowledge, tacit knowledge
Procedia PDF Downloads 51223735 Embodying the Ecological Validity in Creating the Sustainable Public Policy: A Study in Strengthening the Green Economy in Indonesia
Authors: Gatot Dwi Hendro, Hayyan ul Haq
Abstract:
This work aims to explore strategies for embodying ecological validity in creating sustainable public policy, particularly in strengthening the green economy in Indonesia. The green economy plays an important role in supporting national development in Indonesia, as it is part of the national policy that holds primary priority in Indonesian governance. The green economy refers to national development covering strategic natural resources, such as mining, gold, oil, coal, forests, water, and marine resources, and the supporting infrastructure for production and distribution, such as factories, roads, bridges, and so forth. Thus, all activities in national development should consider sustainability. This sustainability requires the strong commitment of the national, regional, and local governments to make ecology the main requirement for issuing any policy, such as licences for mining production, or for developing and building new production and supporting infrastructure to optimise national resources. For that reason, this work focuses on strategies for embodying ecological values and norms in public policy. In detail, this work offers a method, i.e. legal techniques, for visualising and embodying norms and public policy that are ecologically valid. This ecological validity is required in order to maintain and sustain our collective life.Keywords: ecological validity, sustainable development, coherence, Indonesian Pancasila values, environment, marine
Procedia PDF Downloads 48423734 Evaluation of the CRISP-DM Business Understanding Step: An Approach for Assessing the Predictive Power of Regression versus Classification for the Quality Prediction of Hydraulic Test Results
Authors: Christian Neunzig, Simon Fahle, Jürgen Schulz, Matthias Möller, Bernd Kuhlenkötter
Abstract:
Digitalisation in production technology is a driver for the application of machine learning methods. Through predictive quality, the great potential for saving necessary quality control can be exploited through the data-based prediction of product quality and states. However, the serial use of machine learning applications is often prevented by various problems. Fluctuations occur in real production data sets, which are reflected in trends and systematic shifts over time. To counteract these problems, data preprocessing includes rule-based data cleaning, the application of dimensionality reduction techniques, and the identification of comparable data subsets to extract stable features. Successful process control of the target variables aims to centre the measured values around a mean and minimise variance. Competitive leaders claim to have mastered their processes; as a result, much real data has relatively low variance. For the training of prediction models, the highest possible generalisability is required, which this limited variability makes more difficult. The implementation of a machine learning application can itself be interpreted as a production process. The CRoss Industry Standard Process for Data Mining (CRISP-DM) is a process model with six phases that describes the life cycle of data science. As in any process, the cost of eliminating errors increases significantly with each advancing process phase. For the quality prediction of hydraulic test steps of directional control valves, the question arises in the initial phase whether regression or classification is more suitable. In the context of this work, the initial phase of CRISP-DM, business understanding, is critically compared for the use case at Bosch Rexroth with regard to regression and classification.
The use of cross-process production data along the value chain of hydraulic valves is a promising approach to predicting the quality characteristics of workpieces. Suitable methods for leakage volume flow regression and for classification for the inspection decision are applied. Impressively, classification is clearly superior to regression and achieves promising accuracies.Keywords: classification, CRISP-DM, machine learning, predictive quality, regression
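The regression-versus-classification framing weighed in the business-understanding phase can be made concrete: the same leakage measurement serves directly as a regression target, or as a classification target once an inspection threshold is applied. The threshold and values below are hypothetical, not Bosch Rexroth data:

```python
# Sketch of the framing choice discussed above: the same hydraulic test
# data can be posed as regression (predict leakage volume flow) or as
# classification (predict the pass/fail inspection decision). The
# threshold and measurements are hypothetical.

PASS_THRESHOLD = 0.35   # hypothetical max admissible leakage (l/min)

def to_labels(leakage_flows, threshold=PASS_THRESHOLD):
    """Map a continuous regression target to binary inspection labels."""
    return ["pass" if q <= threshold else "fail" for q in leakage_flows]

measured = [0.10, 0.32, 0.40, 0.05, 0.36]
print(to_labels(measured))  # -> ['pass', 'pass', 'fail', 'pass', 'fail']
```

The classification framing discards magnitude information near the threshold but matches the actual inspection decision, which is one reason it can outperform regression on the business objective.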
Procedia PDF Downloads 14323733 Implementation Association Rule Method in Determining the Layout of Qita Supermarket as a Strategy in the Competitive Retail Industry in Indonesia
Authors: Dwipa Rizki Utama, Hanief Ibrahim
Abstract:
The retail industry in Indonesia is developing very fast, and various strategies have been undertaken to boost customer satisfaction and purchase productivity in order to increase profit; one of these is the layout strategy. The purpose of this study is to determine the layout of Qita supermarket, a retailer in Indonesia, in order to improve customer satisfaction and maximize the rate of product sales as a whole, so that infrequently purchased products will also be purchased. This research uses a literature study method and the association rule method of data mining, as applied in market basket analysis. Of 160 records, 100 remained after data pre-processing; from these, the distribution of purchases across the 26 departments of the previous layout was obtained. From those data, the association rule method reveals which items customers purchase together, so the layout of the supermarket can be determined from customer behavior. Using the RapidMiner software with a minimum support of 25% and minimum confidence of 30%, the results showed that department 14 is purchased together with department 10, department 21 with department 13, department 15 with department 12, department 14 with department 12, and department 10 with department 14. From those results, a better supermarket layout than the previous one can be arranged.Keywords: industry retail, strategy, association rule, supermarket
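The support and confidence thresholds used above can be computed by hand for a single candidate rule. The department "baskets" below are illustrative, not the Qita transaction data:

```python
# Hand-rolled support/confidence computation over department baskets,
# mirroring the market-basket analysis above. Transactions are
# illustrative, not the Qita supermarket data.

def rule_stats(transactions, antecedent, consequent):
    """Support and confidence of the rule antecedent -> consequent."""
    n = len(transactions)
    both = sum(1 for t in transactions
               if antecedent in t and consequent in t)
    ante = sum(1 for t in transactions if antecedent in t)
    support = both / n
    confidence = both / ante if ante else 0.0
    return support, confidence

baskets = [
    {10, 14}, {10, 14, 21}, {13, 21}, {12, 15},
    {10, 14}, {12, 14}, {13, 21}, {10, 14, 12},
]
sup, conf = rule_stats(baskets, 14, 10)        # rule "dept 14 -> dept 10"
print(round(sup, 3), round(conf, 3))           # kept if sup>=0.25, conf>=0.30
```

A tool such as RapidMiner enumerates all frequent itemsets (e.g. with the Apriori algorithm) and keeps exactly the rules that clear both thresholds, as the study does at 25% support and 30% confidence.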
Procedia PDF Downloads 18723732 Preparing Data for Calibration of Mechanistic-Empirical Pavement Design Guide in Central Saudi Arabia
Authors: Abdulraaof H. Alqaili, Hamad A. Alsoliman
Abstract:
Through progress in pavement design development, a design method titled the Mechanistic-Empirical Pavement Design Guide (MEPDG) was developed. An evolution of the road and highway network is currently observed in Saudi Arabia as a result of increasing traffic volume, and the MEPDG is therefore being implemented for flexible pavement design by the Saudi Ministry of Transportation. Implementation of the MEPDG for local pavement design requires calibration of its distress models under local conditions (traffic, climate, and materials). This paper aims to prepare data for the calibration of the MEPDG in central Saudi Arabia. The first goal is therefore the collection of flexible pavement design data reflecting the local conditions of the Riyadh region. Since the collected data must be converted into model inputs, the main goal of this paper is the analysis of the collected data. The data analysis covers: truck classification, traffic growth factor, annual average daily truck traffic (AADTT), monthly adjustment factors (MAFi), vehicle class distribution (VCD), truck hourly distribution factors, axle load distribution factors (ALDF), the number of axles of each type (single, tandem, and tridem) per truck class, cloud cover percentage, and the road sections selected for local calibration. Detailed descriptions of the input parameters are given, leading to an approach for the successful implementation of the MEPDG. Local calibration of the MEPDG to the conditions of the Riyadh region can be performed based on the findings in this paper.Keywords: mechanistic-empirical pavement design guide (MEPDG), traffic characteristics, materials properties, climate, Riyadh
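Two of the traffic inputs listed above, AADTT and the monthly adjustment factors, can be sketched from monthly average daily truck counts. The counts and the exact factor convention are illustrative assumptions, not the Riyadh data:

```python
# Sketch of two MEPDG traffic inputs from hypothetical monthly average
# daily truck counts (not Riyadh measurements): day-weighted AADTT and
# monthly adjustment factors normalised to sum to 12.

DAYS = [31, 28, 31, 30, 31, 30, 31, 31, 30, 31, 30, 31]

def aadtt(monthly_adt):
    """Day-weighted annual average of 12 monthly ADTT values."""
    total = sum(d * adt for d, adt in zip(DAYS, monthly_adt))
    return total / sum(DAYS)

def monthly_adjustment_factors(monthly_adt):
    """MAF_i = MADTT_i / (mean monthly ADTT); factors sum to 12."""
    mean_adt = sum(monthly_adt) / 12
    return [adt / mean_adt for adt in monthly_adt]

madt = [900, 920, 950, 1000, 1040, 1100, 1150, 1120, 1050, 980, 940, 910]
print(round(aadtt(madt), 1))
print([round(f, 2) for f in monthly_adjustment_factors(madt)])
```

Analogous normalisations produce the vehicle class distribution (shares summing to 1 across classes) and the hourly distribution factors (shares summing to 1 across hours).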
Procedia PDF Downloads 22523731 Transforming Data Science Curriculum Through Design Thinking
Authors: Samar Swaid
Abstract:
Today, corporations are moving toward the adoption of Design-Thinking techniques to develop products and services, putting the consumer at the heart of the development process. One of the leading companies in Design-Thinking, IDEO (Innovation, Design, Engineering Organization), defines Design-Thinking as an approach to problem-solving that relies on a set of multi-layered skills, processes, and mindsets that help people generate novel solutions to problems. Design thinking may result in new ideas, narratives, objects, or systems. It is about redesigning systems, organizations, infrastructures, processes, and solutions in an innovative fashion based on users' feedback. Tim Brown, president and CEO of IDEO, sees design thinking as a human-centered approach that draws from the designer's toolkit to integrate people's needs, innovative technologies, and business requirements. The application of design thinking has proven to be a road to developing innovative applications, interactive systems, scientific software, and healthcare applications, and even to rethinking business operations, as in the case of Airbnb. Recently, there has been a movement to apply design thinking to machine learning and artificial intelligence to ensure creating the "wow" effect on consumers. The Association for Computing Machinery task force on Data Science programs states that "Data scientists should be able to implement and understand algorithms for data collection and analysis. They should understand the time and space considerations of algorithms. They should follow good design principles developing software, understanding the importance of those principles for testability and maintainability." However, this definition hides the user behind the machine who works on data preparation, algorithm selection, and model interpretation.
Thus, the Data Science program should include design thinking to ensure meeting user demands, generating more usable machine learning tools, and developing ways of framing computational thinking. Here, we describe the fundamentals of Design-Thinking and teaching modules for data science programs.Keywords: data science, design thinking, AI, curriculum, transformation
Procedia PDF Downloads 7923730 Economized Sensor Data Processing with Vehicle Platooning
Authors: Henry Hexmoor, Kailash Yelasani
Abstract:
We present vehicular platooning as a special case of a crowd-sensing framework in which sharing sensory information among a crowd is used for their collective benefit. After offering an abstract policy that governs processes involving a vehicular platoon, we review several common scenarios and components surrounding vehicular platooning. We then present a simulated prototype that illustrates the efficiency of road usage and vehicle travel time derived from platooning. We argue that one of the paramount benefits of platooning, overlooked elsewhere, is the substantial computational saving (i.e., the economizing benefit) in the acquisition and processing of sensory data among vehicles sharing the road: the most capable vehicle can share data gathered from its sensors with nearby vehicles grouped into a platoon.Keywords: cloud network, collaboration, internet of things, social network
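The economizing claim can be sketched with a back-of-envelope cost model: if the most capable vehicle senses for the whole platoon and shares the result, total sensing cost drops from n full scans to one scan plus n-1 cheap receptions. The cost figures are illustrative units, not measured values from the prototype:

```python
# Back-of-envelope sketch of the economizing benefit described above:
# one capable vehicle scans and shares, the rest merely receive.
# Costs are illustrative units, not measurements.

def sensing_cost(n_vehicles, scan_cost=100.0, receive_cost=5.0,
                 platooned=True):
    """Total sensing cost for n vehicles, with or without sharing."""
    if not platooned:
        return n_vehicles * scan_cost          # every vehicle scans alone
    return scan_cost + (n_vehicles - 1) * receive_cost

solo = sensing_cost(8, platooned=False)        # 8 full scans: 800.0
shared = sensing_cost(8)                       # 100 + 7 * 5 = 135.0
print(solo, shared, f"savings = {1 - shared / solo:.1%}")
```

The saving grows with platoon size and with the gap between scan and reception cost, which is why the benefit is largest for computation-heavy sensors such as lidar and vision.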
Procedia PDF Downloads 19223729 Exchange Rate Forecasting by Econometric Models
Authors: Zahid Ahmad, Nosheen Imran, Nauman Ali, Farah Amir
Abstract:
The objective of the study is to forecast the US Dollar to Pak Rupee exchange rate using time series models. For this purpose, daily exchange rates between the US and Pakistan for the period January 1, 2007 - June 2, 2017 are employed. The data are divided into in-sample and out-of-sample sets, where the in-sample data are used to estimate the models and the out-of-sample data are used to evaluate forecasts of the exchange rate. The ADF test and PP test are used to make the time series stationary. To forecast the exchange rate, ARIMA and GARCH models are applied. Among the different Autoregressive Integrated Moving Average (ARIMA) models, the best model is selected on the basis of selection criteria. Due to volatility clustering and the ARCH effect, a GARCH(1,1) model is also applied. The results of the analysis showed that ARIMA(0,1,1) and GARCH(1,1) are the most suitable models to forecast the future exchange rate. Further, the GARCH(1,1) model captured the volatility, with non-constant conditional variance in the exchange rate, and good forecasting performance. This study is useful for researchers, policymakers, and businesses, supporting decisions through accurate and timely forecasting of the exchange rate and helping them devise their policies.Keywords: exchange rate, ARIMA, GARCH, PAK/USD
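The selected ARIMA(0,1,1) model is simple enough to sketch by hand: the differenced series is an MA(1), so the one-step forecast is the last observation plus theta times the last forecast error. The theta value and the rates below are illustrative, not the fitted PKR/USD model:

```python
# Minimal hand-rolled one-step-ahead forecaster for an ARIMA(0,1,1)
# process: y_hat[t+1] = y[t] + theta * e[t], where e[t] is the forecast
# error at t. theta and the rates are illustrative, not fitted values.

def arima_011_forecasts(series, theta):
    """Return one-step forecasts aligned with series[1:], plus one
    final forecast for the period after the sample ends."""
    forecasts = [series[0]]                    # initialise with first obs
    for y in series:
        err = y - forecasts[-1]                # forecast error at t
        forecasts.append(y + theta * err)      # forecast for t + 1
    return forecasts[1:]

rates = [104.8, 104.9, 105.1, 105.0, 105.3]   # stand-in PKR/USD levels
fc = arima_011_forecasts(rates, theta=0.4)
print([round(f, 3) for f in fc])
```

In practice theta is estimated by maximum likelihood (e.g. with a statistics package) on the in-sample data, and out-of-sample errors from a loop like this one are used to judge forecasting performance; a GARCH(1,1) layer would additionally model the variance of e[t].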
Procedia PDF Downloads 55823728 Short Term Distribution Load Forecasting Using Wavelet Transform and Artificial Neural Networks
Authors: S. Neelima, P. S. Subramanyam
Abstract:
The major tool for distribution planning is load forecasting, the anticipation of the load in advance. Artificial neural networks have found wide application in load forecasting as an efficient strategy for planning and management. In this paper, the application of neural networks to the design of short term load forecasting (STLF) systems is explored. Our work presents a pragmatic methodology for STLF using a proposed two-stage model combining the wavelet transform (WT) and an artificial neural network (ANN). It is a two-stage prediction system: the input data are first decomposed by the wavelet transform, and the decomposed data, together with other inputs, are then used to train a separate neural network to forecast the load. The forecasted load is obtained by reconstruction of the decomposed data. The hybrid model has been trained and validated using load data from the Telangana State Electricity Board.Keywords: electrical distribution systems, wavelet transform (WT), short term load forecasting (STLF), artificial neural network (ANN)
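The first stage of such a pipeline can be sketched with a one-level Haar decomposition, which splits the load series into a smooth approximation and a detail component before either is fed to a network. Real implementations typically use a wavelet library and deeper decompositions; this is a dependency-free illustration on made-up load values:

```python
# One-level Haar wavelet decomposition and reconstruction, sketching
# stage one of the WT + ANN pipeline above. The signal length is
# assumed even; the load values are illustrative.

import math

def haar_decompose(signal):
    """Pairwise scaled averages (approximation) and differences (detail)."""
    s = math.sqrt(2.0)
    approx = [(signal[i] + signal[i + 1]) / s for i in range(0, len(signal), 2)]
    detail = [(signal[i] - signal[i + 1]) / s for i in range(0, len(signal), 2)]
    return approx, detail

def haar_reconstruct(approx, detail):
    """Invert haar_decompose exactly."""
    s = math.sqrt(2.0)
    out = []
    for a, d in zip(approx, detail):
        out.extend([(a + d) / s, (a - d) / s])
    return out

load = [310.0, 305.0, 298.0, 300.0, 315.0, 330.0, 340.0, 338.0]  # MW, made up
a, d = haar_decompose(load)
print([round(x, 2) for x in a])
```

In the hybrid model, separate networks forecast the approximation and detail series, and the final load forecast is obtained by reconstructing from the forecasted components, as the abstract describes.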
Procedia PDF Downloads 43523727 The Best Prediction Data Mining Model for Breast Cancer Probability in Women Residents in Kabul
Authors: Mina Jafari, Kobra Hamraee, Saied Hossein Hosseini
Abstract:
The prediction of breast cancer is one of the challenges in medicine. In this paper, we collected 528 records of information on women living in Kabul, including demographic, lifestyle, diet, and pregnancy data. There are many classification algorithms for breast cancer prediction, and we tried to find the best model, with the most accurate results and the lowest error rate. We evaluated several common supervised data mining algorithms to find the best model for predicting breast cancer among Afghan women living in Kabul, with the mammography result as the target variable. For evaluating these algorithms we used cross validation, an assured method for measuring the performance of models. After comparing the error rate and accuracy of three models (Decision Tree, Naive Bayes, and Rule Induction), the Decision Tree, with an accuracy of 94.06% and an error rate of 15%, was found to be the best model for predicting breast cancer based on the health care records.Keywords: decision tree, breast cancer, probability, data mining
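The cross-validation procedure used to compare the three models can be sketched independently of any particular learner. The "model" below is a trivial majority-class baseline on synthetic labels, purely to show the evaluation mechanics; it is not the Decision Tree on the Kabul dataset:

```python
# Hand-rolled k-fold cross-validation of the kind used above to compare
# classifiers. The learner here is a majority-class baseline on
# synthetic labels, only to illustrate the fold mechanics.

def k_fold_indices(n, k):
    """Yield (train_idx, test_idx) pairs for k contiguous folds."""
    fold = n // k
    for i in range(k):
        lo = i * fold
        hi = (i + 1) * fold if i < k - 1 else n
        test = list(range(lo, hi))
        train = list(range(0, lo)) + list(range(hi, n))
        yield train, test

def cross_val_accuracy(labels, k=5):
    """Mean test accuracy of a majority-class predictor over k folds."""
    accs = []
    for train, test in k_fold_indices(len(labels), k):
        train_labels = [labels[i] for i in train]
        majority = max(set(train_labels), key=train_labels.count)
        correct = sum(1 for i in test if labels[i] == majority)
        accs.append(correct / len(test))
    return sum(accs) / len(accs)

labels = [0] * 40 + [1] * 10   # synthetic, imbalanced like many screens
print(cross_val_accuracy(labels, k=5))
```

Contiguous folds on sorted labels, as here, show why practical toolkits shuffle or stratify the folds: the last fold contains only the minority class and the baseline scores zero on it.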
Procedia PDF Downloads 13623726 Image Steganography Using Least Significant Bit Technique
Authors: Preeti Kumari, Ridhi Kapoor
Abstract:
In any communication, security is the most important issue in today's world. Steganography is the process of hiding important data inside other data, such as text, audio, video, and images. The interest in this topic lies in providing availability, confidentiality, integrity, and authenticity of data. Steganographic techniques embed hidden content in unremarkable cover media so as not to arouse the suspicion of eavesdroppers, third parties, or hackers. Many methods of compression, encryption, decryption, and embedding are used in digital image steganography. Compression introduces noise into the image; to withstand this noise, the LSB insertion technique is used. The performance of the proposed embedding system with respect to providing security for the secret message and robustness is discussed. We also demonstrate the maximum steganographic capacity and visual distortion.Keywords: steganography, LSB, encoding, information hiding, color image
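The LSB insertion technique itself is small enough to sketch: each bit of the secret message replaces the least significant bit of one cover byte (e.g. one colour channel of a pixel), changing that byte by at most 1. This is a bare illustration; a practical system would add encryption, capacity checks, and pixel-selection schemes:

```python
# Minimal LSB embedding sketch: one message bit per cover byte.
# Illustrative only; real systems layer encryption and robustness
# measures on top of this.

def embed(cover, message):
    """Write message bits (MSB first) into the LSBs of cover bytes."""
    bits = [(byte >> i) & 1 for byte in message for i in range(7, -1, -1)]
    assert len(bits) <= len(cover), "cover too small for message"
    stego = list(cover)
    for i, bit in enumerate(bits):
        stego[i] = (stego[i] & 0xFE) | bit     # overwrite only the LSB
    return stego

def extract(stego, n_chars):
    """Read n_chars bytes back out of the LSBs."""
    bits = [b & 1 for b in stego[:n_chars * 8]]
    return bytes(sum(bit << (7 - i) for i, bit in enumerate(bits[k:k + 8]))
                 for k in range(0, len(bits), 8))

cover = list(range(50, 120))                   # stand-in pixel bytes
stego = embed(cover, b"hi")
print(extract(stego, 2))                       # -> b'hi'
assert max(abs(c - s) for c, s in zip(cover, stego)) <= 1  # imperceptible
```

The per-byte change of at most 1 is what keeps LSB embedding visually imperceptible, and one bit per cover byte gives the maximum capacity figure the abstract refers to: one eighth of the cover size.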
Procedia PDF Downloads 472