Search results for: type-2 fuzzy sets
1270 Training Volume and Myoelectric Responses of Lower Body Muscles with Differing Foam Rolling Periods
Authors: Humberto Miranda, Haroldo G. Santana, Gabriel A. Paz, Vicente P. Lima, Jeffrey M. Willardson
Abstract:
Foam rolling is a practice that has increased in popularity before and after strength training. The purpose of this study was to compare the acute effects of different foam rolling periods for the lower body muscles on subsequent performance (total repetitions and training volume), myoelectric activity, and rating of perceived exertion in trained men. Fourteen trained men (26.2 ± 3.2 years, 178 ± 0.04 cm height, 82.2 ± 10 kg weight, and body mass index 25.9 ± 3.3 kg/m²) volunteered for this study. Four repetition maximum (4-RM) loads were determined for the hexagonal bar deadlift and 45º angled leg press during test and retest sessions over two nonconsecutive days. Five experimental protocols were applied in a randomized design: a traditional protocol (control), a resistance training session without prior foam rolling; or resistance training sessions performed following one (P1), two (P2), three (P3), or four (P4) sets of 30 s foam rolling for the lower extremity musculature. Subjects were asked to roll over the medial and lateral aspects of each muscle group with as much pressure as possible. All foam rolling was completed at a cadence of 50 bpm. These procedures were performed unilaterally on each side, as follows. Quadriceps: between the apex of the patella and the ASIS; hamstrings: between the gluteal fold and popliteal fossa; triceps surae: between the popliteal fossa and calcaneus tendon. The resistance training consisted of five sets with 4-RM loads and two-minute rest intervals between sets, and a four-minute rest interval between the hexagonal bar deadlift and the 45º angled leg press. The number of repetitions completed and the myoelectric activity of the vastus lateralis (VL), vastus medialis oblique (VMO), semitendinosus (SM), and medial gastrocnemius (GM) were recorded, as well as the rating of perceived exertion for each protocol. There were no differences between the protocols in the total repetitions for the hexagonal bar deadlift (Control - 16.2 ± 5.9; P1 - 16.9 ± 5.5; P2 - 19.2 ± 5.7; P3 - 19.4 ± 5.2; P4 - 17.2 ± 8.2) (p > 0.05) or the 45º angled leg press (Control - 23.3 ± 9.7; P1 - 25.9 ± 9.5; P2 - 29.1 ± 13.8; P3 - 28.0 ± 11.7; P4 - 30.2 ± 11.2). Similar results between protocols were also noted for myoelectric activity (p > 0.05) and rating of perceived exertion (p > 0.05). Therefore, the results of the present study indicated no deleterious effects of foam rolling on performance, myoelectric activity, or rating of perceived exertion during lower body resistance training.
Keywords: self myofascial release, foam rolling, electromyography, resistance training
Procedia PDF Downloads 226
1269 Semantic Indexing Improvement for Textual Documents: Contribution of Classification by Fuzzy Association Rules
Authors: Mohsen Maraoui
Abstract:
With the aim of improving natural language processing applications such as information retrieval, machine translation, and lexical disambiguation, we focus on a statistical approach to semantic indexing for multilingual text documents based on conceptual network formalism. We propose to use this formalism as an indexing language to represent descriptive concepts and their weighting; these concepts represent the content of the document. Our contribution is based on two steps. In the first step, we propose the extraction of index terms using the multilingual lexical resource EuroWordNet (EWN). In the second step, we pass from the representation of index terms to the representation of index concepts through conceptual network formalism. This network is generated using the EWN resource and passes through a classification step based on an association rules model, in an attempt to discover the non-taxonomic or contextual relations between the concepts of a document. These are latent relations buried in the text and carried by the semantic context of the co-occurrence of concepts in the document. Our proposed indexing approach can be applied to text documents in various languages because it is based on a linguistic method adapted to the language through a multilingual thesaurus. Next, we apply the same statistical process regardless of the language in order to extract the significant concepts and their associated weights. We show that the proposed indexing approach provides encouraging results.
Keywords: concept extraction, conceptual network formalism, fuzzy association rules, multilingual thesaurus, semantic indexing
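As a rough illustration of the classification step described above, the sketch below mines crisp association rules between concepts that co-occur in documents; the concept sets, thresholds, and rule form are hypothetical stand-ins, and the fuzzy weighting used in the paper is omitted for brevity.

```python
from itertools import combinations

# Hypothetical concept sets extracted from four documents via EuroWordNet.
docs = [
    {"bank", "loan", "interest"},
    {"bank", "river", "water"},
    {"loan", "interest", "credit"},
    {"bank", "loan", "credit"},
]

MIN_SUPPORT, MIN_CONFIDENCE = 0.5, 0.7

def support(itemset):
    """Fraction of documents containing every concept in the itemset."""
    return sum(itemset <= d for d in docs) / len(docs)

# Candidate concept pairs that meet the minimum support threshold.
concepts = set().union(*docs)
frequent_pairs = [
    frozenset(p) for p in combinations(sorted(concepts), 2)
    if support(frozenset(p)) >= MIN_SUPPORT
]

# Derive rules A -> B; confidence = support(A and B) / support(A).
for pair in frequent_pairs:
    a, b = sorted(pair)
    for lhs, rhs in ((a, b), (b, a)):
        conf = support(pair) / support(frozenset({lhs}))
        if conf >= MIN_CONFIDENCE:
            print(f"{lhs} -> {rhs}  (support={support(pair):.2f}, confidence={conf:.2f})")
```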
Procedia PDF Downloads 141
1268 Efficient Fuzzy Classified Cryptographic Model for Intelligent Encryption Technique towards E-Banking XML Transactions
Authors: Maher Aburrous, Adel Khelifi, Manar Abu Talib
Abstract:
Transactions performed by financial institutions on a daily basis require XML encryption on a large scale. Encrypting large volumes of messages in full results in both performance and resource issues. In this paper, a novel approach is presented for securing financial XML transactions using classification data mining (DM) algorithms. Our strategy defines the complete process of classifying XML transactions with a set of classification algorithms; the classified XML documents are then processed at a later stage using element-wise encryption. Classification algorithms were used to identify the XML transaction rules and factors in order to classify the message content and fetch the important elements within it. We implemented four classification algorithms to fetch the importance level value within each XML document. Classified content is processed using element-wise encryption for selected parts with "High", "Medium", or "Low" importance level values. Element-wise encryption is performed using the AES symmetric encryption algorithm and a proposed modified AES algorithm that overcomes the problem of computational overhead: the SubBytes and ShiftRows steps remain as in the original AES, while the MixColumns operation is replaced by a 128-permutation operation followed by the AddRoundKey operation. An implementation was conducted using a data set fetched from an e-banking service to demonstrate system functionality and efficiency. Results from our implementation showed a clear improvement in processing time when encrypting XML documents.
Keywords: XML transaction, encryption, Advanced Encryption Standard (AES), XML classification, e-banking security, fuzzy classification, cryptography, intelligent encryption
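The modified AES round described above is specific to the paper, so the sketch below substitutes standard AES-GCM from the Python cryptography package to illustrate the element-wise idea: only elements classified as "High" importance are encrypted. The sample transaction, tag names, and importance map are hypothetical.

```python
import os
from base64 import b64encode
from xml.etree import ElementTree as ET

from cryptography.hazmat.primitives.ciphers.aead import AESGCM

# Hypothetical transaction; importance levels would come from the classifiers.
xml_doc = "<txn><account>12345</account><amount>990.50</amount><memo>rent</memo></txn>"
importance = {"account": "High", "amount": "High", "memo": "Low"}

key = AESGCM.generate_key(bit_length=128)
aesgcm = AESGCM(key)

root = ET.fromstring(xml_doc)
for elem in root:
    if importance.get(elem.tag) == "High":       # encrypt only sensitive elements
        nonce = os.urandom(12)                   # unique nonce per element
        ct = aesgcm.encrypt(nonce, elem.text.encode(), None)
        elem.text = b64encode(nonce + ct).decode()
        elem.set("enc", "aes-gcm")               # mark the element as encrypted

print(ET.tostring(root, encoding="unicode"))     # "Low" elements stay in the clear
```

Encrypting per element rather than the whole document is what buys the processing-time improvement the abstract reports: the bulk of the message never touches the cipher.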
Procedia PDF Downloads 412
1267 Scrutiny and Analytical Solving of Nonlinear Differential Equations in the Engineering Fields of Fluids, Heat, Mass and Wave by the New Method AGM
Authors: Mohammadreza Akbari, Sara Akbari, Davood Domiri Ganji, Pooya Solimani, Reza Khalili
Abstract:
As is well known, most engineering systems behave nonlinearly in practice (especially in heat, fluid, and mass problems), and solving these problems analytically (rather than numerically) is difficult, complex, and sometimes impossible; fluid and gas wave problems, for example, cannot be solved numerically when no boundary conditions are available. Accordingly, in this paper we present an innovative approach, which we have named the Akbari-Ganji Method (AGM), that can solve sets of coupled nonlinear differential equations (ODEs and PDEs) with high accuracy and a simple solution procedure. The achieved solutions are compared with those of a numerical method (fourth-order Runge-Kutta), with other methods such as HPM and ADM, and with exact solutions. We argue that AGM can be of great value to researchers, professors, and students in engineering and the basic sciences, because with the AGM coding system one can analytically solve complicated linear and nonlinear differential equations, so that there is little difficulty in solving nonlinear ODEs and PDEs. In this paper, we investigate and solve four types of nonlinear differential equations with the AGM method: (1) heat and fluid equations, (2) unsteady nonlinear partial differential equations, (3) coupled nonlinear partial differential equations in the wave equation, and (4) nonlinear integro-differential equations.
Keywords: new method AGM, sets of coupled nonlinear equations in engineering fields, wave equations, integro-differential, fluid and thermal
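The AGM closed-form procedure itself is not reproduced here, but the numerical benchmark the authors compare against, fourth-order Runge-Kutta, can be sketched in a few lines; the test equation y' = -y² is a hypothetical stand-in chosen because its exact solution y = 1/(1 + t) is known.

```python
def rk4(f, t0, y0, h, steps):
    """Classical fourth-order Runge-Kutta integrator for y' = f(t, y)."""
    t, y = t0, y0
    out = [(t, y)]
    for _ in range(steps):
        k1 = f(t, y)
        k2 = f(t + h / 2, y + h * k1 / 2)
        k3 = f(t + h / 2, y + h * k2 / 2)
        k4 = f(t + h, y + h * k3)
        y += (h / 6) * (k1 + 2 * k2 + 2 * k3 + k4)
        t += h
        out.append((t, y))
    return out

# Sample nonlinear ODE: y' = -y**2 with y(0) = 1; compare against the
# exact solution y = 1 / (1 + t) to see the numerical error AGM would be
# benchmarked against.
for t, y in rk4(lambda t, y: -y * y, 0.0, 1.0, 0.1, 10):
    print(f"t={t:.1f}  rk4={y:.6f}  exact={1 / (1 + t):.6f}")
```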
Procedia PDF Downloads 548
1266 Optimum Performance of the Gas Turbine Power Plant Using Adaptive Neuro-Fuzzy Inference System and Statistical Analysis
Authors: Thamir K. Ibrahim, M. M. Rahman, Marwah Noori Mohammed
Abstract:
This study deals with modeling and performance enhancement of a gas-turbine combined cycle power plant. Clean and safe energy is one of the greatest challenges in meeting the requirements of a green environment. These requirements have eroded the long-standing dominance of the steam turbine (ST) in world power generation, and the gas turbine (GT) is set to replace it. Therefore, it is necessary to predict the characteristics of the GT system and optimize its operating strategy by developing a simulation system. An integrated model and simulation code for exploiting the performance of gas turbine power plants was developed in MATLAB. The performance code for heavy-duty GT and CCGT power plants was validated against the real Baiji GT and MARAFIQ CCGT plants, and the results were satisfactory. A new correlation was considered for all types of simulation data, whose coefficient of determination (R²) was calculated as 0.9825. Some of the most recently published correlations were checked on the Baiji GT plant, and error analysis was applied. GT performance was judged by particular parameters selected from the simulation model, and the Adaptive Neuro-Fuzzy Inference System (ANFIS), an advanced optimization technology, was also utilized. The best thermal efficiency and power output attained were about 56% and 345 MW, respectively. Thus, the operating conditions and ambient temperature strongly influence the overall performance of the GT. The optimum efficiency and power are found at higher turbine inlet temperatures. The developed models are powerful tools for estimating the overall performance of GT plants.
Keywords: gas turbine, optimization, ANFIS, performance, operating conditions
Procedia PDF Downloads 426
1265 Observation on Microbiological Profile of Type 2 Diabetic Foot Ulcer and Its Antimicrobial Sensitivity Pattern in a Tertiary Care Hospital in Eastern India
Authors: Pampita Chakraborty, Sukumar Mukherjee
Abstract:
Diabetes mellitus (DM) is a commonly encountered metabolic disorder in clinical practice. An estimated 25 percent of DM patients develop foot problems. Foot ulceration and infection are among the major causes of morbidity, hospitalization, or even amputation. Objective: To isolate and identify bacterial pathogens in diabetic foot ulcers (DFU) and to observe their antimicrobial sensitivity pattern. Methodology: A prospective study was conducted for a period of 9 months at the Department of Microbiology, GD Hospital & Diabetes Institute, Kolkata. 75 DFU patients were recruited into the study. Specimens for microbiological studies obtained from the ulcer base were examined as gram-stained smears and cultured aerobically on nutrient agar, blood agar, and MacConkey agar plates. Antimicrobial sensitivity testing was performed by disc diffusion techniques according to CLSI guidelines. Result: In this study, out of 75 cases, 73% (55/75) were male and 27% (20/75) were female, with a mean (SD) age of 51.11 (±10) years. Out of 75 pus cultures, 63 (84%) showed growth of microorganisms, yielding a total of 81 bacterial isolates, with 71.42% monomicrobial and 28.57% polymicrobial infections. Out of the 81 isolates, 53 (65.43%) were gram-negative and 21 (25.92%) were gram-positive. E. coli was the most common isolate at 21 (26%), followed by Staphylococcus aureus 15 (18.5%), Klebsiella pneumoniae 14 (17.28%), Pseudomonas aeruginosa 12 (14.81%), Proteus spp. 3 (3.70%), and Enterococcus faecalis 6 (7.40%). 75% of the gram-negative organisms were extended-spectrum beta-lactamase (ESBL) producers, and around 20% of Klebsiella and Proteus spp. were carbapenemase producers. Among the gram-positives, around 50% of S. aureus were MRSA, sensitive only to vancomycin, teicoplanin, and linezolid. Conclusion: A higher prevalence of monomicrobial gram-negative than gram-positive bacteria in DFU was observed. This study emphasizes that beta-lactam antibiotics should not be the empirical treatment of choice for gram-negative isolates; alternatives such as carbapenems and amikacin could be better options. On the other hand, vancomycin and linezolid are preferred for most infections with gram-positive aerobes. Continuous surveillance of resistant bacteria is required for empiric therapy.
Keywords: antibiotic resistant, antimicrobial susceptibility, diabetic foot ulcer, surveillance
Procedia PDF Downloads 369
1264 The Application of Sequence Stratigraphy to the Sajau (Pliocene) Coal Distribution in Berau Basin, Northeast Kalimantan, Indonesia
Authors: Ahmad Helman Hamdani, Diana Putri Hamdiana
Abstract:
The Sajau coal measures of Berau Basin, northeastern Kalimantan were deposited within a range of facies associations spanning a spectrum of settings from fluvial to marine. The transitional to terrestrial coal measures are dominated by siliciclastics, but they also contain three laterally extensive marine bands (mudstone). These bands act as marker horizons that enable correlation between fully marine and terrestrial facies. Examination of this range of facies and their sedimentology has enabled the development of a high-resolution sequence stratigraphic framework. Set against the established backdrop of third-order Sajau transgression, nine fourth-order sequences are recognized. Results show that, in the composite sequences, peat accumulation predominantly correlates in transitional areas with early transgressive sequence sets (TSS) and highstand sequence sets (HSS), while in more landward areas it correlates with the middle TSS to late HSS. Differences in peat accumulation regimes within the sequence stratigraphic framework are attributed to variations in subsidence and background siliciclastic input rates in different depositional settings, with these combining to produce differences in the rate of accommodation change. The preservation of coal resources in the middle to late HSS in this area was most likely related to the rise of the regional base level throughout the Sajau.
Keywords: sequence stratigraphy, coal, Pliocene, Berau basin
Procedia PDF Downloads 466
1263 Using Collaborative Planning to Develop a Guideline for Integrating Biodiversity into Land Use Schemes
Authors: Sagwata A. Manyike, Hulisani Magada
Abstract:
The South African National Biodiversity Institute is in the process of developing a guideline which sets out how biodiversity can be incorporated into land use (zoning) schemes. South Africa promulgated its Spatial Planning and Land Use Management Act in 2015, and the act seeks, amongst other things, to bridge the gap between spatial planning and land use management within the country. In addition, the act requires local governments to develop wall-to-wall land use schemes for their entire jurisdictions, as they had previously only developed them for their urban areas. At the same time, South Africa has a rich history of systematic conservation planning whereby Critical Biodiversity Areas and Ecological Support Areas have been spatially delineated at a scale appropriate for spatial planning and land use management by local government. South Africa is also in the process of spatially delineating ecological infrastructure, defined as naturally occurring ecosystems which provide valuable services to people such as water and climate regulation, soil formation, and disaster risk reduction. The Biodiversity and Land Use Project, which is funded by the Global Environment Facility through the United Nations Development Programme, is seeking to explore ways in which biodiversity information and ecological infrastructure can be incorporated into the spatial planning and land use management systems of local governments. Towards this end, the Biodiversity and Land Use Project has developed a guideline which sets out how local governments can integrate biodiversity into their land-use schemes, not only as a way of ensuring sustainable development but also as a way of helping them prepare for climate change. In addition, by incorporating biodiversity into land-use schemes, the project is exploring new ways of protecting biodiversity through land use schemes. The Guideline for Incorporating Biodiversity into Land Use Schemes was developed in response to the fact that the national land use scheme guidelines only indicate that local governments need to incorporate biodiversity, without explaining how this can be achieved. The national guideline also fails to specify which biodiversity-related layers are compatible with which land uses, or what the benefits of incorporating biodiversity into the schemes will be for a local government. The guideline therefore sets out an argument for why biodiversity is important in land management processes and proceeds to provide step-by-step guidance on how schemes can integrate priority biodiversity layers. This guideline will further be added as an addendum to the national land use guidelines. Although the planning act calls for local governments to have wall-to-wall schemes within 5 years of its enactment, many municipalities will not meet this deadline, and so this guideline will support them in the development of their new schemes.
Keywords: biodiversity, climate change, land use schemes, local government
Procedia PDF Downloads 178
1262 Improving Comfort and Energy Mastery: Application of a Method Based on Morpho-Energetic Indicators
Authors: Khadidja Rahmani, Nahla Bouaziz
Abstract:
The climate change and economic crisis currently underway are at the origin of many issues and problems related, directly or indirectly, to the domain of energy and environment. Since urban space is the core element and the key to solving the current problem, particular attention is given to it in this study, for the opportunities it provides that can be invested to mitigate, even a little, this disastrous and worrying situation, especially in the face of the requirements of sustainable development. Indeed, the purpose of this work is to develop a method that will allow us to guide designers towards projects with a certain degree of thermo-aeraulic comfort while requiring minimum energy consumption. In this context, architects, urban planners, and energy engineers must collaborate to establish a method based on indicators for the improvement of urban environmental quality (thermo-aeraulic comfort), correlated with a reduction in the energy demand of the entities that make up this environment, in areas with a sub-humid climate. In order to test the feasibility of the method developed in this work and to validate it, we carried out a series of computer-based simulations. This research allows us to evaluate the impact of using these indicators in the design of urban sets, in economic and ecological terms. Using this method, we show that an urban design that is carefully considered energetically can contribute significantly to the preservation of the environment and the reduction of energy consumption.
Keywords: comfort, energy consumption, energy mastery, morpho-energetic indicators, simulation, sub-humid climate, urban sets
Procedia PDF Downloads 276
1261 Harmonic Assessment and Mitigation in Medical Diagnosis Equipment
Authors: S. S. Adamu, H. S. Muhammad, D. S. Shuaibu
Abstract:
Poor power quality in electrical power systems can cause medical equipment at healthcare centres to malfunction and produce wrong medical diagnoses. Equipment such as X-ray and computerized axial tomography machines can pollute the system due to their high level of harmonics production, which may cause a number of undesirable effects such as heating, equipment damage, and electromagnetic interference. The conventional approach to mitigation uses passive inductor/capacitor (LC) filters, which have drawbacks such as large size, resonance problems, and fixed compensation behaviour. Current solutions generally employ active power filters using suitable control algorithms. This work focuses on assessing the level of total harmonic distortion (THD) in medical facilities and various ways of mitigating it, using the radiology unit of an existing hospital as a case study. The harmonics are measured with a power quality analyzer at the point of common coupling (PCC). The measured THD levels are found to be higher than the IEEE 519-1992 standard limits. The system is then modelled as a harmonic current source using MATLAB/SIMULINK. To mitigate the unwanted harmonic currents, a shunt active filter is developed using a synchronous detection algorithm to extract the fundamental component of the source currents. A fuzzy logic controller is then developed to control the filter. The THD values without the active power filter are validated against the measured values. The THD values with the developed filter show that the harmonics are now within the recommended limits.
Keywords: power quality, total harmonics distortion, shunt active filters, fuzzy logic
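As a sketch of the assessment step, THD can be estimated from an FFT of the measured current as the RMS of the harmonic magnitudes over the fundamental; the sampled waveform below is a hypothetical load current, not data from the case-study hospital.

```python
import numpy as np

def thd(signal, fs, f0):
    """Total harmonic distortion: RMS of harmonics 2..7 over the fundamental."""
    spectrum = np.abs(np.fft.rfft(signal)) / len(signal)
    freqs = np.fft.rfftfreq(len(signal), d=1 / fs)

    def mag(f):  # magnitude of the FFT bin nearest frequency f
        return spectrum[np.argmin(np.abs(freqs - f))]

    fund = mag(f0)
    harmonics = [mag(k * f0) for k in range(2, 8)]
    return np.sqrt(sum(h * h for h in harmonics)) / fund

# Hypothetical load current: 50 Hz fundamental plus 3rd and 5th harmonics,
# roughly the kind of spectrum a diagnostic X-ray load might inject.
fs = 10_000
t = np.arange(0, 0.2, 1 / fs)
i_load = (np.sin(2 * np.pi * 50 * t)
          + 0.20 * np.sin(2 * np.pi * 150 * t)
          + 0.10 * np.sin(2 * np.pi * 250 * t))
print(f"THD = {100 * thd(i_load, fs, 50):.1f} %")   # ~22.4 % = sqrt(0.2^2 + 0.1^2)
```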
Procedia PDF Downloads 479
1260 A Tool for Facilitating an Institutional Risk Profile Definition
Authors: Roman Graf, Sergiu Gordea, Heather M. Ryan
Abstract:
This paper presents an approach for the easy creation of an institutional risk profile for endangerment analysis of file formats. The main contribution of this work is the employment of data mining techniques to support the setup of risk factors with only the values most important to a particular organisation. Subsequently, the risk profile employs fuzzy models and associated configurations for the file format metadata aggregator to support digital preservation experts with a semi-automatic estimation of the endangerment level of file formats. Our goal is to make use of a domain expert knowledge base, aggregated from a digital preservation survey, in order to detect preservation risks for a particular institution. Another contribution is support for visualisation and analysis of risk factors for a required dimension. The proposed methods improve the visibility of risk factor information and the quality of the digital preservation process. The presented approach is meant to facilitate decision making for the preservation of digital content in libraries and archives, using domain expert knowledge and file format metadata automatically aggregated from linked open data sources. To facilitate decision making, the aggregated information about the risk factors is presented as a multidimensional vector, with the goal of visualising particular dimensions of this vector for analysis by an expert. A sample risk profile calculation and the visualisation of some risk factor dimensions are presented in the evaluation section.
Keywords: digital information management, file format, endangerment analysis, fuzzy models
Procedia PDF Downloads 406
1259 Effective Stacking of Deep Neural Models for Automated Object Recognition in Retail Stores
Authors: Ankit Sinha, Soham Banerjee, Pratik Chattopadhyay
Abstract:
Automated product recognition in retail stores is an important real-world application in the domain of computer vision and pattern recognition. In this paper, we consider the problem of automatically identifying the classes of products placed on racks in retail stores from an image of the rack and information about the query/product images. We improve upon existing approaches in terms of effectiveness and memory requirement by developing a two-stage object detection and recognition pipeline comprising a Faster-RCNN-based object localizer that detects the object regions in the rack image and a ResNet-18-based image encoder that classifies the detected regions into the appropriate classes. Each of the models is fine-tuned using appropriate data sets for better prediction, and data augmentation is performed on each query image to prepare an extensive gallery set for fine-tuning the ResNet-18-based product recognition model. This encoder is trained using a triplet loss function following the strategy of online hard-negative mining for improved prediction. The proposed models are lightweight and can be connected in an end-to-end manner during deployment to automatically identify each product object placed in a rack image. Extensive experiments using the Grozi-32k and GP-180 data sets verify the effectiveness of the proposed model.
Keywords: retail stores, faster-RCNN, object localization, ResNet-18, triplet loss, data augmentation, product recognition
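A minimal sketch of the triplet loss with online hard-negative mining is shown below; the 128-dimensional random embeddings stand in for ResNet-18 outputs, and the margin value is an assumption.

```python
import numpy as np

def triplet_loss(anchor, positive, negative, margin=0.2):
    """max(0, d(a, p) - d(a, n) + margin) on L2-normalized embeddings."""
    d_pos = np.linalg.norm(anchor - positive)
    d_neg = np.linalg.norm(anchor - negative)
    return max(0.0, d_pos - d_neg + margin)

def hardest_negative(anchor, negatives):
    """Online hard-negative mining: pick the negative closest to the anchor."""
    dists = np.linalg.norm(negatives - anchor, axis=1)
    return negatives[np.argmin(dists)]

rng = np.random.default_rng(0)

def embed(v):
    """Stand-in for the ResNet-18 encoder: L2-normalize a feature vector."""
    return v / np.linalg.norm(v)

anchor = embed(rng.normal(size=128))
positive = embed(anchor + 0.05 * rng.normal(size=128))   # augmented view of same product
negatives = np.stack([embed(rng.normal(size=128)) for _ in range(32)])

loss = triplet_loss(anchor, positive, hardest_negative(anchor, negatives))
print(f"triplet loss (hardest negative) = {loss:.4f}")
```

Mining the hardest negative in each batch is what keeps the loss informative: random negatives quickly satisfy the margin and stop producing gradient.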
Procedia PDF Downloads 157
1258 Eliciting and Confirming Data, Information, Knowledge and Wisdom in a Specialist Health Care Setting - The Wicked Method
Authors: Sinead Impey, Damon Berry, Selma Furtado, Miriam Galvin, Loretto Grogan, Orla Hardiman, Lucy Hederman, Mark Heverin, Vincent Wade, Linda Douris, Declan O'Sullivan, Gaye Stephens
Abstract:
Healthcare is a knowledge-rich environment. This knowledge, while valuable, is not always accessible outside the borders of individual clinics. This research aims to address part of this problem (at a study site) by constructing a maximal data set (knowledge artefact) for motor neurone disease (MND). This data set is proposed as an initial knowledge base for a concurrent project to develop an MND patient data platform. It represents the domain knowledge at the study site for the duration of the research (12 months). A knowledge elicitation method was also developed from the lessons learned during this process: the WICKED method. WICKED is an anagram of the words eliciting and confirming data, information, knowledge, wisdom; it is also a reference to the concept of wicked problems, which are complex and challenging, as is eliciting expert knowledge. The method was evaluated at a second site, and benefits and limitations were noted. Benefits include that the method provided a systematic way to manage data, information, knowledge, and wisdom (DIKW) from various sources, including healthcare specialists and existing data sets. Limitations concerned the time required and the fact that the data set produced only represents the DIKW known during the research period. Future work is underway to address these limitations.
Keywords: healthcare, knowledge acquisition, maximal data sets, action design science
Procedia PDF Downloads 367
1257 Iterative Method for Lung Tumor Localization in 4D CT
Authors: Sarah K. Hagi, Majdi Alnowaimi
Abstract:
In the last decade, there have been immense advancements in medical imaging modalities. These advancements make it possible to scan the whole volume of the lung in high-resolution images within a short time. Thanks to this performance, physicians can clearly identify the complicated anatomical and pathological structures of the lung. These advancements therefore create large opportunities to advance all available types of lung cancer treatment and will increase the survival rate. However, lung cancer is still one of the major causes of death, accounting for around 19% of all cancer patients. Several factors may affect the survival rate. One serious factor is the breathing process, which can affect the accuracy of diagnosis and of the lung tumor treatment plan. We have therefore developed a semi-automated algorithm to localize the 3D lung tumor position across all respiratory data during respiratory motion. The algorithm can be divided into two stages. First, lung tumor segmentation is performed on the first phase of the 4D computed tomography (CT) scan using an active contours method. Then, the tumor's 3D position is localized across all subsequent phases using an affine transformation with 12 degrees of freedom. Two data sets were used in this study: a simulated 4D CT scan generated with the extended cardiac-torso (XCAT) phantom, and clinical 4D CT data sets. The error is reported as root mean square error (RMSE); the average error across data sets is 0.94 ± 0.36 mm. Finally, an evaluation and quantitative comparison of the results with a state-of-the-art registration algorithm is presented. The results obtained from the proposed localization algorithm show promise for localizing a lung tumor in 4D CT data.
Keywords: automated algorithm, computed tomography, lung tumor, tumor localization
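A minimal sketch of the error computation is given below: a 12-DOF affine transform (a 3x3 matrix, 9 DOF, plus a translation, 3 DOF) maps tumor centroids across phases, and RMSE is computed against ground truth. The centroid trajectory, transform, and noise levels are hypothetical, not the XCAT or clinical data.

```python
import numpy as np

def apply_affine(points, A, t):
    """12-DOF affine map: 3x3 matrix A (9 DOF) plus translation t (3 DOF)."""
    return points @ A.T + t

def rmse(pred, truth):
    """Root mean square error over 3D point pairs, in the same units as input."""
    return np.sqrt(np.mean(np.sum((pred - truth) ** 2, axis=1)))

rng = np.random.default_rng(1)

# Hypothetical ground-truth tumor centroids (mm) over ten respiratory phases,
# with a sinusoidal superior-inferior breathing excursion.
truth = np.array([[50.0, 40.0, 30.0 + 5.0 * np.sin(2 * np.pi * k / 10)]
                  for k in range(10)])

# Pretend each phase was localized by an estimated near-identity affine
# transform of phase 0, plus residual registration noise.
A = np.eye(3) + rng.normal(scale=0.01, size=(3, 3))
t = rng.normal(scale=0.5, size=3)
pred = apply_affine(truth, A, t) + rng.normal(scale=0.3, size=truth.shape)

print(f"localization RMSE = {rmse(pred, truth):.2f} mm")
```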
Procedia PDF Downloads 605
1256 Important Factors Affecting the Effectiveness of Quality Control Circles
Authors: Sogol Zarafshan
Abstract:
The present study aimed to identify important factors affecting the effectiveness of quality control circles in a hospital, and to rank them using a combination of fuzzy VIKOR and Grey Relational Analysis (GRA). The study population consisted of five academic members and five experts in the field of nursing working in a hospital, who were selected using a purposive sampling method. Also, a sample of 107 nurses was selected through a simple random sampling method using their employee codes and a random-number table. The required data were collected using a researcher-made questionnaire consisting of 12 factors. The validity of this questionnaire was confirmed through the opinions of the experts and academic members who participated in the study, as well as confirmatory factor analysis. Its reliability was also verified (α = 0.796). The collected data were analyzed using SPSS 22.0 and LISREL 8.8, as well as the VIKOR-GRA and IPA methods. The results of ranking the factors affecting the effectiveness of quality control circles showed that the highest and lowest ranks were related to 'managers' and supervisors' support' and 'group leadership'. Also, the highest hospital performance was for factors such as 'clear goals and objectives' and 'group cohesiveness and homogeneity', and the lowest for 'reward system' and 'feedback system', respectively. The results showed that although 'training the members', 'using the right tools', and 'reward system' were factors of great importance, the organization's performance on these factors was poor. Therefore, these factors should receive more attention from the studied hospital's managers and should be improved as soon as possible.
Keywords: quality control circles, fuzzy VIKOR, grey relational analysis, importance-performance analysis
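A minimal sketch of the GRA step is shown below, assuming benefit-type criteria normalized to [0, 1] and the conventional distinguishing coefficient ρ = 0.5; the factor scores are hypothetical, and the fuzzy VIKOR stage is omitted.

```python
import numpy as np

def grey_relational_grades(X, rho=0.5):
    """Grey relational analysis: grade each alternative against the ideal reference.

    X: rows = alternatives (factors), columns = benefit-type criteria.
    """
    # Normalize each criterion to [0, 1] (larger-is-better assumed).
    Z = (X - X.min(axis=0)) / (X.max(axis=0) - X.min(axis=0))
    ref = Z.max(axis=0)                      # ideal reference series
    delta = np.abs(Z - ref)                  # deviation from the reference
    coef = (delta.min() + rho * delta.max()) / (delta + rho * delta.max())
    return coef.mean(axis=1)                 # grey relational grade per alternative

# Hypothetical scores of four QCC factors on three criteria.
X = np.array([[7.2, 6.5, 8.1],    # managers' support
              [5.1, 5.9, 6.0],    # reward system
              [6.8, 7.4, 7.0],    # clear goals
              [4.9, 5.2, 5.5]])   # feedback system

grades = grey_relational_grades(X)
print(np.argsort(-grades))        # factor indices ranked best to worst
```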
Procedia PDF Downloads 138
1255 Refining Scheme Using Amphibious Epistemologies
Authors: David Blaine, George Raschbaum
Abstract:
The evaluation of DHCP has synthesized SCSI disks, and current trends suggest that the exploration of e-business that would allow for further study into robots will soon emerge. Given the current status of embedded algorithms, hackers worldwide obviously desire the exploration of replication, which embodies the confusing principles of programming languages. In our research we concentrate our efforts on arguing that erasure coding can be made "fuzzy", encrypted, and game-theoretic.
Keywords: SCSI disks, robot, algorithm, hacking, programming language
Procedia PDF Downloads 430
1254 Recent Developments in the Application of Deep Learning to Stock Market Prediction
Authors: Shraddha Jain Sharma, Ratnalata Gupta
Abstract:
Predicting stock movements in the financial market is both difficult and rewarding. Analysts and academics are increasingly using advanced approaches such as machine learning techniques to anticipate stock price patterns, thanks to the expanding capacity of computing and the recent advent of graphics processing units and tensor processing units. Stock market prediction is a type of time series prediction that is incredibly difficult to do since stock prices are influenced by a variety of financial, socioeconomic, and political factors. Furthermore, even minor mistakes in stock market price forecasts can result in significant losses for companies that employ the findings of stock market price prediction for financial analysis and investment. Soft computing techniques are increasingly being employed for stock market prediction due to their better accuracy than traditional statistical methodologies. The proposed research looks at the need for soft computing techniques in stock market prediction, the numerous soft computing approaches that are important to the field, past work in the area with their prominent features, and the significant problems or issue domain that the area involves. For constructing a predictive model, the major focus is on neural networks and fuzzy logic. The stock market is extremely unpredictable, and it is unquestionably tough to correctly predict based on certain characteristics. This study provides a complete overview of the numerous strategies investigated for high accuracy prediction, with a focus on the most important characteristics.
Keywords: stock market prediction, artificial intelligence, artificial neural networks, fuzzy logic, accuracy, deep learning, machine learning, stock price, trading volume
Procedia PDF Downloads 92
1253 Data Analytics in Energy Management
Authors: Sanjivrao Katakam, Thanumoorthi I., Antony Gerald, Ratan Kulkarni, Shaju Nair
Abstract:
With increasing energy costs and their impact on business, sustainability has evolved from a social expectation to an economic imperative. Therefore, finding methods to reduce cost has become a critical directive for industry leaders. Effective energy management is the only way to cut costs. However, energy management has been a challenge because it requires a change in old habits and in legacy systems followed for decades. Today, exorbitant volumes of energy and operational data are being captured and stored by industries, but they are unable to convert these structured and unstructured data sets into meaningful business intelligence. It must be noted that for quick decisions, organizations must learn to cope with large volumes of operational data in different formats. Energy analytics not only helps in extracting inferences from these data sets, but is also instrumental in the transformation from old approaches to energy management to new ones. This, in turn, assists effective decision making for implementation. Organizations require an established corporate strategy for reducing operational costs through visibility and optimization of energy usage. Energy analytics plays a key role in the optimization of operations. The paper describes how energy data analytics is today extensively used in different scenarios such as reducing operational costs, predicting energy demand, optimizing network efficiency, asset maintenance, and improving customer and device data insights. The paper also highlights how analytics helps transform insights obtained from energy data into sustainable solutions. The paper utilizes data from an array of segments such as the retail, transportation, and water sectors.
Keywords: energy analytics, energy management, operational data, business intelligence, optimization
Procedia PDF Downloads 365
1252 Low-Cost Image Processing System for Evaluating Pavement Surface Distress
Authors: Keerti Kembhavi, M. R. Archana, V. Anjaneyappa
Abstract:
Most asphalt pavement condition evaluations use rating frameworks in which asphalt pavement distress is estimated by type, extent, and severity. Rating is carried out via the pavement condition rating (PCR), which is tedious and expensive. This paper presents the development of a low-cost technique for pavement distress image analysis that permits the identification of potholes and cracks. The paper explores the application of image processing tools for the detection of potholes and cracks. Longitudinal cracking and potholes are detected using Fuzzy C-Means (FCM) clustering followed by the spectral theory algorithm. The framework comprises three phases: image acquisition, processing, and extraction of features. A digital camera (GoPro) with a holder is used to capture pavement distress images from a moving vehicle. The FCM classifier and spectral theory algorithms are used to compute features and classify longitudinal cracking and potholes. The MATLAB R2016a image processing toolkit is used for performance analysis to assess the viability of detecting pavement distress on selected urban stretches of Bengaluru city, India. The outcomes of image evaluation with the semi-automated image processing framework captured the features of longitudinal cracks and potholes with an accuracy of about 80%. Further, the detected images are validated against the actual dimensions, and the dimension variability is about 0.46. A linear regression model y = 1.171x - 0.155 is obtained from the existing and experimental (image-processed) areas. The R² value of 0.807 obtained from the best-fit line indicates a large positive linear association.
Keywords: crack detection, pothole detection, spectral clustering, fuzzy-c-means
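A minimal sketch of the FCM stage follows, clustering pixel intensities into distress and background classes; the intensity distributions and the fuzzifier m = 2 are assumptions, and the spectral theory stage is omitted.

```python
import numpy as np

def fuzzy_c_means(X, c=2, m=2.0, iters=100, seed=0):
    """Minimal fuzzy C-means: returns cluster centers and membership matrix."""
    rng = np.random.default_rng(seed)
    U = rng.random((c, len(X)))
    U /= U.sum(axis=0)                              # memberships sum to 1 per sample
    for _ in range(iters):
        Um = U ** m
        centers = (Um @ X) / Um.sum(axis=1, keepdims=True)
        d = np.linalg.norm(X[None, :, :] - centers[:, None, :], axis=2) + 1e-9
        U = 1.0 / (d ** (2 / (m - 1)))
        U /= U.sum(axis=0)                          # renormalize memberships
    return centers, U

# Hypothetical grayscale intensities: dark distress pixels vs. bright pavement.
pixels = np.concatenate([np.random.default_rng(1).normal(40, 8, 200),
                         np.random.default_rng(2).normal(180, 12, 800)]).reshape(-1, 1)

centers, U = fuzzy_c_means(pixels)
distress = U[np.argmin(centers[:, 0])] > 0.5        # pixels leaning toward the dark cluster
print(f"centers = {centers.ravel().round(1)}, distress pixels = {distress.sum()}")
```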
Procedia PDF Downloads 182
1251 Comprehensive Risk Analysis of Decommissioning Activities with Multifaceted Hazard Factors
Authors: Hyeon-Kyo Lim, Hyunjung Kim, Kune-Woo Lee
Abstract:
The decommissioning process of nuclear facilities can be said to consist of a sequence of problem-solving activities, partly because working environments may be contaminated by radiological exposure, and partly because industrial hazards such as fire, explosions, toxic materials, and electrical and physical hazards may also exist. Risk assessment techniques for individual hazard factors are becoming familiar to industrial workers with advances in safety technology, but the way to integrate their results is not. Furthermore, there are few workers with extensive past experience of decommissioning operations. Therefore, many countries have been trying to develop appropriate techniques in order to guarantee the safety and efficiency of the process. In spite of that, there exists neither a domestic nor an international standard, since nuclear facilities are too diverse and unique. Consequently, it is quite inevitable that the whole risk must be imagined and assessed for each anticipated situation, one by one. This paper aimed to find an appropriate technique to integrate individual risk assessment results from the viewpoint of experts. Thus, on the one hand, the whole risk assessment activity for decommissioning operations was modeled as a sequence of individual risk assessment steps, and on the other, a hierarchical risk structure was developed. Then, a risk assessment procedure that can elicit individual hazard factors one by one was introduced, with reference to the standard operating procedure (SOP) and hierarchical task analysis (HTA). Under an assumption of quantification and normalization of individual risks, a technique to estimate relative weight factors was attempted using the conventional Analytic Hierarchy Process (AHP), and its result was reviewed against the judgment of experts. Besides, taking the ambiguity of human judgment into consideration, a discussion based upon fuzzy inference was added, with a mathematical case study.
Keywords: decommissioning, risk assessment, analytic hierarchy process (AHP), fuzzy inference
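A minimal sketch of the AHP weighting step is given below, deriving relative weights from a pairwise comparison matrix via its principal eigenvector and checking consistency; the pairwise judgments for the three hazards are hypothetical.

```python
import numpy as np

def ahp_weights(P):
    """Relative weights from a pairwise comparison matrix via the principal eigenvector."""
    vals, vecs = np.linalg.eig(P)
    k = np.argmax(vals.real)
    w = np.abs(vecs[:, k].real)
    w /= w.sum()
    # Consistency index CI = (lambda_max - n) / (n - 1); CR = CI / RI,
    # with Saaty's random index RI = 0.58 for n = 3. CR < 0.1 is acceptable.
    n = len(P)
    ci = (vals[k].real - n) / (n - 1)
    return w, ci / 0.58

# Hypothetical pairwise judgments: radiological exposure vs. fire/explosion
# vs. electrical hazards (Saaty 1-9 scale, reciprocal below the diagonal).
P = np.array([[1.0, 3.0, 5.0],
              [1 / 3, 1.0, 2.0],
              [1 / 5, 1 / 2, 1.0]])

w, cr = ahp_weights(P)
print(f"weights = {w.round(3)}, consistency ratio = {cr:.3f}")
```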
Procedia PDF Downloads 425
1250 Resilience-Vulnerability Interaction in the Context of Disasters and Complexity: Study Case in the Coastal Plain of Gulf of Mexico
Authors: Cesar Vazquez-Gonzalez, Sophie Avila-Foucat, Leonardo Ortiz-Lozano, Patricia Moreno-Casasola, Alejandro Granados-Barba
Abstract:
In the last twenty years, the academic and scientific literature has focused on understanding the processes and factors of coastal social-ecological systems' vulnerability and resilience. Some scholars argue that resilience and vulnerability are isolated concepts due to their epistemological origins, while others note the existence of a strong resilience-vulnerability relationship. Here we present an ordinal logistic regression model based on an analytical framework for the dynamic resilience-vulnerability interaction along the adaptive cycle of complex systems and the phases of the disaster process (during, recovery, and learning). In this way, we demonstrate that 1) during the disturbance, absorptive capacity (resilience as a core of attributes) and external response capacity explain the probability that household capitals diminish the damage, and exposure sets the thresholds on the amount of disturbance that households can absorb; 2) at recovery, absorptive capacity and external response capacity explain the probability that household capitals recover faster (resilience as an outcome) from damage; and 3) at learning, adaptive capacity (resilience as a core of attributes) explains the probability of household adaptation measures based on the enhancement of physical capital. As a result, during the disturbance phase, exposure has the greatest weight in the probability of capital damage, and households with absorptive and external response capacity elements absorbed the impact of floods in comparison with households without these elements. In the recovery phase, households with absorptive and external response capacity showed a faster recovery of their capital; however, the damage sets the thresholds of recovery time. More importantly, diversity in financial capital increases the probability of recovering other capitals, but it becomes a liability, so that the probability of recovering household finances over a longer time increases. In the learning-reorganizing phase, adaptation (modifications to the house) increases the probability of suffering less damage to physical capital; however, this effect is not very large. In conclusion, resilience is an outcome but also a core of attributes that interacts with vulnerability along the adaptive cycle and the disaster process phases. Absorptive capacity can diminish the damage experienced from floods; however, when exposure exceeds thresholds, both absorptive and external response capacity are not enough. In the same way, absorptive and external response capacity diminish the recovery time of capital, but the damage sets the thresholds beyond which households are not capable of recovering their capital.
Keywords: absorptive capacity, adaptive capacity, capital, floods, recovery-learning, social-ecological systems
Procedia PDF Downloads 134
1249 Effects of Handgrip Isometric Training in Blood Pressure of Patients with Peripheral Artery Disease
Authors: Raphael M. Ritti-Dias, Marilia A. Correia, Wagner J. R. Domingues, Aline C. Palmeira, Paulo Longano, Nelson Wolosker, Lauro C. Vianna, Gabriel G. Cucato
Abstract:
Patients with peripheral arterial disease (PAD) have a high prevalence of hypertension, which contributes to a high risk of acute cardiovascular events and cardiovascular mortality. Strategies to reduce the cardiovascular risk of these patients are needed. Meta-analyses have shown that isometric handgrip training promotes reductions in clinical blood pressure in normotensive, pre-hypertensive, and hypertensive individuals. However, the effect of this exercise training on other cardiovascular function indicators in PAD patients remains unknown. Thus, the aim of this study was to analyze the effects of isometric handgrip training on blood pressure in patients with PAD. In this clinical trial, 28 patients were randomly allocated into two groups: isometric handgrip training (HG) and control (CG). The HG group conducted unilateral handgrip training three days per week (four sets of two minutes, at 30% of maximum voluntary contraction, with an interval of four minutes between sets). The CG was encouraged to increase its physical activity level. At baseline and after eight weeks, blood pressure and heart rate were obtained. A two-way ANOVA for repeated measures with group (HG and CG) and time (pre- and post-intervention) as factors was performed. After 8 weeks of training there were no significant changes in systolic blood pressure (HG pre 141 ± 24.0 mmHg vs. HG post 142 ± 22.0 mmHg; CG pre 140 ± 22.1 mmHg vs. CG post 146 ± 16.2 mmHg; P = 0.18), diastolic blood pressure (HG pre 74 ± 10.4 mmHg vs. HG post 74 ± 11.9 mmHg; CG pre 72 ± 6.9 mmHg vs. CG post 74 ± 8.0 mmHg; P = 0.22), or heart rate (HG pre 61 ± 10.5 bpm vs. HG post 62 ± 8.0 bpm; CG pre 64 ± 11.8 bpm vs. CG post 65 ± 13.6 bpm; P = 0.81). In conclusion, our preliminary data indicate that isometric handgrip training did not modify blood pressure or heart rate in patients with PAD.
Keywords: blood pressure, exercise, isometric, peripheral artery disease
Procedia PDF Downloads 333
1248 Supervised Machine Learning Approach for Studying the Effect of Different Joint Sets on Stability of Mine Pit Slopes Under the Presence of Different External Factors
Authors: Sudhir Kumar Singh, Debashish Chakravarty
Abstract:
Slope stability analysis is an important aspect of geotechnical engineering. It is also important from safety and economic points of view, as any slope failure leads to loss of valuable lives and damage to property worth millions. This paper aims at mitigating the risk of slope failure by studying the effect of different joint sets on the stability of mine pit slopes under the influence of various external factors, namely degree of saturation, rainfall intensity, and seismic coefficients. A supervised machine learning approach has been utilized for making accurate and reliable predictions regarding the stability of slopes based on the value of the factor of safety. Numerous cases have been studied by analyzing the stability of slopes using the popular finite element method, and the data thus obtained have been used as training data for the supervised machine learning models. The input data have been trained on different supervised machine learning models, namely Random Forest, Decision Tree, Support Vector Machine, and XGBoost. Distinct test data not present in the training data have been used for measuring the performance and accuracy of the different models. Although all models performed well on the test dataset, Random Forest stands out from the others due to its high accuracy of greater than 95%, providing a valuable tool at our disposal that is neither computationally expensive nor time-consuming and is in good accordance with the numerical analysis results.
Keywords: finite element method, geotechnical engineering, machine learning, slope stability
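A minimal sketch of the Random Forest stage is shown below; since the FEM-generated data set is not available here, a synthetic factor-of-safety labeling rule stands in for the numerical analysis, and the feature names and ranges are assumptions.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(42)
n = 500

# Hypothetical features per FEM case: joint dip (deg), degree of saturation (%),
# rainfall intensity (mm/h), and horizontal seismic coefficient.
X = np.column_stack([rng.uniform(20, 80, n),
                     rng.uniform(0, 100, n),
                     rng.uniform(0, 50, n),
                     rng.uniform(0, 0.3, n)])

# Stand-in labeling rule: stable when a synthetic factor of safety exceeds 1.
fos = 2.0 - 0.01 * X[:, 0] - 0.004 * X[:, 1] - 0.008 * X[:, 2] - 2.0 * X[:, 3]
y = (fos > 1.0).astype(int)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2, random_state=0)
clf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_tr, y_tr)
print(f"test accuracy = {accuracy_score(y_te, clf.predict(X_te)):.3f}")
```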
Procedia PDF Downloads 102
1247 The Effect of Hypertrophy Strength Training Using Traditional Set vs. Cluster Set on Maximum Strength and Sprinting Speed
Authors: Bjornar Kjellstadli, Shaher A. I. Shalfawi
Abstract:
The aim of this study was to investigate the effect of the cluster set strength training method compared to the traditional set method on 30 m sprinting time and maximum strength in squats and bench press. Thirteen physical education students, 7 males and 6 females between 19 and 28 years old, were recruited. The students were randomly divided into three groups. The traditional set group (TSG) consisted of 2 males and 2 females aged (±SD) 22.3 ± 1.5 years, body mass 79.2 ± 15.4 kg, and height 177.5 ± 11.3 cm. The cluster set group (CSG) consisted of 3 males and 2 females aged 22.4 ± 3.29 years, body mass 81.0 ± 24.0 kg, and height 179.2 ± 11.8 cm, and the control group (CG) consisted of 2 males and 2 females aged 21.5 ± 2.4 years, body mass 82.1 ± 17.4 kg, and height 175.5 ± 6.7 cm. The intervention consisted of performing squats and bench press at 70% of 1RM (twice a week) for 8 weeks, using 10 repetitions and 4 sets. Two strength-training methods were used: cluster sets (CS), where the participants (CSG) performed 2 reps 5 times with 10 s recovery between reps and 50 s recovery between sets, and traditional sets (TS), where the participants (TSG) performed 10 reps each set with 90 s recovery between sets. The pre-tests and post-tests conducted were 1RM in both squats and bench press, and 10 and 30 m sprint times. The 1RM tests were performed with an Eleiko XF barbell (20 kg), Eleiko weight plates, and a rack and bench from Hammer Strength. The sprint times were measured with the Brower Speed Trap II timing system (Brower Timing Systems, Utah, USA). The participants received an individualized training program based on the 1RM pre-test. In addition, a mid-term 1RM test was carried out to adjust training intensity. Each training session was supervised by the researchers. Beast sensors (Milano, Italy) were also used to monitor and quantify the training load for the participants. All groups had a statistically significant improvement in bench press 1RM (TSG from 56.3 ± 28.9 to 66 ± 28.5 kg; CSG from 69.8 ± 33.5 to 77.2 ± 34.1 kg; CG from 67.8 ± 26.6 to 72.2 ± 29.1 kg), whereas only the TSG (from 84.3 ± 26.8 to 114.3 ± 26.5 kg) and CSG (from 100.4 ± 33.9 to 129 ± 35.1 kg) had a statistically significant improvement in squat 1RM (P < 0.05). However, a between-groups examination reveals no marked differences in 1RM squat performance between the TSG and CSG (P > 0.05), while both groups improved markedly compared to the CG (P < 0.05). On the other hand, no differences between groups were observed in bench press 1RM. The within-group results indicate that none of the groups had any marked improvement over the distances 0-10 m and 10-30 m, except the CSG, which had a notable improvement over 10-30 m (-0.07 s; P < 0.05). Furthermore, no differences in sprinting abilities were observed between groups. The results of this investigation indicate that traditional set strength training at 70% of 1RM gave results similar to cluster set strength training at the same intensity. However, the cluster sets had an effect on flying time (10-30 m), indicating that the velocity at which those repetitions were performed could explain this improvement.
Keywords: physical performance, 1RM, pushing velocity, velocity based training
Procedia PDF Downloads 165
1246 The Effect of Eight-Week Medium Intensity Interval Training and Curcumin Intake on ICAM-1 and VCAM-1 Levels in Menopausal Fat Rats
Authors: Abdolrasoul Daneshjoo, Fatemeh Akbari Ghara
Abstract:
Background and Purpose: Obesity is an increasing factor in cardiovascular disease and in serum levels of cellular adhesion molecules, and it plays an important role in predicting the risk of coronary artery disease. The purpose of this research was to study the effect of eight weeks of moderate intensity interval training and curcumin intake on the ICAM-1 and VCAM-1 levels of menopausal obese rats. Materials and methods: In this study, 28 menopausal obese Wistar rats aged 6-8 weeks, with an average weight of 250-300 g, were randomly divided into four groups: control, curcumin supplement, moderate intensity interval training, and moderate intensity interval training + curcumin supplement (7 rats per group). The training program was planned as 8 weeks with 3 sessions per week. Each session consisted of 10 one-minute sets at 50 percent intensity with 2-minute intervals between sets in the first week. Subjects started at 14 m/min, and 2 m/min was added weekly until a speed of 28 m/min was reached in the 8th week. Blood samples were taken 48 hours after the last training session, and ICAM-1 and VCAM-1 levels were measured. SPSS software, one-way analysis of variance (ANOVA), and the Pearson correlation coefficient were used to assess the results. Results: The results showed that eight weeks of training and taking curcumin had a significant effect on the ICAM-1 levels of the rats (p ≤ 0.05). However, they had no significant effect on VCAM-1 levels in menopausal obese rats (p ≥ 0.05). There was no significant correlation between the levels of ICAM-1 and VCAM-1 over the eight weeks of training and taking curcumin. Conclusion: Implementation of moderate intensity interval training and the use of curcumin decreased ICAM-1 significantly.
Keywords: curcumin, interval training, ICAM-1, VCAM-1
Procedia PDF Downloads 193
1245 Analyzing Environmental Emotive Triggers in Terrorist Propaganda
Authors: Travis Morris
Abstract:
The purpose of this study is to measure the intersection of environmental security entities in terrorist propaganda. To the best of the author's knowledge, this is the first study of its kind to examine this intersection within terrorist propaganda. Rosoka natural language processing software and frame analysis are used to advance our understanding of how environmental frames function as emotive triggers. Violent jihadi demagogues use frames to suggest violent and non-violent solutions to their grievances. Emotive triggers are framed in a way that leverages individual and collective attitudes in psychological warfare. A comparative research design is used because of the differences and similarities that exist between two variants of violent jihadi propaganda that target western audiences. Analysis is based on salience and network text analysis, which generates violent jihadi semantic networks. Findings indicate that environmental frames are used as emotive triggers across both data sets, but also as tactical and informational data points. A significant finding is that certain core environmental emotive triggers like 'water', 'soil', and 'trees' are significantly salient at the aggregate level across both data sets. All environmental entities can be classified into two categories, symbolic and literal. Importantly, this research illustrates how demagogues use environmental emotive triggers in cyberspace, from a subcultural perspective, to mobilize target audiences to their ideology and praxis. Understanding the anatomy of propaganda construction is necessary in order to generate effective counter-narratives in information operations. This research advances an additional method to inform practitioners and policy makers of how environmental security and propaganda intersect.
Keywords: propaganda analysis, emotive triggers, environmental security, frames
Procedia PDF Downloads 140
1244 An Intelligent Controller Augmented with Variable Zero Lag Compensation for Antilock Braking System
Authors: Benjamin Chijioke Agwah, Paulinus Chinaenye Eze
Abstract:
The antilock braking system (ABS) is one of the important contributions of the automobile industry, designed to ensure road safety by keeping vehicles steerable and stable during emergency braking. This paper presents a wheel-slip-based intelligent controller with variable zero lag compensation for ABS. The controller is required to achieve very fast, accurate wheel slip tracking during hard braking and to eliminate chattering, with improved transient and steady-state performance, while shortening the stopping distance using an effective braking torque less than the maximum allowable torque to bring the braking vehicle to a stop. The dynamics of a vehicle braking from a velocity of 30 m/s in a straight line were determined and modelled in the MATLAB/Simulink environment to represent a conventional ABS system without a controller. Simulation results indicated that the system without a controller was not able to track the desired wheel slip, and the stopping distance was 135.2 m. Hence, an intelligent control scheme based on a fuzzy logic controller (FLC) was designed, with a variable zero lag compensator (VZLC) added to enhance the FLC control variable by eliminating steady-state error and providing improved bandwidth to eliminate the effect of high-frequency noise such as chattering during braking. The simulation results showed that the FLC-VZLC provided fast tracking of the desired wheel slip, eliminated chattering, and reduced the stopping distance by 70.5% (39.92 m), 63.3% (49.59 m), 57.6% (57.35 m), and 50% (69.13 m) on dry, wet, cobblestone, and snow road surfaces respectively. Overall, the proposed system used an effective braking torque that is less than the maximum allowable braking torque to achieve efficient wheel slip tracking and robust control performance on different road surfaces.
Keywords: ABS, fuzzy logic controller, variable zero lag compensator, wheel slip tracking
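The FLC-VZLC design itself is not reproduced here; the sketch below simulates a quarter-car braking model with a crude proportional slip controller as a stand-in, to illustrate the wheel slip definition λ = (v − ωr)/v that the controller tracks. All vehicle parameters, gains, and the friction curve are hypothetical.

```python
import numpy as np

def wheel_slip(v, omega, r):
    """Longitudinal wheel slip during braking: lambda = (v - omega*r) / v."""
    return (v - omega * r) / max(v, 1e-6)

# Quarter-car braking sketch (hypothetical parameters): 350 kg per wheel,
# wheel radius 0.3 m, wheel inertia 1.2 kg*m^2, 1 ms time step.
m, r, J, dt = 350.0, 0.3, 1.2, 1e-3
mu = lambda lam: 1.1 * (1 - np.exp(-20 * lam)) - 0.4 * lam   # tire-road friction curve

v, omega, target, Tb = 30.0, 30.0 / r, 0.2, 0.0
for _ in range(4000):
    lam = wheel_slip(v, omega, r)
    # Crude proportional brake-torque control standing in for the FLC-VZLC.
    Tb = max(0.0, Tb + 40.0 * (target - lam))
    Fx = mu(lam) * m * 9.81              # longitudinal tire force
    v += -(Fx / m) * dt                  # vehicle deceleration
    omega += ((Fx * r - Tb) / J) * dt    # wheel rotational dynamics
    omega = max(omega, 0.0)
    if v <= 0.5:
        break

print(f"final speed = {v:.2f} m/s, slip = {wheel_slip(v, omega, r):.2f}")
```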
Procedia PDF Downloads 147
1243 Potential Ecological Risk Assessment of Selected Heavy Metals in Sediments of Tidal Flat Marsh, the Case Study: Shuangtai Estuary, China
Authors: Chang-Fa Liu, Yi-Ting Wang, Yuan Liu, Hai-Feng Wei, Lei Fang, Jin Li
Abstract:
Heavy metals in sediments can cause adverse ecological effects when they exceed given criteria. The present study investigated sediment environmental quality, pollutant enrichment, ecological risk, and source identification for copper, cadmium, lead, zinc, mercury, and arsenic in sediments collected from the tidal flat marsh of the Shuangtai estuary, China. The arithmetic mean integrated pollution index, geometric mean integrated pollution index, fuzzy integrated pollution index, and principal component score were used to characterize sediment environmental quality; fuzzy similarity and the geo-accumulation index were used to evaluate pollutant enrichment; the correlation matrix, principal component analysis, and cluster analysis were used to identify pollution sources; and the environmental risk index and potential ecological risk index were used to assess ecological risk. The environmental quality of the sediment is classified as a very low degree of contamination or low contamination. By pollutant enrichment analysis, the similarity to the element background of soil in the Liaohe plain follows the order of the Sanjiaozhou, Honghaitan, Sandaogou, and Xiaohe regions. The source identification indicates that the metals are significantly correlated with one another, except copper with cadmium. Cadmium, lead, zinc, mercury, and arsenic cluster together as the first principal component, while copper clusters as the second principal component. The environmental risk assessment level is scaled to 'no risk' in the studied area. The order of potential ecological risk is As > Cd > Hg > Cu > Pb > Zn.
Keywords: ecological risk assessment, heavy metals, sediment, marsh, Shuangtai estuary
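As a sketch of the enrichment step, the geo-accumulation index of Müller, Igeo = log2(Cn / (1.5 Bn)), can be computed per metal and mapped to the standard pollution classes; the concentrations and background values below are hypothetical, not the Shuangtai data.

```python
import math

def igeo(concentration, background):
    """Muller geo-accumulation index: Igeo = log2(Cn / (1.5 * Bn))."""
    return math.log2(concentration / (1.5 * background))

def igeo_class(i):
    """Map an Igeo value to the standard seven pollution classes."""
    bounds = [(0, "unpolluted"), (1, "unpolluted to moderately polluted"),
              (2, "moderately polluted"), (3, "moderately to heavily polluted"),
              (4, "heavily polluted"), (5, "heavily to extremely polluted")]
    for upper, label in bounds:
        if i <= upper:
            return label
    return "extremely polluted"

# Hypothetical sediment concentrations and regional background values (mg/kg).
samples = {"Cu": (28.0, 22.0), "Cd": (0.35, 0.10), "Pb": (31.0, 25.0)}
for metal, (c, b) in samples.items():
    i = igeo(c, b)
    print(f"{metal}: Igeo = {i:.2f} ({igeo_class(i)})")
```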
Procedia PDF Downloads 350
1242 How to Use Big Data in Logistics Issues
Authors: Mehmet Akif Aslan, Mehmet Simsek, Eyup Sensoy
Abstract:
Big Data represents today's cutting-edge technology. As the technology becomes widespread, so does data. Utilizing massive data sets enables companies to gain competitive advantages over their adversaries. Out of the many areas of Big Data usage, logistics plays a significant role in both the commercial sector and the military. This paper lays out what Big Data is and how it is used in both military and commercial logistics.
Keywords: big data, logistics, operational efficiency, risk management
Procedia PDF Downloads 642
1241 Exploring Management of the Fuzzy Front End of Innovation in a Product Driven Startup Company
Authors: Dmitry K. Shaytan, Georgy D. Laptev
Abstract:
In our research, we aimed to test a managerial approach for the fuzzy front end (FFE) of innovation by creating a controlled experiment/business case in breakthrough innovation development. The experiment was in the sport industry and covered all aspects of the customer discovery stage, from ideation to prototyping followed by patent application. In the paper, we describe and analyze the milestones, tasks, management challenges, and decisions made to create the breakthrough innovation, and we evaluate the overall managerial efficiency at the FFE stage considered. We define the managerial outcome of the FFE stage as a valid product concept in hand. In our paper, we introduce a hypothetical construct, the 'Q-factor', that helps us in the experiment to distinguish the quality of FFE outcomes. The experiment simulated the FFE of innovation for the entrepreneur and placed on his shoulders the responsibility for delivering a valid product concept. While developing the managerial approach to reach this outcome, we decided to look at the product concept from the cognitive psychology and cognitive science point of view. This view helped us develop the profile of a person whose projection (mental representation) of a new product could optimize FFE activities for a manager or entrepreneur. In the experiment, this profile was tested in developing a breakthrough innovation for swimmers. Following the managerial approach, the product concept was created to help swimmers feel/sense the water. A working prototype was developed to estimate the product concept's validity and value-added effect for customers. Based on feedback from coaches and swimmers, there was a strong positive effect that gave high value for customers and, for the experiment, a valid product concept developed by the proposed managerial approach for the FFE. In conclusion, we suggest a managerial approach derived from the experiment.
Keywords: concept development, concept testing, customer discovery, entrepreneurship, entrepreneurial management, idea generation, idea screening, startup management
Procedia PDF Downloads 446