Search results for: information extraction evaluation method
31360 Ultrasonic Extraction of Phenolics from Leaves of Shallots and Peels of Potatoes for Biofortification of Cheese
Authors: Lila Boulekbache-Makhlouf, Brahmi Fatiha
Abstract:
This study was carried out with the aim of enriching fresh cheese with two food by-products: the leaves of shallots and the peels of potatoes. First, the conditions for extracting the total polyphenols (TPP) using ultrasound were optimized. Then, the TPP and flavonoid contents and the antioxidant activity were evaluated for the extracts obtained under the optimal parameters. In addition, we carried out physico-chemical, microbiological, and sensory analyses of the cheese produced. The maximum TPP value of 70.44 mg GAE/g DM for shallot leaves was reached with 40% (v/v) ethanol, an extraction time of 90 min, and a temperature of 10°C. Meanwhile, the maximum TPP content of potato peels, 45.03 ± 4.16 mg GAE/g DM, was obtained using an ethanol/water mixture (40%, v/v), a time of 30 min, and a temperature of 60°C; the flavonoid contents were 13.99 and 7.52 QE/g DM, respectively. From the antioxidant tests, we deduced that the potato peels present a higher antioxidant power, with IC50s of 125.42 ± 2.78 μg/mL for DPPH, 87.21 ± 7.72 μg/mL for phosphomolybdate, and 200.77 ± 13.38 μg/mL for iron chelation, compared with the results obtained for shallot leaves, which were 204.29 ± 0.09, 45.85 ± 3.46, and 1004.10 ± 145.73 μg/mL, respectively. The results of the physico-chemical analyses showed that the formulated cheese complied with standards. Microbiological analyses showed that the hygienic quality of the cheese produced was satisfactory. According to the sensory analyses, the experts liked the cheese enriched with the powder and pieces of shallot leaves.
Keywords: shallot leaves, potato peels, ultrasound extraction, phenolics, cheese
Procedia PDF Downloads 184
31359 Preliminary Evaluation of Passive UHF-Band RFID for Identifying Floating Objects on the Sea
Authors: Yasuhiro Sato, Kodai Noma, Kenta Sawada, Kazumasa Adachi, Yoshinori Matsuura, Saori Iwanaga
Abstract:
RFID systems are used to identify objects, for example for passenger identification in public transportation, instead of linear or 2-dimensional barcodes. Key advantages of RFID systems are that they identify objects without physical contact and can write arbitrary information into an RFID tag. These advantages may help to improve maritime safety and the efficiency of activities at sea. However, the use of RFID systems in maritime settings has not yet been considered. In this paper, we evaluate the availability of a generic RFID system operating on the sea. We measure the RSSI between an RFID tag floating on the sea and an RFID antenna, and check whether an RFID reader can access the tag, while varying the distance and angle between a floating buoy and the ship. Finally, we discuss the feasibility and applicability of RFID systems at sea based on the results of our preliminary experiment.
Keywords: RFID, experimental evaluation, RSSI, maritime use
Procedia PDF Downloads 578
31358 Detection and Classification of Myocardial Infarction Using New Extracted Features from Standard 12-Lead ECG Signals
Authors: Naser Safdarian, Nader Jafarnia Dabanloo
Abstract:
In this paper, we used four features, i.e., the Q-wave integral, QRS-complex integral, T-wave integral, and total integral, extracted from normal and patient ECG signals, to detect and localize myocardial infarction (MI) in the left ventricle of the heart. Our research focused on detection and localization of MI in the standard ECG. We used the Q-wave and T-wave integrals because these features are important indicators for detecting MI. To detect and localize MI, we applied pattern recognition methods such as Artificial Neural Networks (ANN), because these methods classify normal and abnormal signals with good accuracy. We used a type of Radial Basis Function (RBF) network called a Probabilistic Neural Network (PNN) because of its nonlinearity, and compared it with other classifiers such as k-Nearest Neighbors (KNN), the Multilayer Perceptron (MLP), and Naive Bayes classification. We used the PhysioNet database for our training and test data. On test data, we reached over 80% accuracy for localization and over 95% for detection of MI. The main advantages of our method are its simplicity and good accuracy, and classification accuracy can be improved further by adding more features. In summary, a simple method based on only four features extracted from the standard ECG is presented, with good accuracy in MI localization.
Keywords: ECG signal processing, myocardial infarction, feature extraction, pattern recognition
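As a toy illustration of the feature-and-classifier pipeline described above (not the authors' code), the sketch below computes the four wave-integral features from hypothetical beat windows and classifies a query beat with a simple 1-nearest-neighbour rule; all signal values and window boundaries are invented.

```python
# Illustrative sketch, not the authors' implementation: wave-integral
# features from an ECG beat, classified with 1-nearest-neighbour.
# Beats, window boundaries, and training labels are all hypothetical.

def wave_integral(signal, start, end):
    """Approximate the integral of a wave segment by summing its samples."""
    return sum(signal[start:end])

def extract_features(beat, q_win, qrs_win, t_win):
    """Four features: Q-wave, QRS-complex, T-wave, and total integrals."""
    return [
        wave_integral(beat, *q_win),
        wave_integral(beat, *qrs_win),
        wave_integral(beat, *t_win),
        sum(beat),
    ]

def knn_predict(train, query):
    """1-NN on Euclidean distance; train is a list of (features, label)."""
    def dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5
    return min(train, key=lambda fl: dist(fl[0], query))[1]

# Toy beats: a 'normal' and an 'MI-like' template with a deeper Q wave.
normal = [0, 0.1, -0.1, 1.0, -0.2, 0.1, 0.3, 0.2, 0]
mi = [0, 0.1, -0.6, 0.8, -0.3, 0.0, 0.1, 0.1, 0]
wins = ((1, 3), (2, 5), (5, 8))  # hypothetical Q, QRS, T windows
train = [(extract_features(normal, *wins), "normal"),
         (extract_features(mi, *wins), "MI")]
query = [0, 0.1, -0.55, 0.85, -0.3, 0.0, 0.1, 0.1, 0]
print(knn_predict(train, extract_features(query, *wins)))  # → MI
```

The real pipeline replaces this toy 1-NN with PNN, KNN, MLP, or Naive Bayes classifiers trained on PhysioNet records.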
Procedia PDF Downloads 455
31357 Real-Time Hybrid Simulation for a Tuned Liquid Column Damper Implementation
Authors: Carlos Riascos, Peter Thomson
Abstract:
Real-time hybrid simulation (RTHS) is a modern cyber-physical technique used for the experimental evaluation of complex systems. It treats system components with predictable behavior as a numerical substructure and components that are difficult to model as an experimental substructure, and is therefore an attractive method for evaluating the response of civil structures under earthquake, wind, and anthropic loads. Another practical application of RTHS is the evaluation of control systems, as these devices are often nonlinear and their characterization is an important step in the design of controllers with the desired performance. In this paper, the response of a three-story shear frame controlled by a tuned liquid column damper (TLCD) and subject to base excitation is considered. Both passive and semi-active control strategies were implemented and compared. While the passive TLCD achieved a 50% reduction in the acceleration response of the main structure compared with the uncontrolled structure, the semi-active TLCD achieved a 70% reduction and was robust to variations in the dynamic properties of the main structure. In addition, an RTHS was implemented with the main structure modeled as a linear, time-invariant (LTI) system through a state-space representation, and the TLCD, with both control strategies, was evaluated on a shake table that reproduced the displacement of the virtual structure. Current assessment measures for RTHS were used to quantify performance, with parameters such as generalized amplitude, equivalent time delay between the target and measured displacement of the shake table, and energy error using the measured force; these prove that the RTHS described in this paper is an accurate method for the experimental evaluation of structural control systems.
Keywords: structural control, hybrid simulation, tuned liquid column damper, semi-active control strategy
Procedia PDF Downloads 297
31356 Extraction and Characterization of Kernel Oil of Acrocomia Totai
Authors: Gredson Keif Souza, Nehemias Curvelo Pereira
Abstract:
Kernel oil from macaúba is an important source of essential fatty acids. New knowledge of the oil of this species could therefore support new applications, such as pharmaceutical drugs, the manufacture of cosmetics, and various industrial processes. The aim of this study was to characterize the kernel oil of macaúba (Acrocomia totai) at different stages of maturation. The physico-chemical characteristics were determined in accordance with the official analytical methods for oils and fats. The water and lipid contents of the kernel, saponification value, acid value, water content of the oil, viscosity, density, fatty acid composition (by gas chromatography), and molar mass were determined. The results were subjected to Tukey's test at the 5% significance level. Unripe fruits showed higher values of unsaturated fatty acids.
Keywords: extraction, characterization, kernel oil, Acrocomia totai
Procedia PDF Downloads 356
31355 The Effectiveness of Banks’ Web Sites: A Study of Turkish Banking Sector
Authors: Raif Parlakkaya, Huseyin Cetin, Duygu Irdiren
Abstract:
With the development of the World Wide Web, the usage rate of the Internet has grown rapidly worldwide and provided a basis for the emergence of electronic business. Like other sectors, the banking sector has adopted the Internet alongside developments in information and communication technologies. Due to the public disclosure and transparency principle of corporate governance, the importance of information disclosure by banks on their web sites has increased significantly. For the purpose of this study, a Bank Disclosure Attribute Index (BDAI) for Turkey has been constructed by classifying the information disclosed on banks’ web sites into general, financial, investor, and corporate governance attributes. All 47 banks in the Turkish banking system have been evaluated according to the index with the aim of providing a comparison between banks. Using the chi-square test, Pearson correlation, t-test, and ANOVA, it has been concluded that the majority of banks in Turkey share information on their web sites adequately with respect to their total index score. Although there is a positive correlation between the various types of information on banks’ web sites, there is no uniformity among them. Also, no significant difference between the various types of information disclosure and bank types has been observed. Compared with the total index score averages of the five largest banks in Turkey, some banks need to improve the content of their web sites.
Keywords: internet banking, website evaluation, customer adoption, Turkey
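A disclosure attribute index of this kind can be sketched as a simple checklist score over the attributes found on a bank's web site; the attribute names and bank data below are invented for illustration and are not the BDAI's actual attributes.

```python
# Hypothetical sketch of a disclosure attribute index: each bank is
# scored by the share of checklist attributes it discloses. The
# attribute list and bank data are invented, not the BDAI itself.

ATTRIBUTES = ["annual_report", "balance_sheet", "board_members",
              "contact_info", "governance_policy"]

def bdai_score(disclosed):
    """Fraction of checklist attributes present on a bank's web site."""
    return sum(1 for a in ATTRIBUTES if a in disclosed) / len(ATTRIBUTES)

banks = {
    "Bank A": {"annual_report", "balance_sheet", "board_members",
               "contact_info", "governance_policy"},
    "Bank B": {"annual_report", "contact_info"},
}
scores = {name: bdai_score(d) for name, d in banks.items()}
print(scores)  # Bank A scores 1.0, Bank B scores 0.4
```

Scores of this form can then be compared across banks or attribute categories with the statistical tests named in the abstract.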
Procedia PDF Downloads 398
31354 Automatic Detection of Suicidal Behaviors Using an RGB-D Camera: Azure Kinect
Authors: Maha Jazouli
Abstract:
Suicide is one of the most important causes of death in the prison environment, both in Canada and internationally. Rates of suicide attempts and self-harm have been on the rise in recent years, with hanging being the most frequent method. The objective of this article is to propose a method to automatically detect suicidal behaviors in real time. We present a gesture recognition system that consists of three modules: model-based movement tracking, feature extraction, and gesture recognition using machine learning algorithms (MLA). Our proposed system gives satisfactory results. This smart video surveillance system can assist the staff responsible for the safety and health of inmates by alerting them when suicidal behavior is detected, which helps reduce mortality rates and save lives.
Keywords: suicide detection, Azure Kinect, RGB-D camera, SVM, machine learning, gesture recognition
Procedia PDF Downloads 188
31353 Design and Development of Data Mining Application for Medical Centers in Remote Areas
Authors: Grace Omowunmi Soyebi
Abstract:
Data mining is the extraction of information from a large database to predict a trend or behavior, thereby helping management make knowledge-driven decisions. A principal problem of most hospitals in rural areas is their reliance on a paper file management system for keeping records. A lot of time is wasted when a patient visits the hospital, possibly in an emergency, and the nurse or attendant has to search through voluminous files before the patient's file can be retrieved; this delay may put the patient at risk. This data mining application is designed using the Structured Systems Analysis and Design Method, which supports a well-articulated analysis of the existing file management system, a feasibility study, and proper documentation of the design and implementation of a computerized medical record system. This computerized system will replace the file management system and make it easy to retrieve a patient's record, with increased data security, access to clinical records for decision-making, and a reduced waiting time before a patient is attended to.
Keywords: data mining, medical record system, systems programming, computing
Procedia PDF Downloads 209
31352 Supervised Learning for Cyber Threat Intelligence
Authors: Jihen Bennaceur, Wissem Zouaghi, Ali Mabrouk
Abstract:
The major aim of cyber threat intelligence (CTI) is to provide sophisticated knowledge about cybersecurity threats to ensure internal and external safeguards against modern cyberattacks. Inaccurate, incomplete, outdated, and low-value threat intelligence is the main problem. Therefore, data analysis based on AI algorithms is one of the emergent solutions for overcoming threat-information-sharing issues. In this paper, we propose a supervised machine-learning-based algorithm to improve threat information sharing by providing a sophisticated classification of cyber threats and data. Extensive simulations investigate the accuracy, precision, recall, F1-score, and support to validate the designed algorithm and to compare it with several supervised machine learning algorithms.
Keywords: threat information sharing, supervised learning, data classification, performance evaluation
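The evaluation metrics named above can be computed directly from the confusion counts for one class; the counts in the sketch below are made up for illustration.

```python
# Sketch (with invented counts) of the metrics used to evaluate a
# per-class threat classifier: accuracy, precision, recall, F1, support.

def metrics(tp, fp, fn, tn):
    """Compute the standard metrics from confusion counts for one class."""
    accuracy = (tp + tn) / (tp + fp + fn + tn)
    precision = tp / (tp + fp)
    recall = tp / (tp + fn)
    f1 = 2 * precision * recall / (precision + recall)
    support = tp + fn  # number of true instances of the class
    return accuracy, precision, recall, f1, support

# Hypothetical results for one threat class.
acc, prec, rec, f1, sup = metrics(tp=40, fp=10, fn=10, tn=40)
print(acc, prec, rec, round(f1, 3), sup)  # → 0.8 0.8 0.8 0.8 50
```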
Procedia PDF Downloads 148
31351 Secured Embedding of Patient’s Confidential Data in Electrocardiogram Using Chaotic Maps
Authors: Butta Singh
Abstract:
This paper presents a chaotic-map-based approach for the secure embedding of a patient’s confidential data in the electrocardiogram (ECG) signal. The chaotic map generates predefined locations through the use of selective control parameters. The sample value difference method effectively hides the confidential data in ECG sample pairs at these predefined locations. Evaluation of the proposed method on all 48 records of the MIT-BIH arrhythmia ECG database demonstrates that the embedding does not alter the diagnostic features of the cover ECG. The imperceptibility of the secret data in the stego-ECG is evident from various statistical and clinical performance measures. The statistical metrics comprise Percentage Root Mean Square Difference (PRD) and Peak Signal to Noise Ratio (PSNR). Further, a comparative analysis between the proposed method and existing approaches was performed; the results clearly demonstrated the superiority of the proposed method.
Keywords: chaotic maps, ECG steganography, data embedding, electrocardiogram
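A minimal sketch of how a chaotic map can produce key-dependent embedding locations, assuming a logistic map and invented parameters; the paper's actual map and embedding details may differ.

```python
# Illustrative sketch, not the paper's scheme: a logistic chaotic map
# (x -> r*x*(1-x)) driven by secret control parameters (r, x0) yields a
# reproducible sequence of ECG sample-pair indices for data embedding.

def chaotic_locations(n_samples, n_bits, r=3.99, x0=0.7):
    """Return n_bits distinct pair indices selected by the logistic map."""
    x, chosen, seen = x0, [], set()
    while len(chosen) < n_bits:
        x = r * x * (1 - x)              # logistic map iteration
        idx = int(x * (n_samples // 2))  # map chaos value to a pair index
        if idx not in seen:
            seen.add(idx)
            chosen.append(idx)
    return chosen

# Sender and receiver share (r, x0) and regenerate identical locations.
locs_tx = chaotic_locations(n_samples=1000, n_bits=8)
locs_rx = chaotic_locations(n_samples=1000, n_bits=8)
print(locs_tx == locs_rx)  # → True (deterministic for shared parameters)
```

In the full scheme, the sample value difference method would then hide one secret bit in the ECG sample pair at each generated location.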
Procedia PDF Downloads 195
31350 A Modeling Approach for Blockchain-Oriented Information Systems Design
Abstract:
Blockchain is regarded as a most promising technology with the potential to trigger a technological revolution. However, outside the bitcoin industry, we have not yet seen large-scale applications of blockchain in the domains it is supposed to impact, such as supply chains, financial networks, and intelligent manufacturing. The reasons lie not only in the difficulties of blockchain implementation but also in the challenges of blockchain-oriented information systems design. First, blockchain members are self-interested actors belonging to organizations with different existing information systems; since they expect different information inputs and outputs from the blockchain application, a common language protocol is needed to facilitate communication between blockchain members. Second, given the decentralization of blockchain organization, there is no central authority to organize and coordinate the business processes, so information systems built on blockchain should support more adaptive business processes. This paper aims to address these difficulties by providing a modeling approach for blockchain-oriented information systems design. We investigate the information structure of distributed-ledger data with conceptual modeling techniques and ontology theories, and build an effective ontology mapping method for inter-organization information flows and blockchain information records. Further, we study distributed-ledger-ontology-based business process modeling to support the adaptive enterprise on blockchain.
Keywords: blockchain, ontology, information systems modeling, business process
Procedia PDF Downloads 449
31349 An Evaluation and Guidance for mHealth Apps
Authors: Tareq Aljaber
Abstract:
The number of mobile health apps is growing rapidly; it nearly doubled between 2015 and 2016. However, there is a lack of an effective evaluation framework to verify the usability and reliability of mobile health education applications, a framework that would save time and effort for the numerous user groups. This abstract describes a framework for evaluating mobile applications, specifically mobile health education applications, along with a guidance selection tool to assist different users in selecting the most suitable mobile health education apps. The framework is intended to meet the requirements and needs of the different stakeholder groups and to enhance the development of mobile health education applications with software engineering approaches by producing new and more effective techniques for evaluating such software. This abstract highlights the significance and consequences of mobile health education apps before focusing on what is required to create an effective evaluation framework for these apps. An explanation of the evaluation framework is given, together with some specific evaluation metrics: an efficient hybrid of selected heuristic evaluation (HE) and usability evaluation (UE) metrics that enables determination of the usefulness and usability of health education mobile apps. Moreover, qualitative and quantitative outcomes for the evaluation framework were obtained using the Epocrates mobile phone app, in addition to some other mobile phone apps. The proposed framework, an Evaluation Framework for Mobile Health Education Apps, consists of a hybrid of five metrics selected from a larger set in usability evaluation and heuristic evaluation, informed by 15 unstructured interviews with software developers (SD), health professionals (HP), and patients (P).
These five metrics correspond to explicit facets of usability recognized through a requirements analysis of typical stakeholders of mobile health apps. The five hybrid metrics were spread across 24 specific questionnaire questions, which are available on request from the first author. The questionnaire was sent to 81 participants distributed across three stakeholder groups, software developers (SD), health professionals (HP), and patients/general users (P/GU), for the purpose of ranking three sets of mobile health education applications. Finally, the outcomes from the questionnaire data helped us achieve our aims: profiling the different stakeholders, profiling the different mobile health education application packages, ranking the different mobile health education applications, and guiding the construction of the selection guidance tool, which forms part of the Evaluation Framework for Mobile Health Education Apps.
Keywords: evaluation framework, heuristic evaluation, usability evaluation, metrics
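One way such a hybrid of metrics can be turned into an app ranking is a weighted score over stakeholder ratings; the metric names, weights, and ratings below are hypothetical, not the framework's actual five metrics.

```python
# Hypothetical scoring sketch: combining five evaluation metrics into a
# weighted score per app, then ranking the apps. The metric names,
# weights, and 1-5 ratings are invented for illustration only.

WEIGHTS = {"learnability": 0.25, "efficiency": 0.2, "error_prevention": 0.2,
           "satisfaction": 0.2, "content_quality": 0.15}

def score(ratings):
    """Weighted sum of stakeholder ratings (1-5 scale) for one app."""
    return sum(WEIGHTS[m] * ratings[m] for m in WEIGHTS)

apps = {
    "App X": {"learnability": 5, "efficiency": 4, "error_prevention": 4,
              "satisfaction": 5, "content_quality": 4},
    "App Y": {"learnability": 3, "efficiency": 3, "error_prevention": 4,
              "satisfaction": 3, "content_quality": 3},
}
ranking = sorted(apps, key=lambda a: score(apps[a]), reverse=True)
print(ranking)  # → ['App X', 'App Y']
```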
Procedia PDF Downloads 403
31348 The Effect of Information Technology on the Quality of Accounting Information
Authors: Mohammad Hadi Khorashadi Zadeh, Amin Karkon, Hamid Golnari
Abstract:
This study, conducted in 2014, investigated the impact of information technology on the quality of accounting information. From a population of 425 executives of companies listed on the Tehran Stock Exchange, a sample of 84 managers was selected using the Cochran formula and simple random sampling. Data were collected with a standardized questionnaire on the impact of information technology, with questions designed according to its existing components. After the distribution and collection of the questionnaires, data analysis and hypothesis testing were conducted using structural equation modeling in SmartPLS 2, in two parts: the measurement model and the structural model. In the first part, the technical characteristics of the questionnaire, including reliability and convergent and divergent validity, were checked for PLS; in the second part, significance coefficients were used to examine the research hypotheses. The results showed that information technology and its dimensions (timeliness, relevance, accuracy, adequacy, and transfer speed) affect the quality of accounting information of companies listed on the Tehran Stock Exchange.
Keywords: information technology, information quality, accounting, transfer speed
Procedia PDF Downloads 277
31347 Empirical Decomposition of Time Series of Power Consumption
Authors: Noura Al Akkari, Aurélie Foucquier, Sylvain Lespinats
Abstract:
Load monitoring is a management process for energy consumption aimed at energy savings and energy efficiency. Non-Intrusive Load Monitoring (NILM) is one load monitoring method used for disaggregation purposes. NILM is a technique for identifying individual appliances based on the analysis of whole-residence data retrieved from the main power meter of the house. Our NILM framework starts with data acquisition, followed by data preprocessing, then event detection and feature extraction, and finally general appliance modeling and identification. The event detection stage is a core component of the NILM process, since event detection techniques lead to the extraction of appliance features, which are required for the accurate identification of household devices. In this research work, we aim to develop a new event detection methodology with accurate load disaggregation to extract appliance features. The extracted time-domain features are used to tune general appliance models for the appliance identification and classification steps. We use unsupervised algorithms such as Dynamic Time Warping (DTW). The proposed method relies on detecting the areas of operation of each residential appliance based on the power demand, and then detecting the times at which each selected appliance changes its state. To fit the capabilities of existing smart meters, we work on low-sampling-rate data with a frequency of (1/60) Hz. The data is simulated with the Load Profile Generator (LPG) software, which had not previously been considered for NILM purposes in the literature. LPG is a numerical tool that simulates the behaviour of people inside the house to generate residential energy consumption data. The proposed event detection method targets low-consumption loads that are difficult to detect, and it facilitates the extraction of the specific features used for general appliance modeling.
In addition, the identification process includes unsupervised techniques such as DTW. To the best of our knowledge, few unsupervised techniques have been employed with low-sampling-rate data, in comparison to the many supervised techniques used for such cases. We extract the power interval within which the selected appliance operates, along with a time vector of the values delimiting the state transitions of the appliance. Appliance signatures are then formed from the extracted power, geometrical, and statistical features, and these signatures are used to tune general model types for appliance identification using unsupervised algorithms. The method is evaluated using both data simulated with LPG and the real-world Reference Energy Disaggregation Dataset (REDD). For this, we compute confusion-matrix-based performance metrics, considering accuracy, precision, recall, and error rate. The performance of our methodology is then compared with other detection techniques previously used in the literature, such as techniques based on statistical variations and abrupt changes (Variance Sliding Window and Cumulative Sum).
Keywords: general appliance model, non-intrusive load monitoring, event detection, unsupervised techniques
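As an illustration of the unsupervised matching step, a minimal pure-Python Dynamic Time Warping distance can compare a measured power profile against appliance templates; the power profiles below are invented.

```python
# A minimal dynamic time warping (DTW) distance, sketching the
# unsupervised matching step mentioned above. Templates and the
# measured power profile are toy values, not LPG or REDD data.

def dtw(a, b):
    """Classic O(len(a)*len(b)) DTW with absolute-difference cost."""
    inf = float("inf")
    n, m = len(a), len(b)
    d = [[inf] * (m + 1) for _ in range(n + 1)]
    d[0][0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = abs(a[i - 1] - b[j - 1])
            d[i][j] = cost + min(d[i - 1][j], d[i][j - 1], d[i - 1][j - 1])
    return d[n][m]

# A measured profile matches the kettle template better than the fridge one.
kettle_template = [0, 2000, 2000, 2000, 0]
fridge_template = [0, 120, 120, 120, 120, 0]
measured = [0, 1990, 2005, 1995, 0]
print(dtw(measured, kettle_template) < dtw(measured, fridge_template))  # → True
```

In the full pipeline, each detected event segment would be matched against tuned appliance signatures rather than raw templates.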
Procedia PDF Downloads 82
31346 Electromagnetically-Vibrated Solid-Phase Microextraction for Organic Compounds
Authors: Soo Hyung Park, Seong Beom Kim, Wontae Lee, Jin Chul Joo, Jungmin Lee, Jongsoo Choi
Abstract:
A newly developed electromagnetically vibrated solid-phase microextraction (SPME) device for extracting nonpolar organic compounds from aqueous matrices was evaluated in terms of sorption equilibrium time, precision, and detection level relative to three more conventional SPME extraction techniques, viz., static, magnetic stirring, and fiber insertion/retraction. Electromagnetic vibration at 300-420 cycles/s was found to be the most efficient extraction technique in terms of reducing the sorption equilibrium time and enhancing both precision and linearity. The increased efficiency of electromagnetic vibration was attributed to a greater reduction in the thickness of the stagnant water layer, which facilitated more rapid mass transport from the aqueous matrix to the SPME fiber. Electromagnetic vibration below 500 cycles/s also did not detrimentally affect the sustained extraction performance of the SPME fiber. Therefore, electromagnetically vibrated SPME may be a more powerful tool for rapid sampling and solvent-free sample preparation than the other, more conventional SPME extraction techniques.
Keywords: electromagnetic vibration, organic compounds, precision, solid-phase microextraction (SPME), sorption equilibrium time
Procedia PDF Downloads 254
31345 Drug and Poison Information Centers: An Emergent Need of Health Care Professionals in Pakistan
Authors: Asif Khaliq, Sayeeda A. Sayed
Abstract:
Drug information centers provide drug-related information to requesters, including physicians, pharmacists, nurses, and other allied health care professionals. The International Pharmaceutical Federation (FIP) describes the basic functions of a drug and poison information center as drug evaluation, therapeutic counseling, pharmaceutical advice, research, pharmacovigilance, and toxicology. Continuous advancement in the field of medicine has expanded the medical literature, which has increased the demand for drug and poison information centers to guide, support, and facilitate physicians. The objective of this study is to determine the need for drug and poison information centers in the public and private hospitals of Karachi, Pakistan. A cross-sectional study was conducted from July 2013 to April 2014 using a self-administered, multi-item questionnaire. Non-probability convenience sampling was used to select the study participants. A total of 307 physicians from public and private hospitals of Karachi participated in the study. The need for a 24/7 drug and poison information center was highlighted by 92% of physicians, and 67% of physicians suggested opening a drug information center at their hospital. It was reported that 70% of physicians take at least 15 minutes to search for information about a drug while managing a case. Regarding the management of poisoning cases, 52% of physicians complained about the unavailability of medicines in hospitals and mentioned the importance of medicines for the safe and timely management of patients. Although 73% of physicians had attended continuing medical education (CME) sessions, 92% of physicians insisted on the need for a 24/7 drug and poison information center. The scarcity of organized channels for obtaining information about drugs and poisons is one of the most crucial problems for health care workers in Pakistan.
The drug and poison information center is an advisory body that assists health care professionals and patients with appropriate information on drugs and hazardous substances, and it is integral to running an effective health care system. A 24/7 drug information center with specialized staff offers multiple benefits to hospitals by reducing treatment delays, addressing awareness gaps among all stakeholders, and ensuring the provision of quality health care.
Keywords: drug and poison information centers, Pakistan, physicians, public and private hospitals
Procedia PDF Downloads 327
31344 Traffic Prediction with Raw Data Utilization and Context Building
Authors: Zhou Yang, Heli Sun, Jianbin Huang, Jizhong Zhao, Shaojie Qiao
Abstract:
Traffic prediction is essential in a multitude of ways in modern urban life. Earlier work in this domain has chiefly had two major focuses: (1) the accurate forecast of future values in multiple time series and (2) knowledge extraction from spatial-temporal correlations. However, two key considerations for traffic prediction are often missed: the completeness of the raw data and the full context of the prediction timestamp. Concentrating on these two drawbacks of earlier work, we devise an approach that addresses them in a two-phase framework. First, we utilize the raw trajectories to a greater extent through building a VLA table and data compression. We obtain the intra-trajectory features with graph-based encoding and the inter-trajectory ones with a grid-based model and the technique of back projection, which restores their surrounding high-resolution spatial-temporal environment. To the best of our knowledge, we are the first to study direct feature extraction from raw trajectories for traffic prediction and to attempt the use of raw data with the least degree of reduction. In the prediction phase, we provide a broader context for the prediction timestamp by taking into account the information around it in the training dataset. Extensive experiments on several well-known datasets have verified the effectiveness of our solution, which combines the strength of raw trajectory data and prediction context. In terms of performance, our approach surpasses several state-of-the-art methods for traffic prediction.
Keywords: traffic prediction, raw data utilization, context building, data reduction
Procedia PDF Downloads 127
31343 Experimental Evaluation of Succinct Ternary Tree
Authors: Dmitriy Kuptsov
Abstract:
Tree data structures, such as binary or, in general, k-ary trees, are essential in computer science. Applications of these data structures range from data search and retrieval to sorting and ranking algorithms. Naive implementations of these data structures can consume prohibitively large volumes of random access memory, limiting their applicability in certain solutions; in these cases, more advanced representations are essential. In this paper we present the design of a compact version of the ternary tree data structure and report the results of an experimental evaluation using the static dictionary problem. We compare these results with those for binary and regular ternary trees. The evaluation study shows that our design, in the best case, consumes up to 12 times less memory (for the dictionary used in our experimental evaluation) than a regular ternary tree, and in certain configurations shows performance comparable to regular ternary trees. We evaluated the performance of the algorithms on both 32-bit and 64-bit operating systems.
Keywords: algorithms, data structures, succinct ternary tree, performance evaluation
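For contrast with the compact design, a plain (non-succinct) ternary search tree solving the static dictionary problem can be sketched as follows; this is the pointer-based baseline whose per-node memory a succinct representation would compress, not the paper's design.

```python
# Sketch of a plain (non-succinct) ternary search tree for the static
# dictionary problem. Each node stores a character, three child
# pointers, and an end-of-word flag -- the overhead a compact
# representation aims to eliminate.

class TSTNode:
    __slots__ = ("ch", "lo", "eq", "hi", "end")
    def __init__(self, ch):
        self.ch, self.lo, self.eq, self.hi, self.end = ch, None, None, None, False

def insert(node, word, i=0):
    """Insert word[i:] below node, returning the (possibly new) subtree root."""
    ch = word[i]
    if node is None:
        node = TSTNode(ch)
    if ch < node.ch:
        node.lo = insert(node.lo, word, i)
    elif ch > node.ch:
        node.hi = insert(node.hi, word, i)
    elif i + 1 < len(word):
        node.eq = insert(node.eq, word, i + 1)
    else:
        node.end = True
    return node

def contains(node, word, i=0):
    """Dictionary lookup: follow lo/hi on mismatch, eq on match."""
    if node is None:
        return False
    ch = word[i]
    if ch < node.ch:
        return contains(node.lo, word, i)
    if ch > node.ch:
        return contains(node.hi, word, i)
    if i + 1 < len(word):
        return contains(node.eq, word, i + 1)
    return node.end

root = None
for w in ["cat", "cap", "car", "dog"]:
    root = insert(root, w)
print(contains(root, "cap"), contains(root, "cow"))  # → True False
```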
Procedia PDF Downloads 160
31342 Evaluation and Selection of Contractors in Construction Projects with a Supply Chain Management View and Utilization of PROMETHEE
Authors: Sara Najiazarpour, Mahsa Najiazarpour
Abstract:
There are many problems in contracting projects and their performance. At each project stage and due to different reasons, these problems affect cost, time and overall project quality. Hence, in order to increase the efficiency and performance in all levels of the chain and with supply chain management approach, there will be a coordination from the beginning of a project (contractor selection) to the end of project (handover of project). Contractor selection is the foremost part of construction projects which in this multi-criteria decision-making, the best contractor is determined by expert judgment, different variables and their priorities. In this paper for selecting the best contractor, numerous criteria were collected by asking from adept experts and then among them, 16 criteria with highest frequency were considered for questionnaire. This questionnaire was distributed between experts. Cronbach's alpha coefficient was obtained as 72%. Then based on Borda's function 12 important criteria was selected which was categorized in four main criteria and related sub-criteria as follow: Environmental factors and physical equipment: procurement and materials (supplier), company's machines, contractor’s proposed cost estimate - financial capacity: bank turnover and company's assets, the income of tax declaration in last year, Ability to compensate for losses or delays - past performance- records and technical expertise: experts and key personnel, the past technical backgrounds and experiences, employer satisfaction of previous contracts, the number of similar projects was done - standards: rank and field of expertise which company is qualified for and its validity, availability and number of permitted projects done. Then with PROMTHEE method, the criteria were normalized and monitored, finally the best alternative was selected. In this research, qualitative criteria of each company is became a quantitative criteria. 
Finally, the information of several companies was evaluated, and the best contractor was selected based on all criteria and their priorities.Keywords: contractor evaluation and selection, project development, supply chain management, PROMETHEE method
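The PROMETHEE ranking step described above can be sketched as follows. This is a minimal PROMETHEE II illustration for three hypothetical contractors: the scores, the weights, and the use of the "usual" (strict-dominance) preference function are assumptions for demonstration, not the study's data.

```python
# Minimal PROMETHEE II sketch for contractor ranking.
# Scores, weights, and the preference function are illustrative only.
criteria_weights = [0.4, 0.3, 0.2, 0.1]  # must sum to 1
# rows: contractors; columns: normalized criterion scores (higher is better)
scores = {
    "A": [0.8, 0.6, 0.9, 0.5],
    "B": [0.7, 0.9, 0.4, 0.8],
    "C": [0.5, 0.7, 0.8, 0.6],
}

def preference(a, b):
    """Aggregated preference index pi(a, b) with the 'usual' preference
    function: P = 1 if a strictly beats b on a criterion, else 0."""
    return sum(w * (1.0 if sa > sb else 0.0)
               for w, sa, sb in zip(criteria_weights, scores[a], scores[b]))

names = list(scores)
n = len(names)
net_flow = {}
for a in names:
    # positive flow: how strongly a outranks the others, on average
    phi_plus = sum(preference(a, b) for b in names if b != a) / (n - 1)
    # negative flow: how strongly the others outrank a, on average
    phi_minus = sum(preference(b, a) for b in names if b != a) / (n - 1)
    net_flow[a] = phi_plus - phi_minus

# complete ranking: sort by net outranking flow, best first
ranking = sorted(names, key=net_flow.get, reverse=True)
print(ranking, net_flow)
```

The net flows always sum to zero across alternatives, which is a quick sanity check on an implementation.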
Procedia PDF Downloads 7231341 An Experiential Learning of Ontology-Based Multi-document Summarization by Removal Summarization Techniques
Authors: Pranjali Avinash Yadav-Deshmukh
Abstract:
The remarkable development of the Internet, along with technological innovations such as high-speed systems and affordable large storage, has led to a tremendous increase in the amount and accessibility of digital records. For any person, studying all of these data is extremely time intensive, so there is a great need for effective multi-document summarization (MDS) systems, which can reduce the details found in several records to a short, understandable summary. Our system provides a significant theoretical structure for the semantic representation of textual details in an ontology. The suitability of using an ontology to address multi-document summarization problems in the disaster-management domain is demonstrated by the proposed design. A saliency score is assigned to each sentence, the sentences are ranked accordingly, and the top-ranked sentences are chosen as the summary. Regarding summary quality, extensive tests were conducted on a selection of news reports about the "Jammu Kashmir flood of 2014". Ontology-based multi-document summarization methods using NLP-based extraction outperform the other baselines. Our contribution in the proposed component is to apply NLP-based extraction techniques to improve the results.Keywords: disaster management, extraction technique, k-means, multi-document summarization, NLP, ontology, sentence extraction
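The extractive pipeline described above (score each sentence for saliency, rank, keep the top-ranked sentences) can be sketched as follows. This minimal version uses raw term frequencies as the saliency signal; the ontology-based system would instead score sentences against domain concepts, so this is an illustrative assumption, not the paper's method.

```python
# Hedged sketch of saliency-ranked extractive summarization using
# term frequency as the saliency signal (an ontology-based scorer
# would replace the freq lookup with concept matching).
import re
from collections import Counter

def summarize(documents, top_k=2):
    # split every document into sentences on terminal punctuation
    sentences = [s.strip() for doc in documents
                 for s in re.split(r"(?<=[.!?])\s+", doc) if s.strip()]
    # corpus-wide term frequencies drive the saliency score
    words = [w for s in sentences for w in re.findall(r"\w+", s.lower())]
    freq = Counter(words)

    def saliency(sentence):
        tokens = re.findall(r"\w+", sentence.lower())
        return sum(freq[t] for t in tokens) / max(len(tokens), 1)

    # rank sentences by saliency and keep the top_k as the summary
    ranked = sorted(sentences, key=saliency, reverse=True)
    return ranked[:top_k]
```

A real MDS system would also de-duplicate near-identical sentences before selection; that step is omitted here for brevity.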
Procedia PDF Downloads 38631340 Satisfaction Evaluation on the Fundamental Public Services for a Large-Scale Indemnificatory Residential Community: A Case Study of Nanjing
Authors: Dezhi Li, Peng Cui, Bo Zhang, Tengyuan Chang
Abstract:
In order to solve the housing problem for low-income families, the construction of affordable housing is booming in China. However, for various reasons, the service facilities and systems in indemnificatory residential communities face many problems. This article establishes a satisfaction evaluation system for the fundamental public services of large-scale indemnificatory residential communities, based on national standards and local criteria, and develops the corresponding evaluation methods and processes. Finally, in the case of the Huagang project in Nanjing, the satisfaction with basic public services is calculated from a survey of local residents.Keywords: indemnificatory residential community, public services, satisfaction evaluation, structural equation modeling
Procedia PDF Downloads 36231339 Determination of Complexity Level in Okike's Merged Irregular Transposition Cipher
Authors: Okike Benjami, Garba Ejd
Abstract:
Today, it has been observed that the security of information along the superhighway is often compromised by those who are not authorized to access it. To ensure its security, such information should be encrypted by some means that conceals its real meaning. There are many encryption techniques on the market; however, some of them are decrypted by adversaries with ease. The researcher has therefore developed an encryption technique that may be more difficult to decrypt. This is achieved by splitting the message to be encrypted into parts, encrypting each part separately, and swapping the positions of the parts before transmitting the message along the superhighway. The method is termed Okike's Merged Irregular Transposition Cipher. The research also determines the complexity level with respect to the number of splits of the message.Keywords: transposition cipher, merged irregular cipher, encryption, complexity level
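The split-encrypt-swap idea can be sketched as follows. The split sizes, the simple columnar transposition applied to each part, and the reversal used as the swap pattern are all illustrative assumptions for demonstration, not Okike's published parameters.

```python
# Illustrative sketch of a merged irregular transposition: the message
# is split into unequal parts, each part is transposed separately, and
# the parts are swapped before transmission. Parameters are assumptions.
def transpose(part, key=3):
    # simple columnar transposition: read the part column by column
    return "".join(part[i::key] for i in range(key))

def encrypt(message, splits=(5, 7)):
    # irregular split: unequal part sizes, remainder forms the last part
    parts, start = [], 0
    for size in splits:
        parts.append(message[start:start + size])
        start += size
    parts.append(message[start:])
    # encrypt each part separately, then swap part positions
    parts = [transpose(p) for p in parts]
    return "".join(reversed(parts))
```

Because a transposition only reorders characters, the ciphertext is always a permutation of the plaintext, which the complexity analysis in the abstract would quantify as a function of the number of splits.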
Procedia PDF Downloads 28931338 LGG Architecture for Brain Tumor Segmentation Using Convolutional Neural Network
Authors: Sajeeha Ansar, Asad Ali Safi, Sheikh Ziauddin, Ahmad R. Shahid, Faraz Ahsan
Abstract:
The most aggressive form of brain tumor is called glioma. Glioma is a kind of tumor that arises from the glial tissue of the brain and occurs quite often. A fully automatic 2D-CNN model for brain tumor segmentation is presented in this paper. We performed pre-processing steps to remove noise and intensity variances using N4ITK and standard intensity correction, respectively. We used the Keras open-source library with Theano as the backend for fast implementation of the CNN model, and we evaluated the proposed model on the BRATS 2015 MRI dataset. Furthermore, we used the SimpleITK open-source library to analyze the images. We extracted random 2D patches for the proposed 2D-CNN model for efficient brain segmentation; extracting 2D patches instead of 3D ones reduces the dimensional information to be processed, and hence the computational time. The Dice Similarity Coefficient (DSC) is used as the performance measure for evaluating the proposed method, which achieved DSC scores of 0.77 for the complete, 0.76 for the core, and 0.77 for the enhanced tumor regions. These results are comparable with those of previously implemented 2D CNN architectures.Keywords: brain tumor segmentation, convolutional neural networks, deep learning, LGG
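The Dice Similarity Coefficient used for evaluation above is defined as DSC = 2|A ∩ B| / (|A| + |B|) for a predicted mask A and a ground-truth mask B. A minimal pure-Python sketch for flat binary masks:

```python
# Dice Similarity Coefficient for flat binary segmentation masks.
def dice_coefficient(pred, truth):
    """DSC = 2|A intersect B| / (|A| + |B|)."""
    inter = sum(p and t for p, t in zip(pred, truth))
    denom = sum(pred) + sum(truth)
    # convention: two empty masks agree perfectly
    return 2.0 * inter / denom if denom else 1.0
```

In practice the masks are flattened 2D (or 3D) arrays; DSC ranges from 0 (no overlap) to 1 (perfect overlap), so the reported scores of 0.76 to 0.77 indicate substantial but imperfect agreement with the ground truth.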
Procedia PDF Downloads 18231337 Investigation of the Physicochemistry in Leaching of Blackmass for the Recovery of Metals from Spent Lithium-Ion Battery
Authors: Alexandre Chagnes
Abstract:
The lithium-ion battery is the technology of choice in the development of electric vehicles. This technology is now mature, although there are still many challenges in increasing energy density while ensuring irreproachable safety of use. To this end, it is necessary to develop new cathodic materials that can be cycled at higher voltages, together with electrolytes compatible with these materials. But the challenge does not only concern the production of efficient batteries for the electrochemical storage of energy, since lithium-ion battery technology relies on resources of critical and/or strategic value. It is, therefore, crucial to include lithium-ion battery development in a circular economy approach very early. In particular, optimized recycling and reuse of battery components must both minimize their impact on the environment and limit the geopolitical issues related to tensions over the mineral resources necessary for lithium-ion battery production. Although recycling will never replace mining, it reduces resource dependence by ensuring the presence of exploitable resources in the territory, which is particularly important for countries like France, where exploited or exploitable resources are limited. This conference addresses the development of a new hydrometallurgical process combining the leaching of cathodic material from spent lithium-ion batteries in acidic chloride media with a solvent extraction process. Most recycling processes reported in the literature rely on the sulphate route, and few studies investigate the potential of the chloride route despite its many advantages and the possibility of developing new chemistry that could make metal separation easier. The leaching mechanisms and the solvent extraction equilibria will be presented in this conference.
Based on an understanding of the physicochemistry of leaching and solvent extraction, the present study will introduce a new hydrometallurgical process for the production of cobalt, nickel, manganese, and lithium from spent cathodic materials.Keywords: lithium-ion battery, recycling, hydrometallurgy, leaching, solvent extraction
Procedia PDF Downloads 8031336 Development of Standard Thai Appetizer in Rattanakosin Era‘s Standard: Case Study of Thai Steamed Dumpling
Authors: Nunyong Fuengkajornfung, Pattama Hirunyophat, Tidarat Sanphom
Abstract:
The objectives of this research were: to establish a standard recipe for Thai steamed dumpling, to study the ratio of modified starch in Thai steamed dumpling, and to analyze the chemical composition and test for Escherichia coli in Thai steamed dumpling. The experiment was designed in two stages: a study of the standard recipe for Thai steamed dumpling, and a study of rice flour to modified starch ratios at three levels, 90:10, 70:30, and 50:50. Sensory evaluation used the 9-point hedonic scale method for color, smell, taste, texture, and overall liking, in an experiment arranged as a Randomized Complete Block Design (RCBD). The statistics used in data analysis were means, standard deviations, one-way ANOVA, Duncan's New Multiple Range Test, and regression equations, at a statistical significance level of .05. The results showed that the standard recipe was chosen from three candidate recipes by sensory evaluation of color, odor, taste, spiciness, texture, and overall acceptance; the second recipe was found most suitable for development. Among the three rice flour to modified starch ratios (90:10, 70:30, and 50:50), the 50:50 condition received the best scores (like moderately to like very much on the 9-point hedonic scale). Chemical analysis showed 58.63% moisture, 5.45% fat, 4.35% protein, 30.45% carbohydrate, and 1.12% ash. Escherichia coli was not detected in laboratory testing.Keywords: Thai snack in Rattanakosin era, Thai steamed dumpling, modified starch, recipe standard
Procedia PDF Downloads 32431335 Guidance for Strengthening Ethics of Entrepreneurs in Information and Communication Technology Professional
Authors: Routsukol Sunalai
Abstract:
The objectives of this paper were to study the current ethical problems of entrepreneurs in the information and communication technology profession and to build their awareness of ethics, which would be useful as guidance for strengthening professional ethics among them. The study employed a quantitative research method in order to analyze the relationships and differences found in each ethics factor and to report them statistically. The sample of this paper comprised 300 information technology users of Rajabhat Universities in Bangkok. The findings revealed that the ethics factors that gained the highest and high levels of opinion included possessing principles of righteousness, having trust in themselves and others, and respecting the opinions of others and accepting that people hold different opinions.Keywords: communication, ethics, information, entrepreneurs
Procedia PDF Downloads 41131334 Choice Experiment Approach on Evaluation of Non-Market Farming System Outputs: First Results from Lithuanian Case Study
Authors: A. Novikova, L. Rocchi, G. Startiene
Abstract:
Market and non-market outputs are produced jointly in agriculture; their supply depends on the intensity and type of production. The role of agriculture as an economic activity and its effects are important for the Lithuanian case study, as agricultural land covers more than half of the country. The positive and negative externalities created in agriculture are not considered by the market. Therefore, specific techniques such as stated preference methods, in particular choice experiments (CE), are used for the evaluation of non-market outputs in agriculture. The main aim of this paper is to present the construction of a research path for the evaluation of non-market farming system outputs in Lithuania. Conventional and organic farming, covering crop production (both cereal and industrial crops) and livestock production (both dairy and cattle), were selected. The CE method and a nested logit (NL) model were selected as appropriate for the evaluation of non-market outputs of different farming systems in Lithuania. A pilot survey was implemented between October and November 2018 in order to test and improve the CE questionnaire. The results of the survey showed that the questionnaire was accepted and well understood by the respondents, and the econometric modelling showed that the selected NL model could be used for the main survey. The survey also identified how residents understand the differences between organic and conventional farming: it was revealed that they are more willing to choose organic farming than conventional farming.Keywords: choice experiments, farming system, Lithuania, market outputs, non-market outputs
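The logit family of models used in choice experiments rests on the choice probability P(i) = exp(V_i) / Σ_j exp(V_j), where V_i is the systematic utility of alternative i. A nested logit adds nest-level scale parameters on top of this core; the flat version below is an illustrative sketch only, not the authors' estimation code.

```python
# Core choice-probability calculation behind logit-family models.
# Flat (multinomial) form; a nested logit would group alternatives
# into nests with their own scale parameters. Illustrative only.
import math

def choice_probabilities(utilities):
    """Return P(i) = exp(V_i) / sum_j exp(V_j) for each alternative."""
    m = max(utilities)                       # stabilize the exponentials
    exps = [math.exp(u - m) for u in utilities]
    total = sum(exps)
    return [e / total for e in exps]
```

Estimation then searches for the utility coefficients that maximize the likelihood of the observed choices; the willingness-to-pay figures typical of CE studies are ratios of those coefficients.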
Procedia PDF Downloads 12931333 The Analysis of Defects Prediction in Injection Molding
Authors: Mehdi Moayyedian, Kazem Abhary, Romeo Marian
Abstract:
This paper presents an evaluation of a plastic defect in injection molding before it occurs in the process: the short shot defect. The aim of this paper is to evaluate the different parameters that affect the possibility of a short shot defect. The analysis of short shot possibility is conducted via SolidWorks Plastics and the Taguchi method to determine the most significant parameters. The Finite Element Method (FEM) is employed to analyze two circular flat polypropylene plates of 1 mm thickness. Filling time, part cooling time, pressure holding time, and melt temperature are chosen as process parameters, and gate type as a geometric parameter. A methodology is presented herein to predict the possibility of short-shot occurrence. The analysis determined that melt temperature is the most influential parameter affecting the possibility of a short shot defect, with a contribution of 74.25%, followed by filling time with a contribution of 22% and gate type with a contribution of 3.69%. It was also determined that the optimum level of each parameter for reducing the possibility of a short shot is gate type at level 1, filling time at level 3, and melt temperature at level 3. In summary, the most significant parameters affecting the possibility of a short shot were determined to be melt temperature, filling time, and gate type.Keywords: injection molding, plastic defects, short shot, Taguchi method
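The percentage contributions reported above (74.25%, 22%, 3.69%) follow from the standard Taguchi/ANOVA decomposition: each factor's contribution is its sum of squares divided by the total sum of squares. A hedged sketch, with illustrative sums of squares chosen to match the reported percentages rather than taken from the study's raw ANOVA table:

```python
# Percentage contribution of each factor from an ANOVA-style
# sum-of-squares table: contribution = 100 * SS_factor / SS_total.
# The SS values below are illustrative, not the study's raw data.
def contributions(sum_of_squares):
    total = sum(sum_of_squares.values())
    return {factor: 100.0 * ss / total
            for factor, ss in sum_of_squares.items()}

ss = {"melt temperature": 74.25, "filling time": 22.0,
      "gate type": 3.69, "error": 0.06}
print(contributions(ss))
```

The contributions always sum to 100%, with the residual (error) term absorbing whatever the controlled factors do not explain.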
Procedia PDF Downloads 21831332 Management Information System to Help Managers for Providing Decision Making in an Organization
Authors: Ajayi Oluwasola Felix
Abstract:
A management information system (MIS) provides information for the managerial activities in an organization. The main purpose of this research is to show that an MIS provides the accurate and timely information necessary to facilitate the decision-making process and to enable the organization's planning, control, and operational functions to be carried out effectively. An MIS is basically concerned with processing data into information, which is then communicated to the various departments in an organization for appropriate decision-making. MIS is a subset of the overall planning and control activities covering the application of the humans, technologies, and procedures of the organization. The information system is the mechanism that ensures information is available to the managers in the form they want it and when they need it.Keywords: Management Information Systems (MIS), information technology, decision-making, MIS in Organizations
Procedia PDF Downloads 55631331 Selection of Relevant Servers in Distributed Information Retrieval System
Authors: Benhamouda Sara, Guezouli Larbi
Abstract:
Nowadays, the dissemination of information touches the distributed world, where selecting the servers relevant to a user request is an important problem in distributed information retrieval. During the last decade, several research studies on this issue have been launched to find optimal solutions, and many collection selection approaches have been proposed. In this paper, we propose a new collection selection approach that takes into consideration the number of documents in a collection that contain the terms of the query and the weights of those terms in these documents. We tested our method, and our studies show that this technique can compete with the state-of-the-art algorithms chosen to benchmark the performance of our approach.Keywords: distributed information retrieval, relevance, server selection, collection selection
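The scoring idea described above (rank collections by how many of their documents contain the query terms, weighted by how important those terms are) can be sketched as a simplified document-frequency-based ranking. The formula below is an illustrative assumption in the spirit of CORI-style collection selection, not the authors' exact measure.

```python
# Hedged sketch of df-based collection (server) selection: score each
# collection by the fraction of its documents containing each query
# term, damped by a log weight. Illustrative formula, not the paper's.
import math

def score_collection(query_terms, doc_freq, num_docs):
    """doc_freq maps term -> number of docs in this collection
    containing it; num_docs is the collection size."""
    score = 0.0
    for term in query_terms:
        df = doc_freq.get(term, 0)
        if df:
            score += (df / num_docs) * math.log(1 + df)
    return score

def select_servers(query_terms, collections, top_k=1):
    # each collection: {"name": ..., "df": {...}, "n": size}
    ranked = sorted(collections,
                    key=lambda c: score_collection(query_terms,
                                                   c["df"], c["n"]),
                    reverse=True)
    return [c["name"] for c in ranked[:top_k]]
```

Only the top-ranked servers are then queried, which is what makes collection selection pay off: the broker avoids broadcasting every query to every server.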
Procedia PDF Downloads 312