Search results for: pharmaceutical procurement supplier selection criteria
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 5623

4903 From Type-I to Type-II Fuzzy System Modeling for Diagnosis of Hepatitis

Authors: Shahabeddin Sotudian, M. H. Fazel Zarandi, I. B. Turksen

Abstract:

Hepatitis is one of the most common and dangerous diseases affecting humankind, exposing millions of people to serious health risks every year. Its diagnosis has always been a challenge for physicians. This paper presents an effective method for the diagnosis of hepatitis based on interval Type-II fuzzy logic. The proposed system comprises three steps: pre-processing (feature selection), Type-I and Type-II fuzzy classification, and system evaluation. KNN-FD feature selection is used in the pre-processing step to exclude irrelevant features and to improve classification performance and the efficiency of generating the classification model. In the fuzzy classification step, an “indirect approach” is used for fuzzy system modeling, with the exponential compactness and separation index determining the number of rules in the fuzzy clustering approach. A Type-I fuzzy system built this way achieved an accuracy of approximately 90.9%. Because the diagnostic process involves vagueness and uncertainty in the final decision, the imprecise knowledge was then managed using interval Type-II fuzzy logic. The results show that the interval Type-II system diagnoses hepatitis with an average accuracy of 93.94%, the highest classification accuracy reached thus far. This demonstrates that the Type-II fuzzy system outperforms the Type-I system and indicates its higher capability for modeling uncertainty.
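As an informal illustration of the interval Type-II idea described above (a sketch, not the authors' system), a Gaussian membership function whose spread is uncertain yields a lower and an upper membership bound — the footprint of uncertainty. The "elevated bilirubin" fuzzy set and all numbers below are hypothetical:

```python
import math

def gaussian(x, mean, sigma):
    """Standard Gaussian membership value in [0, 1]."""
    return math.exp(-0.5 * ((x - mean) / sigma) ** 2)

def interval_type2_membership(x, mean, sigma_lo, sigma_hi):
    """Interval Type-II membership: a Type-I Gaussian whose spread is
    uncertain between sigma_lo and sigma_hi yields a lower bound (narrow
    curve) and an upper bound (wide curve) -- the footprint of uncertainty."""
    lower = gaussian(x, mean, sigma_lo)
    upper = gaussian(x, mean, sigma_hi)
    return lower, upper

# Hypothetical fuzzy set "elevated bilirubin" centred at 1.2 mg/dL,
# evaluated for a patient measurement of 2.0 mg/dL
lo, hi = interval_type2_membership(2.0, mean=1.2, sigma_lo=0.4, sigma_hi=0.8)
midpoint = (lo + hi) / 2  # the simplest possible type-reduction
```

A full interval Type-II classifier would combine such bounds across fuzzy rules and apply a proper type-reduction step; the midpoint here is only the crudest reduction.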

Keywords: hepatitis disease, medical diagnosis, type-I fuzzy logic, type-II fuzzy logic, feature selection

Procedia PDF Downloads 307
4902 The Role of Cyfra 21-1 in Diagnosing Non-Small Cell Lung Cancer (NSCLC)

Authors: H. J. T. Kevin Mozes, Dyah Purnamasari

Abstract:

Background: Lung cancer is the fourth most common cancer in Indonesia, and 85% of all lung cancer cases are non-small cell lung cancer (NSCLC). The indistinct signs and symptoms of NSCLC sometimes lead to misdiagnosis. The gold standard for the diagnosis of NSCLC is histopathological biopsy, which is invasive. Cyfra 21-1 is a tumor marker that can be found in the intermediate filament proteins of the epithelium. The accuracy of Cyfra 21-1 in diagnosing NSCLC is not yet well established, so this report was prepared to answer that question. Methods: A literature search was conducted using the online databases ProQuest and PubMed. Studies were then selected according to inclusion and exclusion criteria, and the selected literature was appraised against the criteria of validity, importance, and applicability. Results: Of the six journals appraised, five are valid. The sensitivity values reported across the five studies range from 50% to 84.5%, while specificity ranges from 87.8% to 94.4%. The likelihood ratios of the appraised studies range from 5.09 to 10.54, which is categorized as intermediate to high. Conclusion: Serum Cyfra 21-1 is a sensitive and very specific tumor marker for the diagnosis of non-small cell lung cancer (NSCLC).
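The summary statistics above follow from the standard definitions of the likelihood ratios; a minimal sketch (the 0.845/0.92 pair is one illustrative combination drawn from the reported ranges, not a figure from any single appraised study):

```python
def likelihood_ratios(sensitivity, specificity):
    """Positive and negative likelihood ratios of a diagnostic test.
    LR+ = sens / (1 - spec);  LR- = (1 - sens) / spec."""
    lr_pos = sensitivity / (1.0 - specificity)
    lr_neg = (1.0 - sensitivity) / specificity
    return lr_pos, lr_neg

# Illustrative values near the upper end of the reported ranges
lr_pos, lr_neg = likelihood_ratios(sensitivity=0.845, specificity=0.92)
```

With these inputs LR+ is roughly 10.6, consistent in magnitude with the 5.09–10.54 range reported above.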

Keywords: cyfra 21-1, diagnosis, non-small cell lung cancer, NSCLC, tumor marker

Procedia PDF Downloads 232
4901 Dams Operation Management Criteria during Floods: Case Study of Dez Dam in Southwest Iran

Authors: Ali Heidari

Abstract:

This paper presents principles for improving flood mitigation operation in multipurpose dams and maximizing reservoir performance during floods, with a focus on the real-time operation of gated spillways. The operation criteria include the safety of the dam during flood management, minimizing downstream flood risk by decreasing the flood hazard, and fulfilling water supply and the other purposes of dam operation over mid- and long-term horizons. The important parameters include flood inflow, outlet capacity restrictions, downstream flood inundation damages, economic revenue of dam operation, and environmental and sedimentation restrictions. A simulation model was used to determine the real-time release of the Dez dam, located on the Dez River in southwest Iran, considering the gate regulation curves of the gated spillway. The results of the simulation model show that the current procedures used in the real-time operation of the dam can be improved, particularly by using gate regulation curves and early flood forecasting results. The Dez dam operation data show that in one of its best flood control records, 17% of the total active volume and flood control pool of the reservoir was not used to decrease the downstream flood hazard, despite the availability of a flood forecasting system.
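A minimal level-pool sketch of the kind of real-time release simulation described above (this is an illustrative simplification, not the authors' model; the hydrograph, release limit, and capacity below are hypothetical):

```python
def route_flood(inflows, storage0, max_release, capacity, dt=3600.0):
    """Minimal level-pool routing: at each step, release up to max_release
    (the gated-spillway limit) and track reservoir storage.
    inflows and max_release in m^3/s, storages in m^3, dt in seconds."""
    storage, peak_storage = storage0, storage0
    for q_in in inflows:
        # cannot release more water than inflow plus what is stored
        release = min(max_release, q_in + storage / dt)
        storage += (q_in - release) * dt
        storage = min(max(storage, 0.0), capacity)  # clamp; spill ignored here
        peak_storage = max(peak_storage, storage)
    return storage, peak_storage

# Hypothetical triangular hydrograph (m^3/s), hourly steps
final, peak = route_flood([100, 500, 1000, 600, 200],
                          storage0=0.0, max_release=400.0, capacity=1e9)
```

The peak storage over the event is the flood-control volume the reservoir must commit for this release policy.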

Keywords: dam operation, flood control criteria, Dez dam, Iran

Procedia PDF Downloads 227
4900 Syntactic Analyzer for Tamil Language

Authors: Franklin Thambi Jose.S

Abstract:

Computational linguistics is a branch of linguistics that deals with language at the computational level; in other words, it applies computer techniques to the study of language. Within computational linguistics, natural language processing plays an important role, having emerged with the development of information technology. In computational syntax, a syntactic analyzer breaks a sentence into phrases and clauses and annotates the sentence with syntactic information. Tamil is one of the major Dravidian languages, with a written history of more than 2000 years. It is mainly spoken in Tamil Nadu (India), Sri Lanka, Malaysia, and Singapore, where it is also an official language; in Malaysia, Tamil-speaking people are recognized as an ethnic group. For this research, Tamil sentences are classified into four types: 1. main sentences, 2. interrogative sentences, 3. equational sentences, and 4. elliptical sentences. The first step in computational syntax is to provide the required information about the head and constituents of each sentence type. This information is incorporated into the system using a programming language, after which the system can analyze a given sentence against the criteria or mechanisms given to it. Providing the computer with the criteria and mechanisms needed to identify these basic sentence types using a syntactic parser for Tamil is the major objective of this paper.

Keywords: tamil, syntax, criteria, sentences, parser

Procedia PDF Downloads 517
4899 An Analysis of Possible Implications of Patent Term Extension in Pharmaceutical Sector on Indian Consumers

Authors: Anandkumar Rshindhe

Abstract:

Patents are considered a "good monopoly" in India: a mechanism that encourages inventors to invent and to make new, useful technology available to society at large. The patent system protects not the invention itself but the claims (rights) that the patentee has identified in relation to the invention. The patentee is thus granted a monopoly to the extent of the rights claimed over the invention's utilities, while its remaining utilities are left to the public. Both the inventor and the public — the ultimate consumer — therefore benefit. Developing such technology, however, is not free of cost; inventors invest heavily in bringing out new technologies. Pharmaceutical companies are one such example: they invest a great deal of money, time, and labour in the research leading to these inventions. Once an invention is made or a process identified, the inventor approaches the patent system to protect it through claims over the invention. The patent system takes its own time to recognize the invention as a patent, and even after the grant, pharmaceutical companies must comply with many other legal formalities before launching the drug (medicine) on the market. A major portion of the patent term is therefore unproductive for the patentee, and the limited period remaining is often insufficient to recover the cost of the invention; as a result, the price of the patented product is raised sharply just to recover that cost. The burden ultimately falls on the consumer, who pays more only because the legislature has failed to provide for the delay and loss caused to the patentee. This problem could be effectively remedied by patent term extension, which would give the inventor more time to recover the cost of the invention, allowing the end product to be cheaper than it would be without the extension. The basic question is this: when the patent period granted is only 20 years, and a major portion of it is spent complying with the legal formalities required before the medicine reaches the market, can the company recover its research investment within the limited period of monopoly that remains? Further, the Indian Patent Act contains provisions obliging the patentee to make the patented invention available in India at a reasonably affordable price. In light of these questions, is extending the patent term a proper solution and a necessary requirement to protect the interests of both the patentee and the ultimate consumer? The basic objective of this paper is to examine the implications of extending the patent term for Indian consumers: whether it benefits the patentee and the consumer, or imposes a hardship on the generic industry and the consumer.

Keywords: patent term extension, consumer interest, generic drug industry, pharmaceutical industries

Procedia PDF Downloads 453
4898 The Effect of Foundation on the Earth Fill Dam Settlement

Authors: Masoud Ghaemi, Mohammadjafar Hedayati, Faezeh Yousefzadeh, Hoseinali Heydarzadeh

Abstract:

Careful monitoring of earth dams to measure deformation caused by settlement and movement has always been a concern for engineers in the field. To measure the settlement and deformation of earth dams, a combined inclinometer and settlement instrument, commonly referred to as the IS instrument, is usually used. In some dams, where the alluvium is thick and cannot be removed (for technical, economic, and performance reasons), it is impossible to anchor the end of the IS instrument in the rock foundation. Engineers inevitably have to accept installing the pipes in the weak and deformable alluvial foundation, which leads to errors in the calculation of the actual (absolute) settlement in different parts of the dam body. The purpose of this paper is to present new, refined criteria for predicting settlement and deformation in earth dams. The study is based on the conditions at three dams with highly deformable alluvial foundations (Agh Chai, Narmashir, and Gilan-e Gharb) in order to derive settlement criteria that account for the alluvial foundation. To achieve this goal, the settlement of the dams was simulated using the finite difference method in the FLAC3D software, and the modeling results were compared with the IS instrument readings. Finally, the model was calibrated and the results validated using regression analysis, with the modeling parameters scrutinized against the real conditions; then, using MATLAB and its Curve Fitting Toolbox, new settlement criteria were derived based on the elasticity modulus, cohesion, friction angle, and density of the earth dam and the alluvial foundation.
The results of these studies show that, using the new criteria, the settlement and deformation of dams with alluvial foundations can be corrected after the instrument readings, greatly reducing the error in the IS instrument readings.
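The regression step described above can be illustrated with a closed-form ordinary least squares fit; the modulus/correction pairs below are hypothetical, not data from the three dams:

```python
def linear_fit(xs, ys):
    """Ordinary least squares fit y = a*x + b, in closed form."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    a = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))
    b = my - a * mx
    return a, b

# Hypothetical pairs: alluvium elasticity modulus (MPa) vs. settlement
# correction to apply to the IS reading (mm)
a, b = linear_fit([20, 40, 60, 80], [95, 60, 28, -5])
```

The study's actual criteria involve several parameters (modulus, cohesion, friction angle, density) fitted with MATLAB's Curve Fitting Toolbox; the single-variable fit here only shows the mechanics.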

Keywords: earth-fill dam, foundation, settlement, finite difference, MATLAB, curve fitting

Procedia PDF Downloads 198
4897 Window Analysis and Malmquist Index for Assessing Efficiency and Productivity Growth in a Pharmaceutical Industry

Authors: Abbas Al-Refaie, Ruba Najdawi, Nour Bata, Mohammad D. AL-Tahat

Abstract:

The pharmaceutical industry is an important component of health care systems throughout the world. Measuring a production unit's performance is crucial in determining whether it has achieved its objectives. This paper applies data envelopment analysis (DEA) window analysis to assess the efficiencies of two packaging lines, Allfill (new) and DP6, in the penicillin plant of a Jordanian medical company in 2010. The CCR and BCC models are used to estimate technical efficiency, pure technical efficiency, and scale efficiency. Further, the Malmquist productivity index is computed and employed to assess productivity growth relative to a reference technology. Two primary issues are addressed in computing Malmquist indices of productivity growth: the first is the measurement of productivity change over the period, while the second is the decomposition of productivity change into what are generally referred to as a ‘catching-up’ effect (efficiency change) and a ‘frontier shift’ effect (technological change). Results showed that the DP6 line outperforms the Allfill line in technical and pure technical efficiency, whereas the Allfill line outperforms the DP6 line in scale efficiency. The obtained efficiency values can guide production managers in making effective decisions related to operation, management, and plant size. Moreover, both machines exhibit clear fluctuations in technological change, which is the main reason for the positive total factor productivity change. That is, installing a new Allfill production line can be of great benefit to increasing productivity. In conclusion, DEA window analysis combined with the Malmquist index provides supportive measures for assessing efficiency and productivity in the pharmaceutical industry.
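The decomposition described above can be written directly once the four distance-function (efficiency) scores are available from the DEA runs; a sketch with illustrative numbers, not the plant's actual scores:

```python
import math

def malmquist(d_t_t, d_t_t1, d_t1_t, d_t1_t1):
    """Malmquist productivity index and its decomposition.
    d_a_b = efficiency of period-b data measured against the period-a
    frontier (e.g. a DEA score).  Returns (M, EC, TC) where
    M = EC * TC, EC is the catching-up effect and TC the frontier shift."""
    efficiency_change = d_t1_t1 / d_t_t
    technical_change = math.sqrt((d_t_t1 / d_t1_t1) * (d_t_t / d_t1_t))
    return efficiency_change * technical_change, efficiency_change, technical_change

# Illustrative scores for one line across two periods
m, ec, tc = malmquist(d_t_t=0.80, d_t_t1=0.90, d_t1_t=0.70, d_t1_t1=0.85)
```

M > 1 indicates total factor productivity growth; comparing EC and TC shows whether it came from catching up to the frontier or from the frontier itself shifting.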

Keywords: window analysis, malmquist index, efficiency, productivity

Procedia PDF Downloads 610
4896 A Methodology for Optimisation of Water Containment Systems

Authors: Amir Hedjripour

Abstract:

The required dewatering configuration for a contaminated sediment dam is discussed so as to meet a no-spill criterion for a defined average recurrence interval (ARI). There is an option for the sediment dam to pump the contaminated water to another storage facility before its capacity is exceeded. The system is subjected to a range of storm durations belonging to the design ARI, with concurrent dewatering to the other storage facility. The model is set up in 1-minute time intervals, and temporal patterns of storm events are used to disaggregate the total storm depth into partial durations. By running the model for selected storm durations, the maximum water volume in the dam is recorded as the critical volume, which indicates the required storage capacity for that storm duration. Runoff from the upstream catchment and direct rainfall over the dam's open area are calculated by taking into account the catchment's time of concentration. In total, 99 storm durations from 5 minutes to 72 hours were modelled together with five dewatering scenarios from 50 l/s to 500 l/s. The optimised dam/pump configuration is selected by plotting the critical points for all cases and the storage-dewatering envelopes. A simple economic analysis using present-value (PV) techniques is also presented to assist with the financial evaluation of each configuration and the selection of the best alternative.
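The 1-minute mass balance at the core of the method above can be sketched as follows (the runoff series and pump rate are hypothetical placeholders, not values from the study):

```python
def critical_volume(runoff_series, pump_rate, dt=60.0):
    """Track stored volume in the sediment dam at 1-minute steps while
    pumping to the other storage facility.  The maximum stored volume is
    the critical volume (required capacity) for this storm duration.
    runoff_series and pump_rate in m^3/s; returns volume in m^3."""
    volume, critical = 0.0, 0.0
    for q_in in runoff_series:
        volume = max(0.0, volume + (q_in - pump_rate) * dt)
        critical = max(critical, volume)
    return critical

# Hypothetical 4-minute runoff burst against a 0.3 m^3/s pump
vc = critical_volume([0.2, 0.5, 0.4, 0.1], pump_rate=0.3)
```

Repeating this over the 99 storm durations and 5 pump rates, and taking the maximum per scenario, traces out the storage-dewatering envelope used for the selection.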

Keywords: contaminated water, optimisation, pump, sediment dam

Procedia PDF Downloads 370
4895 Leveraging SHAP Values for Effective Feature Selection in Peptide Identification

Authors: Sharon Li, Zhonghang Xia

Abstract:

Post-database search is an essential phase in peptide identification using tandem mass spectrometry (MS/MS), refining the peptide-spectrum matches (PSMs) produced by database search engines. These engines frequently have difficulty differentiating between correct and incorrect peptide assignments. Despite advances in statistical and machine learning methods aimed at improving the accuracy of peptide identification, challenges remain in selecting critical features for these models. In this study, two machine learning models — a random forest and a support vector machine (SVM) — were applied to three datasets to enhance PSMs. SHAP values were utilized to determine the significance of each feature within the models. The experimental results indicate that the random forest model consistently outperformed the SVM across all datasets. Further analysis of SHAP values revealed that the importance of features varies depending on the dataset, indicating that a feature's role in model predictions can differ significantly. This variability in feature selection can lead to substantial differences in model performance, with false discovery rate (FDR) differences exceeding 50% between different feature combinations. Through SHAP value analysis, the most effective feature combinations were identified, significantly enhancing model performance.
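SHAP values are model-specific approximations of the classical Shapley attribution; for a toy model the exact quantity can be computed by brute force over feature coalitions. The feature names and subset scores below are hypothetical, chosen only to make the arithmetic checkable:

```python
from itertools import combinations
from math import factorial

def shapley_values(features, value):
    """Exact Shapley attributions for a set-valued model value(S):
    phi_i = sum over S not containing i of
            |S|! (n-|S|-1)! / n! * (value(S ∪ {i}) - value(S))."""
    n = len(features)
    phi = {}
    for i in features:
        others = [f for f in features if f != i]
        total = 0.0
        for r in range(n):
            for s in combinations(others, r):
                w = factorial(len(s)) * factorial(n - len(s) - 1) / factorial(n)
                total += w * (value(set(s) | {i}) - value(set(s)))
        phi[i] = total
    return phi

# Hypothetical PSM-scoring value function over two features
scores = {frozenset(): 0.0, frozenset({"xcorr"}): 0.5,
          frozenset({"dm"}): 0.2, frozenset({"xcorr", "dm"}): 0.9}
phi = shapley_values(["xcorr", "dm"], lambda s: scores[frozenset(s)])
```

The attributions sum to the full-model score minus the empty-model score (the efficiency property), which is what makes SHAP values comparable across features.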

Keywords: peptide identification, SHAP value, feature selection, random forest tree, support vector machine

Procedia PDF Downloads 30
4894 Content-Based Image Retrieval Using HSV Color Space Features

Authors: Hamed Qazanfari, Hamid Hassanpour, Kazem Qazanfari

Abstract:

In this paper, a method is provided for content-based image retrieval. A content-based image retrieval system searches an image database using the visual content of a query image in order to retrieve similar images. Aiming to simulate the human visual system's sensitivity to image edges and color features, the concept of the color difference histogram (CDH) is used. The CDH captures the perceptual color difference between two neighboring pixels with regard to colors and edge orientations. Since the HSV color space is close to the human visual system, the CDH is calculated in this color space. In addition, to enrich the color features, the color histogram in HSV color space is also used as a feature. Among the extracted features, efficient features are selected using entropy and correlation criteria, so that the final features capture image content most efficiently. The proposed method has been evaluated on three standard databases: Corel 5k, Corel 10k, and UKBench. Experimental results show that the accuracy of the proposed image retrieval method is significantly improved compared to recently developed methods.
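A quantised HSV colour histogram of the kind used as a complementary feature above can be sketched with the standard library's colorsys; the bin counts are an assumption for illustration, not the paper's configuration:

```python
import colorsys

def hsv_histogram(rgb_pixels, h_bins=8, s_bins=3, v_bins=3):
    """Normalised, quantised HSV colour histogram as a retrieval feature.
    rgb_pixels: iterable of (r, g, b) tuples with components in [0, 1]."""
    hist = [0] * (h_bins * s_bins * v_bins)
    for r, g, b in rgb_pixels:
        h, s, v = colorsys.rgb_to_hsv(r, g, b)  # each in [0, 1]
        hi = min(int(h * h_bins), h_bins - 1)
        si = min(int(s * s_bins), s_bins - 1)
        vi = min(int(v * v_bins), v_bins - 1)
        hist[(hi * s_bins + si) * v_bins + vi] += 1
    total = sum(hist)
    # normalise so images of different sizes are comparable
    return [c / total for c in hist]

feat = hsv_histogram([(1.0, 0.0, 0.0), (0.0, 1.0, 0.0),
                      (0.0, 0.0, 1.0), (1.0, 0.1, 0.1)])
```

The CDH itself additionally weights neighbouring-pixel colour differences by edge orientation; this plain histogram shows only the HSV quantisation step.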

Keywords: content-based image retrieval, color difference histogram, efficient features selection, entropy, correlation

Procedia PDF Downloads 250
4893 An Integrated Intuitionistic Fuzzy Elimination Et Choix Traduisant La REalite (IFELECTRE) Model

Authors: Babak Daneshvar Rouyendegh

Abstract:

The aim of this study is to develop and describe a new methodology for Multi-Criteria Decision-Making (MCDM) problems using an Intuitionistic Fuzzy Elimination Et Choix Traduisant la REalite (IFELECTRE) model. The proposed model enables decision-makers (DMs) to carry out their assessments using intuitionistic fuzzy numbers (IFNs). A numerical example is provided to demonstrate and clarify the proposed analysis procedure, and an empirical experiment is conducted to validate its effectiveness.
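For readers unfamiliar with the ELECTRE family, the crisp concordance index underlying it can be sketched as follows; the IFELECTRE model replaces these crisp scores with intuitionistic fuzzy numbers, but the aggregation idea is the same (scores and weights below are hypothetical):

```python
def concordance_matrix(alternatives, weights):
    """Crisp ELECTRE concordance index: C[i][j] is the total (normalised)
    weight of the criteria on which alternative i scores at least as well
    as alternative j."""
    total_w = sum(weights)
    n = len(alternatives)
    C = [[0.0] * n for _ in range(n)]
    for i, a in enumerate(alternatives):
        for j, b in enumerate(alternatives):
            if i == j:
                continue
            C[i][j] = sum(w for w, xa, xb in zip(weights, a, b)
                          if xa >= xb) / total_w
    return C

# Three alternatives scored on three criteria, with criterion weights
C = concordance_matrix([(7, 5, 9), (8, 5, 6), (6, 9, 7)],
                       weights=[0.5, 0.3, 0.2])
```

An outranking relation is then built by thresholding the concordance (and a discordance) matrix; in IFELECTRE the entries become intuitionistic fuzzy values instead of crisp weights.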

Keywords: Decision-Makers (DMs), Multi-Criteria Decision-Making (MCDM), Intuitionistic Fuzzy Elimination Et Choix Traduisant La REalite (IFELECTRE), Intuitionistic Fuzzy Numbers (IFN)

Procedia PDF Downloads 679
4892 Comparison of Urban Regeneration Strategies in Asia and the Development of Neighbourhood Regeneration in Malaysia

Authors: Wan Jiun Tin

Abstract:

Neighbourhood regeneration has gained popularity even though market-led urban redevelopment is still the main strategy in most Asian countries. An area-based approach to neighbourhood regeneration focused on people, place, and system, which covers the main aspects of sustainability, should be studied as part of the solution. Small-scale project implementation that does not depend fully on financial support from the government and main stakeholders is an advantage of neighbourhood regeneration: it enables the improvement and upgrading of living conditions to continue even during economic downturns. In addition, no specific areas need to be singled out for development, as the entire nation shares a similar opportunity to upgrade and improve its neighbourhoods; this is important for narrowing urban income disparities. The objective of this paper is to review and summarize urban regeneration in developed countries, focusing on Korea, Singapore, and Hong Kong, with the aim of determining the direction of sustainable urban regeneration in Malaysia for post-Vision 2020 through the introduction of neighbourhood regeneration. The study is conducted via literature review and observations in the selected countries. In conclusion, neighbourhood regeneration should be one approach to sustainable urban regeneration in Malaysia; a few criteria have been identified and are recommended for adaptation in Malaysia.

Keywords: area-based regeneration, public participation, sustainable urban regeneration, urban redevelopment

Procedia PDF Downloads 277
4891 Service Business Model Canvas: A Boundary Object Operating as a Business Development Tool

Authors: Taru Hakanen, Mervi Murtonen

Abstract:

This study aims to increase understanding of the transition of business models in servitization. The significance of service in all businesses has increased dramatically during the past decades. Service-dominant logic (SDL) describes this change in the economy and questions the goods-dominant logic (GDL) on which business has primarily been based in the past. The business model canvas is one of the most cited and used tools in defining and developing business models. The starting point of this paper lies in the notion that the traditional business model canvas is inherently goods-oriented and best suited to product-based business. However, the basic differences between goods and services necessitate changes in business model representations when proceeding in servitization. Therefore, new knowledge is needed on how the conception of the business model, and the business model canvas as its representation, should be altered in servitized firms in order to better serve business developers and inter-firm co-creation. That is to say, compared to products, services are intangible and are co-produced between the supplier and the customer. Value is always co-created in interaction between a supplier and a customer, and customer experience primarily depends on how well that interaction succeeds. The role of service experience is even stronger in service business than in product business, as services are co-produced with the customer. This paper provides business model developers with a service business model canvas, which takes into account the intangible, interactive, and relational nature of service. The study employs a design science approach that contributes to theory development via design artifacts, utilizing qualitative data gathered in workshops with ten companies from various industries.
In particular, key differences between GDL- and SDL-based business models are identified as an industrial firm proceeds in servitization. As a result of the study, an updated version of the business model canvas based on service-dominant logic is provided. The service business model canvas ensures a stronger customer focus and includes aspects salient for services, such as interaction between companies, service co-production, and customer experience. It can be used for the analysis and development of a company's current service business model or for designing a new business model. It facilitates customer-focused new service design and service development, aids in the identification of development needs, and facilitates the creation of a common view of the business model. Therefore, the service business model canvas can be regarded as a boundary object, which facilitates the creation of a common understanding of the business model among the several actors involved. The study contributes to the business model and service business development disciplines by providing a managerial tool for practitioners in service development. It also provides research insight into how servitization challenges companies' business models.

Keywords: boundary object, business model canvas, managerial tool, service-dominant logic

Procedia PDF Downloads 369
4890 Satisfaction Level of Teachers on the Human Resource Management Practices

Authors: Mark Anthony A. Catiil

Abstract:

Teachers are the principal actors in the delivery of quality education to learners. Unfortunately, over time some of them lose motivation at work, and absenteeism, tardiness, undertime, and non-compliance with school policies are some of the results. There is, therefore, a need to review the school's human resource management practices that contribute to teachers' work satisfaction and motivation. Hence, this study determined the level of satisfaction of teachers with the human resource management practices of Gingoog City Comprehensive National High School. This mixed-methodology research focused on 45 teachers chosen using a stratified random sampling technique. Reliability-tested questionnaires, interviews, and focus group discussions were used to gather the data. Results revealed that the majority of the respondents are female, hold Teacher I positions with MA units, and have served for 11-20 years. Among the school's human resource management practices, the respondents rated recruitment and selection with the lowest satisfaction (mean = 2.15; n = 45), which could mean that most of the school's recruitment and selection practices are not well communicated, disseminated, and implemented. On the other hand, the school's retirement practices were rated with the highest satisfaction (mean = 2.73; n = 45), which could mean that they are communicated, disseminated, implemented, and functional. It was recommended that the existing human resource management practices on recruitment and selection be reviewed to identify deficiencies and possible improvements. Future researchers may also conduct a comparative study of private and public schools in Gingoog City on the same topic.

Keywords: education, human resource management practices, satisfaction, teachers

Procedia PDF Downloads 130
4889 The Application of Modern Technologies in Urban Development

Authors: Solotan A. Tolulope

Abstract:

Owing to weak enforcement of laws, implementers' limited acquaintance with the principles of urban planning, or the absence of laws and of a governmental role, cities have grown beyond their fundamental designs and plans. This has led to a lack of foundations and criteria for achieving a standard of living that meets the need for sufficient housing in urban planning. In this study, we attempted to use cutting-edge innovations and technology to manage and resolve these issues, in collaboration with planning cadres that can significantly and favorably influence urban development. This helps to enhance the role of management and the effectiveness of urban planning and administration. To fulfill the needs of the community and the neighborhoods of these cities, modern approaches and technologies are used that address the criteria of sustainability and development. Global experiences are reviewed in order to put the notion of urban sustainability and development into action.

Keywords: application, modern, technologies, urban, development

Procedia PDF Downloads 112
4888 Growth Curves Genetic Analysis of Native South Caspian Sea Poultry Using Bayesian Statistics

Authors: Jamal Fayazi, Farhad Anoosheh, Mohammad R. Ghorbani, Ali R. Paydar

Abstract:

In this study, 9657 chicks of generations 18, 19, and 20 raised in the Mazandaran breeding center were used to determine the best non-linear regression model describing the growth curve of native poultry. Hens and roosters from this center are distributed across the region south of the Caspian Sea. To estimate the genetic variability of the non-linear regression parameters of the growth traits, Bayesian analysis via Gibbs sampling was used. The average body weights on the first day (BW1), at the eighth week (BW8), and at the twelfth week (BW12) were estimated as 36.05, 763.03, and 1194.98 grams, respectively. Based on the coefficient of determination, mean squared error, and the Akaike information criterion, the Gompertz model was selected as the best growth function. In the Gompertz model, the estimated values of the maturity weight (A), integration constant (B), and maturity rate (K) parameters were 1734.4, 3.986, and 0.282, respectively. The direct heritabilities of BW1, BW8, and BW12 were reported as 0.378, 0.3709, 0.316, 0.389, 0.43, 0.09, and 0.07. With regard to the estimated parameters, the results of this study indicate that it is possible to improve some properties of the growth curve using appropriate selection programs.
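The selected Gompertz curve can be evaluated directly from the reported parameter estimates; the sketch below uses the study's A, B, and K, assuming ages are in weeks (consistent with the reported maturity rate):

```python
import math

def gompertz(t, A, B, K):
    """Gompertz growth curve: expected weight (grams) at age t (weeks).
    A = maturity weight, B = integration constant, K = maturity rate."""
    return A * math.exp(-B * math.exp(-K * t))

# Parameter estimates reported in the abstract
A, B, K = 1734.4, 3.986, 0.282
w0 = gompertz(0, A, B, K)    # hatch weight predicted by the curve
w12 = gompertz(12, A, B, K)  # weight at 12 weeks
```

The curve predicts a hatch weight near the observed BW1 average (36.05 g) and approaches the maturity weight A asymptotically.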

Keywords: direct heritability, Gompertz, growth traits, maturity weight, native poultry

Procedia PDF Downloads 266
4887 A Goal-Oriented Approach for Supporting Input/Output Factor Determination in the Regulation of Brazilian Electricity Transmission

Authors: Bruno de Almeida Vilela, Heinz Ahn, Ana Lúcia Miranda Lopes, Marcelo Azevedo Costa

Abstract:

Benchmarking public utilities such as transmission system operators (TSOs) is one of the main strategies employed by regulators in order to fix monopolistic companies' revenues. Since 2007, the Brazilian regulator has been using Data Envelopment Analysis (DEA) to benchmark TSOs. Despite the application of DEA to improve the transmission sector's efficiency, several problems can be pointed out: the high price of electricity in Brazil, the limitation of the benchmarking to operational expenses (OPEX) only, the absence of variables that represent the outcomes of the transmission service, and the presence of extremely low and high efficiencies. As an alternative to the benchmarking concept the Brazilian regulator currently uses, we propose a goal-oriented approach. Our proposal supports input/output selection by taking traditional organizational goals and measures as the basis for selecting factors for benchmarking purposes. Its main advantage is that it resolves the classical DEA problems of input/output selection and of undesirable and dual-role factors. We also provide a demonstration of our goal-oriented concept with respect to service quality. As a result, most TSOs' efficiencies in Brazil might improve when quality is treated as important in their efficiency estimation.

Keywords: decision making, goal-oriented benchmarking, input/output factor determination, TSO regulation

Procedia PDF Downloads 197
4886 Microscopic Simulation of Toll Plaza Safety and Operations

Authors: Bekir O. Bartin, Kaan Ozbay, Sandeep Mudigonda, Hong Yang

Abstract:

The use of microscopic traffic simulation in evaluating the operational and safety conditions at toll plazas is demonstrated. Two toll plazas in New Jersey were selected as case studies and were modeled and validated in the Paramics traffic simulation software. To simulate drivers' lane selection behavior in Paramics, a utility-based lane selection approach was implemented through the Paramics Application Programming Interface (API). For each vehicle approaching the toll plaza, a utility value is assigned to each toll lane, taking into account the factors likely to influence drivers' lane selection behavior, such as the approach lane, the exit lane, and queue lengths. The results demonstrate that operational conditions similar to those observed, such as lane-by-lane toll plaza traffic volumes, can be attained using this approach. In addition, safety at the toll plazas is assessed via a surrogate safety measure: the crash index (CI), an improved surrogate measure based on time-to-collision (TTC) that reflects crash severity, is used in the simulation analyses. The results indicate that the spatial and temporal frequency of observed crashes can be reproduced using the proposed methodology. Further analyses can be conducted to evaluate and compare various operational decisions and safety measures using microscopic simulation models.
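The time-to-collision measure underlying the crash index can be sketched as follows; the crash index's severity weighting itself is not reproduced here, and the speeds, gap, and threshold below are hypothetical:

```python
def time_to_collision(gap_m, v_follower, v_leader):
    """Time-to-collision (seconds) between a follower and its leader;
    infinite when the follower is not closing the gap."""
    closing_speed = v_follower - v_leader  # m/s
    return gap_m / closing_speed if closing_speed > 0 else float("inf")

def is_conflict(gap_m, v_follower, v_leader, threshold_s=1.5):
    """Flag a surrogate-safety conflict when TTC drops below a threshold."""
    return time_to_collision(gap_m, v_follower, v_leader) < threshold_s

# Follower approaching a slowing vehicle in a toll lane (m, m/s)
ttc = time_to_collision(gap_m=10.0, v_follower=12.0, v_leader=4.0)
```

Counting such sub-threshold events per location and time window is how the simulated conflict frequencies are compared against observed crash patterns.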

Keywords: microscopic simulation, toll plaza, surrogate safety, application programming interface

Procedia PDF Downloads 183
4885 Breast Cancer Survivability Prediction via Classifier Ensemble

Authors: Mohamed Al-Badrashiny, Abdelghani Bellaachia

Abstract:

This paper presents a classifier ensemble approach for predicting the survivability of breast cancer patients using the latest database version of the Surveillance, Epidemiology, and End Results (SEER) Program of the National Cancer Institute. The system consists of two main components: a feature selection component and a classifier ensemble component. The feature selection component divides the features in the SEER database into four groups and then finds the most important features among the four groups, i.e., those that maximize the weighted average F-score of a given classification algorithm. The ensemble component uses three different classifiers, each of which models a different set of features selected from SEER through the feature selection module. On top of them, another classifier gives the final decision based on the output decisions and confidence scores of the underlying classifiers. Different classification algorithms have been examined; the best setup found uses decision tree, Bayesian network, and Naïve Bayes algorithms for the underlying classifiers and Naïve Bayes for the classifier ensemble step. The system outperforms all systems published to date when evaluated against the exact same SEER data (period of 1973-2002). It gives an 87.39% weighted average F-score, compared to 85.82% and 81.34% for the other published systems. When the data size is increased to cover the whole database (period of 1973-2014), the overall weighted average F-score rises to 92.4% on the held-out unseen test set.
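The stacked architecture described above can be sketched with scikit-learn. Note the differences from the paper: this toy version runs on synthetic data, trains both base learners on a shared feature set rather than separate SEER feature groups, and omits the Bayesian-network base learner, which scikit-learn does not provide:

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import StackingClassifier
from sklearn.metrics import f1_score
from sklearn.model_selection import train_test_split
from sklearn.naive_bayes import GaussianNB
from sklearn.tree import DecisionTreeClassifier

# toy stand-in for SEER survivability records
X, y = make_classification(n_samples=1000, n_features=20, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

# base learners stacked under a second-level naive Bayes that sees their
# class-probability outputs (the "confidence scores" of the abstract)
ensemble = StackingClassifier(
    estimators=[("tree", DecisionTreeClassifier(max_depth=5, random_state=0)),
                ("nb", GaussianNB())],
    final_estimator=GaussianNB(),
    stack_method="predict_proba")
ensemble.fit(X_tr, y_tr)
print(f1_score(y_te, ensemble.predict(X_te), average="weighted"))
```

The weighted average F-score printed here is the same evaluation metric the paper optimizes during feature selection.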

Keywords: classifier ensemble, breast cancer survivability, data mining, SEER

Procedia PDF Downloads 329
4884 The Eye Tracking Technique and the Study of Some Abstract Mathematical Concepts at the University

Authors: Tamara Díaz-Chang, Elizabeth-H Arredondo

Abstract:

This article presents the results of mixed-methods research in which the ocular movements of students were examined while they solved questionnaires related to some abstract mathematical concepts. The objective of this research is to determine possible correlations between parameters of ocular activity and the level of difficulty of the tasks. The difficulty-level categories were established based on two types of criteria: a subjective one, through an evaluation carried out by the subjects, and a behavioral one, related to obtaining the correct solution. Correlations of these criteria with ocular activity parameters, which were considered indicators of mental effort, were identified. The analysis of the data obtained allowed us to observe discrepancies between the difficulty levels assigned by the subjective and behavioral criteria. There was a negative correlation between the eye movement parameters and the students' opinions of the difficulty of the questions, while a strong, significant positive correlation was observed between most parameters of ocular activity and the level of difficulty determined by the percentage of correct answers. These results suggest that eye movement parameters can serve as indicators of the difficulty of tasks related to the study of some abstract mathematical concepts at the university.
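The correlation analysis described can be sketched as follows; the per-task numbers are invented to mirror the reported pattern (negative with subjective ratings, positive with the behavioural criterion), and Spearman's rank correlation is one reasonable choice for such ordinal data:

```python
import numpy as np
from scipy.stats import spearmanr

# hypothetical per-task values: mean fixation duration (ms) as the ocular
# parameter, a 1-5 subjective rating, and percent of wrong answers as the
# behavioural criterion
fixation = np.array([210, 260, 305, 350, 420, 470])
rated    = np.array([4, 3, 4, 2, 3, 2])        # subjective difficulty
errors   = np.array([10, 18, 25, 40, 55, 70])  # behavioural difficulty

for name, criterion in (("subjective", rated), ("behavioural", errors)):
    rho, p = spearmanr(fixation, criterion)
    print(f"fixation vs {name} difficulty: rho={rho:+.2f} (p={p:.3f})")
```

With the invented data, the sign of rho flips between the two criteria, which is exactly the discrepancy the study reports.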

Keywords: abstract mathematical concepts, cognitive neuroscience, eye-tracking, university education

Procedia PDF Downloads 120
4883 A Multi-Criteria Decision Making Approach for Disassembly-To-Order Systems under Uncertainty

Authors: Ammar Y. Alqahtani

Abstract:

In order to minimize the negative impact on the environment, it is essential to properly manage the waste generated by the premature disposal of end-of-life (EOL) products. Consequently, governments and international organizations have introduced new policies and regulations to minimize the amount of waste sent to landfills. Moreover, consumers’ environmental awareness has forced original equipment manufacturers to become more environmentally conscious. Manufacturers have therefore considered different ways of dealing with the waste generated by EOL products, viz., remanufacturing, reusing, recycling, or disposing of them. When EOL products are remanufactured, reused, or recycled, manufacturers reduce both the rate of depletion of virgin natural resources and their dependency on those resources, as well as the amount of harmful waste sent to landfills. Disposal of EOL products, by contrast, contributes to the problem and is therefore used only as a last resort. The number of EOL products needed must be estimated in order to fulfill the demand for components; a disassembly process is then performed to extract individual components and subassemblies. Smart products, built with embedded sensors and network connectivity that enable the collection and exchange of data, utilize sensors implanted into products during production. Remanufacturers use these sensors to predict the optimal warranty policy and period that should be offered to customers who purchase remanufactured components and products. Sensor-provided data can help to evaluate the overall condition of a product, as well as the remaining lives of its components, before a disassembly process is performed. In this paper, a multi-period disassembly-to-order (DTO) model is developed that takes the different system uncertainties into consideration. The DTO model is solved using Nonlinear Programming (NLP) over multiple periods.
A DTO system is considered in which a variety of EOL products are purchased for disassembly. The model’s main objective is to determine the best combination of EOL products to be purchased from every supplier in each period that maximizes the total profit of the system while satisfying the demand. This paper also addresses the impact of sensor-embedded products on the cost of warranties. Lastly, a case study involving various simulation conditions is presented and analyzed to illustrate the applicability of the model.
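While the paper's multi-period model is solved with NLP, the purchase-planning core of a DTO system can be illustrated with a single-period linear sketch. All figures below (yields, costs, prices, demand, supplier availability) are hypothetical:

```python
import numpy as np
from scipy.optimize import linprog

# hypothetical single-period data: 3 EOL product types, 2 component types
yields     = np.array([[2, 1],    # components recovered per unit of product 0
                       [1, 3],    # ... per unit of product 1
                       [0, 2]])   # ... per unit of product 2
unit_cost  = np.array([12.0, 15.0, 8.0])  # purchase + disassembly cost
comp_price = np.array([9.0, 6.0])         # revenue per recovered component
demand     = np.array([40, 60])           # component demand to satisfy
supply     = np.array([30, 25, 50])       # supplier availability per product

# maximize revenue - cost  <=>  minimize cost - revenue
c = unit_cost - yields @ comp_price
# demand: yields^T q >= demand, rewritten as -yields^T q <= -demand
res = linprog(c, A_ub=-yields.T, b_ub=-demand,
              bounds=list(zip(np.zeros(3), supply)))
quantities = res.x
profit = comp_price @ (yields.T @ quantities) - unit_cost @ quantities
print(quantities.round(1), round(profit, 2))
```

The sketch assumes components recovered beyond demand can still be sold; the full model would add multiple periods, uncertainty, and nonlinear terms, hence the NLP formulation.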

Keywords: closed-loop supply chains, environmentally conscious manufacturing, product recovery, reverse logistics

Procedia PDF Downloads 138
4882 An Echo of Eco: Investigating the Effectiveness of Eco-Friendly Advertising Media of Fashion Brand Communication

Authors: Vaishali Joshi

Abstract:

In the past, companies and buyers operated as if there were an infinite supply of natural resources, which has resulted in the loss of our globe's natural ecosystem. People's consciousness of ecological concerns has increased, paving the way for a green revolution whose objective is to discontinue the use of products that are harmful to the earth's ecosystem. This green revolution has made consumers turn toward companies that provide eco-friendly products and services through less eco-harmful means. Studies show that companies have started gaining market reputation through eco-friendly business activities. Hence, companies should be alert to understanding consumers' environmentally friendly consumption behavior in order to survive and remain competitive. Green marketing efforts guarantee beneficial exchanges without harmful consequences for current and/or future generations. This bears on the green policies of companies that claim environmental concern: such companies must focus not only on the impact of their production and products on the ecosystem but also on every small activity in their value chain. One of the most overlooked parts of the value chain is the medium through which products and services are marketed. These companies should also take into account the degree to which their selection of advertising media affects the earth's ecosystem. In this study, a hypothetical fashion apparel brand known as "Dolphin" will be studied. In particular, the following objectives are framed: i) to study the brand attitude toward the given fashion brand due to its selection of an eco-friendly advertising medium, ii) to study the advertisement attitude toward the given fashion brand due to its selection of an eco-friendly advertising medium, and iii) to study the purchase intention toward the given fashion brand due to its selection of an eco-friendly advertising medium.
An online experiment will be conducted. Respondents between the ages of 20 and 64 years will be selected randomly from an online consumer panel database. The findings of this study will have a great impact on companies claiming environmental concern by showing how advertising media affect a company’s brand image in the long run.

Keywords: eco-friendly advertising media, fashion, attitude, purchase intention

Procedia PDF Downloads 100
4881 Can Exams Be Shortened? Using a New Empirical Approach to Test in Finance Courses

Authors: Eric S. Lee, Connie Bygrave, Jordan Mahar, Naina Garg, Suzanne Cottreau

Abstract:

Marking exams is universally detested by lecturers. Final exams in many higher education courses often last 3.0 hrs. Do exams really need to be so long? Can we justifiably reduce the number of questions on them? Surprisingly few have researched these questions, arguably because of the complexity and difficulty of using traditional methods. To answer these questions empirically, we used a new approach based on three key elements: Use of an unusual variation of a true experimental design, equivalence hypothesis testing, and an expanded set of six psychometric criteria to be met by any shortened exam if it is to replace a current 3.0-hr exam (reliability, validity, justifiability, number of exam questions, correspondence, and equivalence). We compared student performance on each official 3.0-hr exam with that on five shortened exams having proportionately fewer questions (2.5, 2.0, 1.5, 1.0, and 0.5 hours) in a series of four experiments conducted in two classes in each of two finance courses (224 students in total). We found strong evidence that, in these courses, shortening of final exams to 2.0 hrs was warranted on all six psychometric criteria. Shortening these exams by one hour should result in a substantial one-third reduction in lecturer time and effort spent marking, lower student stress, and more time for students to prepare for other exams. Our approach provides a relatively simple, easy-to-use methodology that lecturers can use to examine the effect of shortening their own exams.
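Equivalence hypothesis testing, one of the three key elements above, can be sketched as a two one-sided-tests (TOST) procedure. The exam scores and the ±10-mark margin below are illustrative, not the study's data:

```python
import numpy as np
from scipy import stats

def tost(a, b, margin, alpha=0.05):
    """Two one-sided t-tests (TOST): are the group means equivalent to
    within +/- margin? Uses a simple pooled-df approximation."""
    a, b = np.asarray(a, float), np.asarray(b, float)
    se = np.sqrt(a.var(ddof=1) / len(a) + b.var(ddof=1) / len(b))
    df = len(a) + len(b) - 2
    d = a.mean() - b.mean()
    p_lower = 1 - stats.t.cdf((d + margin) / se, df)  # H0: d <= -margin
    p_upper = stats.t.cdf((d - margin) / se, df)      # H0: d >=  margin
    p = float(max(p_lower, p_upper))
    return p, p < alpha   # equivalent only if BOTH one-sided tests reject

# illustrative marks on a 3.0-hr exam and on its shortened version
full  = np.array([62, 71, 55, 80, 67, 74, 59, 69, 77, 66], float)
short = np.array([64, 70, 58, 78, 69, 72, 61, 70, 75, 68], float)
p, equivalent = tost(full, short, margin=10.0)   # tolerate +/- 10 marks
print(round(p, 4), equivalent)
```

Unlike a conventional t-test, rejecting both one-sided nulls here is positive evidence that the shortened exam performs the same as the full one within the chosen margin.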

Keywords: exam length, psychometric criteria, synthetic experimental designs, test length

Procedia PDF Downloads 272
4880 Removal of Acetaminophen with Chitosan-Nano Activated Carbon Beads from Aqueous Sources

Authors: Parisa Amouzgar, Chan Eng Seng, Babak Salamatinia

Abstract:

Pharmaceutical products are being detected in the environment with increasing frequency. However, conventional treatment systems do not adequately eliminate pharmaceutical drugs, and there is still no regulated standard limiting them in water. Pharmaceuticals have been present in water for decades, but only recently have their levels in the environment been recognized and quantified as potentially hazardous to ecosystems. In this study, beads of chitosan with a bio-based nano activated carbon (Ct-NAC) were produced using the extrusion dripping method and investigated for acetaminophen removal from water. The effects of the beading parameters, such as the dripping flow rate, the distance from the dripping tip to the solution surface, the concentration of chitosan, and the percentage of NAC, were analyzed to find the optimum condition. Based on the results, the overall adsorption rate and removal efficiency increased over time until equilibrium, at which 80% of the acetaminophen was removed. The maximum adsorption belonged to the beads with 1.75% chitosan and 60% NAC, at a flow rate of 1.5 ml/min and a dripping distance of 22.5 cm.
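The two quantities tracked in batch adsorption studies like this one, percent removal and adsorbed amount per gram of beads, reduce to the simple calculations below; the concentrations are illustrative, chosen to reproduce the ~80% equilibrium removal reported:

```python
def removal_efficiency(c0, ct):
    """Percent of solute removed, from initial and residual concentrations (mg/L)."""
    return 100.0 * (c0 - ct) / c0

def adsorption_capacity(c0, ct, volume_l, mass_g):
    """Adsorbed amount q_t, in mg of acetaminophen per g of Ct-NAC beads."""
    return (c0 - ct) * volume_l / mass_g

c0, ct = 10.0, 2.0   # mg/L before and after treatment (hypothetical run)
print(removal_efficiency(c0, ct))                               # -> 80.0
print(adsorption_capacity(c0, ct, volume_l=0.5, mass_g=0.25))   # -> 16.0
```

Plotting removal_efficiency over sampling times gives the approach-to-equilibrium curve described in the abstract.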

Keywords: pharmaceuticals, water treatment, chitosan nano activated carbon beads, acetaminophen

Procedia PDF Downloads 357
4879 Aspects of the Detail Design of an Automated Biomethane Test

Authors: Ilias Katsanis, Paraskevas Papanikos, Nikolas Zacharopoulos, Vassilis C. Moulianitis, Evgenios Scourboutis, Diamantis T. Panagiotarakos

Abstract:

This paper presents aspects of the detailed design of an automated biomethane potential (BMP) measurement system using CAD techniques. First, the design specifications, grouped into eight sets and used to generate the design alternatives, are briefly presented. Then, the major components of the final concept, as well as the design of the test, are presented. The material selection process is carried out using the ANSYS EduPack database software. The mechanical behavior of one component, developed in Creo v.5, is evaluated using finite element analysis. Finally, aspects of the software development that integrates the BMP test are presented. This paper shows the advantages of CAD techniques in product design, applied here to the design of a mechatronic product.

Keywords: automated biomethane test, detail mechatronics design, materials selection, mechanical analysis

Procedia PDF Downloads 89
4878 The Application of Enzymes on Pharmaceutical Products and Process Development

Authors: Reginald Anyanwu

Abstract:

Enzymes are biological molecules that regulate the rate of almost all chemical reactions that take place within cells, and they have been widely used for product innovation. They are vital for life and serve a wide range of important functions in the body, such as aiding digestion and metabolism. The present study aimed to find out the extent to which these biological molecules have been utilized by the pharmaceutical, food and beverage, and biofuel industries in commercial and scale-up applications. Taking into account the escalating business opportunities in this vertical, biotech firms have also been penetrating the enzymes industry, especially in food. The aim of the study, therefore, was to find out how biocatalysis can be successfully deployed and how enzyme application can improve industrial processes. To achieve the purpose of the study, the researcher focused on the analytical tools that are critical for the scale-up implementation of enzyme immobilization, in order to ascertain the extent of increased product yield at minimum logistical burden and maximum market profitability for the environment and user. The researcher collected data from four pharmaceutical companies located in Anambra and Imo states of Nigeria, to which questionnaire items were distributed. The researcher equally made personal observations on the applicability of these biological molecules to innovative products, given the shifting trend toward the consumption of healthy, quality food. In conclusion, it was discovered that enzymes have been widely used for product innovation, although there are variations in their applications. It was also found that pivotal contenders in the enzymes market have lately been making heavy investments in the development of innovative product solutions. It was recommended that the application of enzymes to innovative products be practiced more widely.

Keywords: enzymes, pharmaceuticals, process development, quality food consumption, scale-up applications

Procedia PDF Downloads 142
4877 Measuring the Level of Knowledge of Construction Contracts Procedures: A Case Study of Botswana

Authors: Babulayi B. Wilson

Abstract:

Unsatisfactory performance of construction projects in both industrialised and developing countries indicates that there could be defects in several phases of construction projects. Notwithstanding the fact that some project defects are conceived at the initiation phase, insufficient knowledge of contract procedures has been identified as one of the major sources of construction disputes. Contract procedures are a set of rules that outline the primary obligations and liabilities of the parties involved in the implementation of a construction project. Engineering professional bodies often codify contract procedures into standard forms of contract, such as those of the Institution of Civil Engineers (ICE, UK) and the Association of Consulting Engineers (ACE, UK), and keep them under constant review, updating clauses to reflect changes in case law or relevant legislation. Even so, it is the responsibility of a professional body, or of the drafter of the conditions of contract, to introduce contract-specific clauses that may be necessary for business efficacy but are not covered in the chosen standard conditions of contract. In Botswana, the use of client-drafted international forms of contract, or of forms not adapted to the environment of use, in conjunction with client-drafted pricing schedules, is common. The latter often impacts negatively upon contractors’ claims and payments, in that tender rates and prices can only be deemed sufficient if the chosen conditions of contract complement the pricing schedule (i.e., if standardised procurement documents are used). In addition, client-drafted and borrowed forms of contract such as FIDIC often conflict with the law of the domicile, resulting in costly disputes on the part of the client.
Against this background, the objective of the research is to measure the level of knowledge of contract procedures amongst key stakeholders in the Botswana construction industry by asking a representative sample from the industry and academia to respond to tutorial questions prepared from two commonly used forms of contract for civil works, namely FIDIC (an international form of contract) and ICE (UK). The questions were prepared under the following captions: (a) preparation of tender documents, (b) obligations of the parties, (c) contract administration, and (d) claims, variations, and valuation of variations. After ascertaining that the level of knowledge of contract procedures is insufficient among most practitioners in the Botswana construction industry, major procurement entities, and engineering institutions of learning, a guide to drafting the conditions of a construction contract was developed and then validated through seminars and workshops. At present, the effectiveness of the guide has not yet been measured, but feedback from the seminars and workshops conducted indicates an appreciation of the guide by the majority of major construction industry stakeholders.

Keywords: contract procedures, conditions of contract, professional practice, construction law, forms of contract

Procedia PDF Downloads 196
4876 Transition Pathways of Commercial-Urban Fleet Electrification

Authors: Emily Gould, Walter Wehremeyer, David Greaves, Rodney Turtle

Abstract:

This paper considers current thinking on the pathway for electric vehicles, identifying the development blocks of alternative innovations within the market and analysing technological lock-in. The relationship between transition pathways and technological lock-in is largely under-researched, particularly in the field of e-mobility. This paper is based on a study of three commercial-urban fleets that examines strategic decisions in new technology adoption alongside vehicle procurement and driver perspectives. The paper analyses each fleet’s decision matrix for electric vehicles and seeks to understand the influence of company culture, strategy, and technology applicability within the context of transition pathways.

Keywords: electric vehicles, fleets, path dependencies, transition pathways

Procedia PDF Downloads 568
4875 Integrating of Multi-Criteria Decision Making and Spatial Data Warehouse in Geographic Information System

Authors: Zohra Mekranfar, Ahmed Saidi, Abdellah Mebrek

Abstract:

This work aims to develop multi-criteria decision making (MCDM) and spatial data warehouse (SDW) methods that will be integrated into a GIS according to a ‘GIS dominant’ approach, so that the GIS operating tools can operate the SDW. MCDM methods can provide many solutions to problems with various and multiple criteria. When a problem is so complex that it involves a spatial dimension, it makes sense to combine the MCDM process with other approaches such as data mining and ascending analyses. In this paper, we present an experiment showing a geo-decisional methodology of SDW construction. On-line analytical processing (OLAP) technology, which combines basic multidimensional analysis with concepts from data mining, provides powerful tools to highlight inductions and information that are not obvious with traditional tools. However, these OLAP tools become more complex in the presence of the spatial dimension. The integration of OLAP with a GIS is the future of geographic and spatial information solutions. GIS offers advanced functions for the acquisition, storage, analysis, and display of geographic information. However, its effectiveness for complex spatial analysis is questionable due to its determinism and decisional rigor. A prerequisite for the implementation of any analysis or exploration of spatial data is the construction and structuring of a spatial data warehouse (SDW). This SDW must be easily usable by the GIS and by the tools offered by an OLAP system.
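As a concrete MCDM building block of the kind such a geo-decisional system might call on, here is a small TOPSIS sketch; TOPSIS is one common MCDM method, not necessarily the one used here, and the candidate-site scores and weights are invented:

```python
import numpy as np

def topsis(decision, weights, benefit):
    """Rank alternatives with TOPSIS.
    decision: (alternatives x criteria); benefit[j] True if larger is better."""
    m = decision / np.linalg.norm(decision, axis=0)   # vector-normalize criteria
    v = m * weights
    ideal = np.where(benefit, v.max(axis=0), v.min(axis=0))
    worst = np.where(benefit, v.min(axis=0), v.max(axis=0))
    d_best = np.linalg.norm(v - ideal, axis=1)
    d_worst = np.linalg.norm(v - worst, axis=1)
    return d_worst / (d_best + d_worst)               # closeness to the ideal

# hypothetical candidate sites scored on slope (a cost criterion) plus
# accessibility and land suitability (benefits), extracted from GIS layers
sites = np.array([[12.0, 7.0, 0.8],
                  [5.0, 4.0, 0.9],
                  [8.0, 9.0, 0.6]])
scores = topsis(sites, weights=np.array([0.5, 0.3, 0.2]),
                benefit=np.array([False, True, True]))
print(scores.round(3), "best site:", int(scores.argmax()))
```

In a 'GIS dominant' architecture, the decision matrix would be assembled from SDW/OLAP queries over the spatial layers before the MCDM step ranks the alternatives.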

Keywords: data warehouse, GIS, MCDM, SOLAP

Procedia PDF Downloads 178
4874 Effect of Low Level Laser Therapy versus Polarized Light Therapy on Oral Mucositis in Cancer Patients Receiving Chemotherapy

Authors: Andrew Anis Fakhrey Mosaad

Abstract:

The goal of this study is to compare the efficacy of polarised light therapy with low-level laser therapy in treating oral mucositis brought on by chemotherapy in cancer patients. The evaluation procedures were measurement on the WHO oral mucositis scale and the common toxicity criteria scale. Techniques: Forty cancer patients (men and women) receiving chemotherapy, aged 30 to 55 years, who had oral mucositis, ulceration, and discomfort, were divided into two groups. Twenty patients in group (A) received low-level laser therapy (LLLT) along with their regular oral mucositis medication, while twenty patients in group (B) received Bioptron light therapy (BLT) along with their regular oral mucositis medication. Both treatments were applied for 10 minutes each day for 30 days. Results and conclusion: This study showed that oral mucositis in cancer patients following chemotherapy greatly improved with both BLT and LLLT, as seen in the sharp falls on both the WHO oral mucositis scale (OMS) and the common toxicity criteria scale (CTCS). However, low-level laser therapy (LLLT) was superior to Bioptron light therapy (BLT) in terms of benefits.

Keywords: Bioptron light therapy, low level laser therapy, oral mucositis, WHO oral mucositis scale, common toxicity criteria scale

Procedia PDF Downloads 246