Search results for: decision quality.
2287 Incidence of Fungal Infections and Mycotoxicosis in Pork Meat and Pork By-Products in Egyptian Markets
Authors: Ashraf S. Hakim, Randa M. Alarousy
Abstract:
The consumption of food contaminated with molds (microscopic filamentous fungi) and their toxic metabolites results in the development of food-borne mycotoxicosis. The spores of molds are ubiquitously spread in the environment and can be detected everywhere. Ochratoxin A is a toxic and potentially carcinogenic fungal toxin found in a variety of food commodities. In this study, the mycological quality of various ready-to-eat local and imported pork meat and meat by-products sold in Egyptian markets was assessed, and the presence of various molds was determined in pork used as a raw material, in edible organs such as liver and kidney, as well as in fermented raw meat by-products. The study assessed the mycological quality of pork raw meat and its by-products sold in commercial shops in Cairo, Egypt. Mycological analysis was conducted on 110 samples (n = 110), which included pigs' livers and kidneys from the Egyptian Bassatin slaughterhouse as well as local and imported processed pork meat by-products from Egyptian pork markets. The isolates were identified using traditional mycological and biochemical tests. All kidney and liver samples were positive for mold growth, while all by-products were negative. Ochratoxin A levels were quantified by high-performance liquid chromatography (HPLC); the highest level was found in kidney (7.51 parts per billion, ppb), followed by minced meat (6.19 ppb), and in general the local samples showed higher levels than the imported ones. To the best of our knowledge, this is the first report on mycotoxin detection and quantification from pork by-products in Egypt.
Keywords: Egypt, imported pork by-products, local, mycotoxins.
2286 Evaluating and Selecting Optimization Software Packages: A Framework for Business Applications
Authors: Waleed Abohamad, Amr Arisha
Abstract:
Owing to the fact that optimization of business processes is a crucial requirement to navigate, survive and even thrive in today's volatile business environment, this paper presents a framework for selecting a best-fit optimization package for solving complex business problems. The complexity level of the problem and/or the use of unsuitable optimization software can lead to biased solutions of the optimization problem. Accordingly, the proposed framework identifies a number of relevant factors (e.g. decision variables, objective functions, and modeling approach) to be considered during the evaluation and selection process. Application domain, problem specifications, and available accredited optimization approaches are also taken into account. The output of the framework is a recommendation of one or two optimization packages believed to provide the best results for the underlying problem. In addition, a set of guidelines and recommendations on how managers can conduct an effective optimization exercise is discussed.
Keywords: Complex Business Problems, Optimization, Selection Criteria, Software Evaluation.
2285 Decision Making about the Environmental Management Implementation – Incentives and Expectations
Authors: Eva Štěpánková
Abstract:
Environmental management implementation is presently one of the ways to improve organizational success and value. An organization's growing motivation to introduce environmental measures is driven primarily by rising societal pressure, which generates various incentives to strive for improved environmental performance. The aim of the paper is to identify and characterize the key incentives and expectations leading organizations to environmental management implementation. The author focuses on five businesses of different sizes and fields operating in the Czech Republic. A qualitative approach and the grounded theory procedure are used in the research. The results point out that the significant incentives for environmental management implementation are primarily customer demands, the opportunity to declare environmental commitment, and image improvement. The researched enterprises less commonly expect an economic contribution, increased competitive advantage, or export rate improvement. The results show that marketing contributions are the ones primarily expected from environmental management implementation.
Keywords: Environmental management, environmental management systems, ISO 14001.
2284 Arriving at an Optimum Value of Tolerance Factor for Compressing Medical Images
Authors: Sumathi Poobal, G. Ravindran
Abstract:
Medical imaging takes advantage of digital technology in imaging and teleradiology. In teleradiology systems, a large amount of data is acquired, stored and transmitted. A major technology that may help to solve the problems associated with massive data storage and data transfer capacity is data compression and decompression. There are many methods of image compression available. They are classified as lossless and lossy compression methods. In lossy compression methods, the decompressed image contains some distortion. Fractal image compression (FIC) is a lossy compression method. In fractal image compression, an image is coded as a set of contractive transformations in a complete metric space. The set of contractive transformations is guaranteed to produce an approximation to the original image. In this paper, FIC is achieved by partitioned iterated function systems (PIFS) using quadtree partitioning. PIFS is applied to different modalities such as ultrasound, CT scan, angiogram, X-ray, and mammogram images. In each modality, approximately twenty images are considered and the average values of compression ratio and PSNR are obtained. In this method of fractal encoding, the tolerance factor Tmax is varied from 1 to 10, keeping the other standard parameters constant. For all modalities of images, the compression ratio and Peak Signal to Noise Ratio (PSNR) are computed and studied. The quality of the decompressed image is assessed by PSNR values. From the results it is observed that the compression ratio increases with the tolerance factor, and mammograms have the highest compression ratio. The quality of the image is not degraded up to an optimum value of the tolerance factor, Tmax, equal to 8, because of the properties of fractal compression.
Keywords: Fractal image compression, IFS, PIFS, PSNR, Quadtree partitioning.
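As an illustrative aside (not from the paper itself), the two quality measures reported above can be computed as follows; the image arrays and byte counts are hypothetical placeholders.

```python
import numpy as np

def psnr(original, decompressed, max_value=255.0):
    """Peak Signal-to-Noise Ratio in dB between two 8-bit images."""
    mse = np.mean((original.astype(np.float64) - decompressed.astype(np.float64)) ** 2)
    if mse == 0:
        return float("inf")          # identical images
    return 10.0 * np.log10(max_value ** 2 / mse)

def compression_ratio(original_bytes, compressed_bytes):
    """Uncompressed size divided by compressed size."""
    return original_bytes / compressed_bytes

# Hypothetical example: a 256x256 8-bit image and a slightly distorted reconstruction.
rng = np.random.default_rng(0)
original = rng.integers(0, 256, size=(256, 256), dtype=np.uint8)
decoded = np.clip(original.astype(np.int16) + rng.integers(-5, 6, size=(256, 256)), 0, 255)

print("PSNR =", round(psnr(original, decoded), 2), "dB")
print("CR   =", round(compression_ratio(256 * 256, 4096), 2))   # e.g. 64 KiB image -> 4 KiB fractal code
```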
2283 Tuberculosis Modelling Using Bio-PEPA Approach
Authors: Dalila Hamami, Baghdad Atmani
Abstract:
Modelling is a widely used tool to facilitate the evaluation of disease management. The interest of epidemiological models lies in their ability to explore hypothetical scenarios and provide decision makers with evidence to anticipate the consequences of disease incursion and the impact of intervention strategies.
All models are, by nature, simplifications of more complex systems. Models that involve diseases can be classified into different categories depending on how they treat the variability, time, space, and structure of the population. Approaches range from simple deterministic mathematical models to complex, spatially explicit stochastic simulations.
Thus, epidemiological modelling is now a necessity for epidemiological investigations, surveillance, testing hypotheses and generating the follow-up activities necessary to perform a complete and appropriate analysis.
The state of the art presented in the following allows us to position ourselves with respect to the most appropriate approaches for the epidemiological study.
Keywords: Bio-PEPA, Cellular automata, Epidemiological modelling, multi agent system, ordinary differential equations, PEPA, Process Algebra, Tuberculosis.
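The abstract lists ordinary differential equations among the candidate modelling approaches; purely for orientation, a minimal compartmental (SIR-type) sketch with entirely hypothetical parameter values is shown below. It is not the authors' Bio-PEPA model.

```python
import numpy as np
from scipy.integrate import odeint

def sir(y, t, beta, gamma):
    """Basic SIR dynamics: dS/dt, dI/dt, dR/dt as fractions of the population."""
    s, i, r = y
    ds = -beta * s * i
    di = beta * s * i - gamma * i
    dr = gamma * i
    return [ds, di, dr]

beta, gamma = 0.3, 0.1          # hypothetical transmission and recovery rates
y0 = [0.99, 0.01, 0.0]          # initial susceptible / infectious / recovered fractions
t = np.linspace(0, 160, 161)    # days

s, i, r = odeint(sir, y0, t, args=(beta, gamma)).T
print("Peak infectious fraction: %.3f on day %d" % (i.max(), t[i.argmax()]))
```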
2282 Effective Dose and Size Specific Dose Estimation with and without Tube Current Modulation for Thoracic Computed Tomography Examinations: A Phantom Study
Authors: S. Gharbi, S. Labidi, M. Mars, M. Chelli, F. Ladeb
Abstract:
The purpose of this study is to reduce the radiation dose for chest CT examinations by including Tube Current Modulation (TCM) in a standard CT protocol. A scan of an anthropomorphic male Alderson phantom was performed on a 128-slice scanner. The effective dose (ED) in both scans, with and without mAs modulation, was estimated by multiplying the Dose Length Product (DLP) by a conversion factor. Results were compared to those obtained with the CT-Expo software. The size specific dose estimation (SSDE) values were obtained by multiplying the volume CT dose index (CTDIvol) by a size conversion factor related to the phantom's effective diameter. Objective assessment of image quality was performed with Signal to Noise Ratio (SNR) measurements in the phantom. SPSS software was used for data analysis. Results showed that, with CARE Dose 4D included, ED was lowered by 48.35% and 51.51% according to the DLP method and CT-Expo, respectively. In addition, ED ranges between 7.01 mSv and 6.6 mSv for the standard protocol, while it ranges between 3.62 mSv and 3.2 mSv with TCM. Similar results were found for SSDE: the dose was higher without TCM, at 16.25 mGy, and was lower by 48.8% with TCM. The SNR values calculated were significantly different (p=0.03<0.05). The highest SNR was measured on images acquired with TCM and reconstructed with filtered back projection (FBP). In conclusion, this study demonstrates the potential of the TCM technique to reduce SSDE and ED while conserving image quality with a high diagnostic reference level for thoracic CT examinations.
Keywords: Anthropomorphic phantom, computed tomography, CT-expo, radiation dose.
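A worked illustration of the two dose formulas described in the abstract (ED = DLP x k; SSDE = CTDIvol x f); the numeric inputs and conversion factors below are placeholders, not the study's measurements.

```python
def effective_dose(dlp_mgy_cm, k_msv_per_mgy_cm):
    """ED (mSv) = Dose Length Product (mGy*cm) x anatomical conversion factor k."""
    return dlp_mgy_cm * k_msv_per_mgy_cm

def ssde(ctdi_vol_mgy, size_factor):
    """SSDE (mGy) = CTDIvol (mGy) x size conversion factor for the effective diameter."""
    return ctdi_vol_mgy * size_factor

# Hypothetical chest acquisition (all values assumed for illustration only).
dlp = 400.0          # mGy*cm
k_chest = 0.014      # mSv / (mGy*cm), a commonly quoted adult chest factor (assumed here)
ctdi_vol = 11.0      # mGy
f_size = 1.3         # size factor for the phantom's effective diameter (assumed)

print("ED   = %.2f mSv" % effective_dose(dlp, k_chest))
print("SSDE = %.2f mGy" % ssde(ctdi_vol, f_size))
```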
2281 Expanding Affordable Housing through Inclusionary Zoning in the City of Toronto
Authors: Sam Moshaver
Abstract:
Reasonably priced and well-constructed housing must be an integral element supporting a healthy society. The absence of housing that everyone in society can afford negatively affects people's health, education, ability to get jobs, and ability to develop their community. Without access to decent housing, economic development, the integration of immigrants, and inclusiveness, society is negatively impacted. Canada has a sterling record in creating housing compared to many other nations around the globe. Canadian housing gets support from a mature and responsive mortgage network and a top-quality construction industry, as well as safe and excellent-quality building materials that are readily available. Yet 1.7 million Canadian households occupy substandard abodes. During the past hundred years, Canada's government has made a wide variety of attempts to provide decent residential facilities every Canadian can afford. Despite these laudable efforts, today Canada is left with housing that is inadequate for many Canadians. People who own their housing are given all kinds of privileges and perks, while people with relatively low incomes who rent their apartments or houses are discriminated against. To help solve these problems, zoning based on an "inclusionary" philosophy is a tool developed to help provide people with the affordable residences that they need. Now, thirty years after its introduction, this type of zoning has been shown effective in helping build and provide Canadians with houses or apartments they can afford to pay for. Using this form of zoning can have different results depending on where and how it is used. After examining Canadian affordable housing and four American cases where this type of zoning was enforced in the USA, this paper makes various recommendations for expanding Canadians' access to housing they can afford.
Keywords: Affordable Housing, Inclusionary Zoning, Low-Income Housing, Toronto Housing.
2280 Historical Landscape Affects Present Tree Density in Paddy Field
Authors: Ha T. Pham, Shuichi Miyagawa
Abstract:
Ongoing landscape transformation is one of the major causes behind the disappearance of traditional landscapes, and it leads to species and resource loss. Trees in paddy fields in the northeast of Thailand are one of those traditional landscapes. Using three different historical time layers, we documented the severe deforestation and rapid urbanization that took place in the region. Despite the general expectation of a decline in tree density as a consequence, the heterogeneous trend of changes in total tree density in the three studied landscapes contradicted the hypothesis that the number of trees in paddy fields depends on the length of land use practice. On the other hand, due to the selective planting of new trees on levees, the existence of trees in paddy fields now relies on their value for human use. In addition, changes in land use and landscape structure had a significant impact on the decision of which tree density level is considered suitable for the landscape.
Keywords: Aerial photographs, land use change, traditional landscape, tree in paddy fields.
2279 Trust Building Mechanisms for Electronic Business Networks and Their Relation to eSkills
Authors: Radoslav Delina, Michal Tkáč
Abstract:
Globalization, supported by information and communication technologies, changes the rules of competitiveness and increases the significance of information, knowledge and network cooperation. In line with this trend, the need for efficient trust-building tools has emerged. The absence of trust building mechanisms and strategies was identified within several studies. Through trust development, participation in e-business networks and the usage of network services will increase and provide SMEs with new economic benefits. This work is focused on the development of effective trust building strategies for electronic business network platforms. Based on the identification of trust building mechanisms, a questionnaire-based analysis of their significance and minimum level of requirements was conducted. In the paper, we confirm the dependency of trust on e-Skills, which play a crucial role in achieving a higher level of trust in more sophisticated and complex trust building ICT solutions.
Keywords: Correlation analysis, decision trees, e-marketplace, trust building.
2278 Fuzzy C-Means Clustering Algorithm for Voltage Stability in Large Power Systems
Authors: Mohamad R. Khaldi, Christine S. Khoury, Guy M. Naim
Abstract:
The steady-state operation of maintaining voltage stability is done by switching various controllers scattered all over the power network. When a contingency occurs, whether forced or unforced, the dispatcher must alleviate the problem with minimum time, cost, and effort. A persistent problem may lead to a blackout. The dispatcher must apply the appropriate switching of controllers, in terms of type, location, and size, to remove the contingency and maintain voltage stability. Wrong switching may worsen the problem, and that may lead to a blackout. This work proposes and uses Fuzzy C-Means Clustering (FCMC) to assist the dispatcher in decision making. FCMC is used in static voltage stability analysis to instantaneously map a contingency to a set of controllers, from which the types, locations, and amounts of switching are derived.
Keywords: Fuzzy logic, Power system control, Reactive power control, Voltage control.
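A compact sketch of the standard fuzzy C-means update equations (a generic illustration, not the authors' power-system implementation); the data and parameters are illustrative.

```python
import numpy as np

def fuzzy_c_means(X, n_clusters=3, m=2.0, n_iter=100, seed=0):
    """Standard FCM: returns cluster centers and the fuzzy membership matrix U."""
    rng = np.random.default_rng(seed)
    U = rng.random((X.shape[0], n_clusters))
    U /= U.sum(axis=1, keepdims=True)                # memberships of each sample sum to 1
    for _ in range(n_iter):
        Um = U ** m
        centers = (Um.T @ X) / Um.sum(axis=0)[:, None]            # weighted cluster centers
        dist = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2) + 1e-12
        U = 1.0 / (dist ** (2.0 / (m - 1.0)))                      # membership update
        U /= U.sum(axis=1, keepdims=True)
    return centers, U

# Illustrative two-dimensional operating points (e.g. features of bus voltage states).
rng = np.random.default_rng(1)
X = np.vstack([rng.normal(0, 0.1, (50, 2)), rng.normal(1, 0.1, (50, 2))])
centers, U = fuzzy_c_means(X, n_clusters=2)
print("Cluster centers:\n", centers.round(2))
```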
2277 Application of Spreadsheet and Queuing Network Model to Capacity Optimization in Product Development
Authors: Muhammad Marsudi, Dzuraidah Abdul Wahab, Che Hassan Che Haron
Abstract:
Modeling of a manufacturing system enables one to identify the effects of key design parameters on the system performance and, as a result, to make correct decisions. This paper proposes a manufacturing system modeling approach using a spreadsheet model based on queuing network theory, in which a static capacity planning model and a stochastic queuing model are integrated. The model was used to improve the existing system utilization in relation to product design. The model incorporates a few parameters such as utilization, cycle time, throughput, and batch size. The study also showed that the validity of the developed model is good enough for application: the maximum relative error is 10%, far below the limit value of 32%. Therefore, the model developed in this study is a valuable alternative for evaluating a manufacturing system.
Keywords: Manufacturing system, product design, spreadsheet model, utilization.
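As a minimal sketch of the kind of steady-state queuing relations such a spreadsheet model builds on (utilization, WIP and cycle time via an M/M/1 approximation and Little's law); the station data are hypothetical, not the paper's case study.

```python
def mm1_station(arrival_rate, service_rate):
    """Steady-state M/M/1 measures: utilization, WIP (L) and cycle time (W) via Little's law."""
    assert arrival_rate < service_rate, "station must be stable (rho < 1)"
    rho = arrival_rate / service_rate           # utilization
    wip = rho / (1.0 - rho)                     # average number in system, L
    cycle_time = wip / arrival_rate             # Little's law: W = L / lambda
    return rho, wip, cycle_time

# Hypothetical workstation: 8 jobs/hour arriving, capacity of 10 jobs/hour.
rho, wip, w = mm1_station(arrival_rate=8.0, service_rate=10.0)
print(f"utilization = {rho:.0%}, WIP = {wip:.1f} jobs, cycle time = {w:.2f} h, throughput = 8.0 jobs/h")
```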
2276 Scheduling a Flexible Flow Shops Problem using DEA
Authors: Fatemeh Dadkhah, Hossein Ali Akbarpour
Abstract:
This paper considers a scheduling problem in a flexible flow shop environment with the aim of minimizing two important criteria: makespan and cumulative tardiness of jobs. Since the proposed problem is known to be NP-hard in the literature, we have to develop a meta-heuristic to solve it. We considered the general structure of the Genetic Algorithm (GA) and developed a new version of it based on Data Envelopment Analysis (DEA). The two objective functions are treated as two different inputs for each Decision Making Unit (DMU). In this paper, we focused on the efficiency scores of DMUs and the efficient frontier concept of the DEA technique. After introducing the method, we defined two different scenarios considering two types of mutation operators. We also provided an experimental design with some computational results to show the performance of the algorithm. The results show that the algorithm runs in a reasonable time.
Keywords: Data envelopment analysis, Efficiency, Flexible flow shops, Genetic algorithm.
2275 Time Series Forecasting Using Various Deep Learning Models
Authors: Jimeng Shi, Mahek Jain, Giri Narasimhan
Abstract:
Time Series Forecasting (TSF) is used to predict the target variables at a future time point based on the learning from previous time points. To keep the problem tractable, learning methods use data from a fixed-length window in the past as an explicit input. In this paper, we study how the performance of predictive models changes as a function of different look-back window sizes and different amounts of time to predict into the future. We also consider the performance of the recent attention-based transformer models, which have had good success in the image processing and natural language processing domains. In all, we compare four different deep learning methods (Recurrent Neural Network (RNN), Long Short-term Memory (LSTM), Gated Recurrent Units (GRU), and Transformer) along with a baseline method. The dataset (hourly) we used is the Beijing Air Quality Dataset from the website of the University of California, Irvine (UCI), which includes a multivariate time series of many factors measured on an hourly basis for a period of 5 years (2010-14). For each model, we also report on the relationship between the performance and the look-back window sizes and the number of predicted time points into the future. Our experiments suggest that Transformer models have the best performance, with the lowest Mean Absolute Errors (MAE = 14.599, 23.273) and Root Mean Square Errors (RMSE = 23.573, 38.131) for most of our single-step and multi-step predictions. The best size for the look-back window to predict 1 hour into the future appears to be one day, while 2 or 4 days perform best to predict 3 hours into the future.
Keywords: Air quality prediction, deep learning algorithms, time series forecasting, look-back window.
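A small sketch of the look-back windowing and the two error metrics (MAE, RMSE) reported above; the series, window size, and persistence baseline are illustrative stand-ins, not the Beijing dataset or the paper's models.

```python
import numpy as np

def make_windows(series, look_back, horizon=1):
    """Slice a 1-D series into (X, y) pairs: X holds `look_back` past points,
    y the value `horizon` steps after the window."""
    X, y = [], []
    for i in range(len(series) - look_back - horizon + 1):
        X.append(series[i:i + look_back])
        y.append(series[i + look_back + horizon - 1])
    return np.array(X), np.array(y)

def mae(y_true, y_pred):
    return np.mean(np.abs(y_true - y_pred))

def rmse(y_true, y_pred):
    return np.sqrt(np.mean((y_true - y_pred) ** 2))

# Illustrative hourly series; a one-day (24 h) look-back predicting 1 h ahead.
series = np.sin(np.arange(500) / 24.0)
X, y = make_windows(series, look_back=24, horizon=1)
naive = X[:, -1]                       # persistence baseline: repeat the last observed value
print("windows:", X.shape, "MAE = %.3f" % mae(y, naive), "RMSE = %.3f" % rmse(y, naive))
```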
2274 High-Rises and Urban Design: The Reasons for Unsuccessful Placemaking with Residential High-Rises in England
Authors: E. Kalcheva, A. Taki, Y. Hadi
Abstract:
High-rises and placemaking are an understudied combination which receives more and more interest with the proliferation of this typology in many British cities. The reason for studying three major cities in England (London, Birmingham and Manchester) is to learn from the latest advances in urban design in well-developed and prominent urban environments. The analysis of several high-rise sites reveals the weaknesses in the urban design of contemporary British cities and presents an opportunity to learn from the implemented examples. Therefore, the purpose of this research is to analyze design approaches towards creating a sustainable and varied urban environment when high-rises are involved. The research questions raised by the study are: what is the quality of the high-rises and their surroundings; what facilities and features are deployed in the research area; what is the role of the high-rise buildings in the placemaking process; and what urban design principles are applicable in this context. The methodology utilizes observation of the researched area through structured questions developed by the author to evaluate the outdoor qualities of the high-rise surroundings. In this context, the paper argues that the quality of the public realm around the high-rises is quite low, missing basic but vital elements such as plazas, public art, and seating, along with landscaping and pocket parks. There is a lack of coherence, the rhythm of the streets is often disrupted, and even though the high-rises are very aesthetically appealing, they fail to create a sense of place on their own. The implications of the study are that future planning can take into consideration the critique in this article and provide more opportunities for urban design interventions around high-rise buildings in British cities.
Keywords: High-rises, placemaking, urban design, townscape.
2273 Classification Based on Deep Neural Cellular Automata Model
Authors: Yasser F. Hassan
Abstract:
Deep learning is a branch of machine learning science that has achieved great success in research and applications. Cellular neural networks are regarded as arrays of nonlinear analog processors, called cells, connected in a way that allows parallel computations. The paper discusses how to use a deep learning structure to represent a neural cellular automata model. The proposed learning technique in the cellular automata model is examined from the perspective of deep learning structure. A deep neural cellular automata system modifies each neuron based on the behavior of the individual and its decision as a result of multi-level deep structure learning. The paper presents the architecture of the model, and the results of the simulation of the approach are given. Results from the implementation enrich the deep neural cellular automata system and shed light on the concept formulation of the model and the learning in it.
Keywords: Cellular automata, neural cellular automata, deep learning, classification.
2272 An Overview of Nano-Particles Effect on Mechanical Properties of Composites
Authors: Olatunde I. Sekunowo, Stephen I. Durowaye, Ganiyu I. Lawal
Abstract:
Composites, depending on the nature of their constituents and mode of production, are regarded as one of the advanced materials that drive today’s technology. This paper attempts a short review of the subject matter with the general aim of pushing the frontier of knowledge to the next level as it impacts the technology of nano-particle manufacturing. The objectives entail an effort to aggregate recent research efforts in this field, analyse research findings and observations, streamline research efforts, and support industry in taking decisions on areas of fund deployment. It is envisaged that this work will serve as a quick hands-on compendium for researchers in this field and a guide to relevant government departments wishing to fund research whose outcomes have the potential of improving the nation’s GDP.
Keywords: Advanced materials, Composites, Mechanical properties, Nano-particles.
2271 2.5D Face Recognition Using Gabor Discrete Cosine Transform
Authors: Ali Cheraghian, Farshid Hajati, Soheila Gheisari, Yongsheng Gao
Abstract:
In this paper, we present a novel 2.5D face recognition method based on the Gabor Discrete Cosine Transform (GDCT). In the proposed method, the Gabor filter is applied to extract feature vectors from the texture and the depth information. Then, the Discrete Cosine Transform (DCT) is used for dimensionality and redundancy reduction to improve computational efficiency. The system combines texture and depth information at the decision level, which yields higher performance compared to methods that use texture and depth information separately. The proposed algorithm is evaluated on the publicly available Bosphorus database, which includes models with pose variation. The experimental results show that the proposed method has a higher performance compared to the benchmark.
Keywords: Gabor filter, discrete cosine transform, 2.5D face recognition, pose.
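A rough sketch of a Gabor-plus-DCT feature pipeline of the kind described (single channel only, using generic library filters rather than the authors' filter bank); all parameter values, the patch, and the coefficient cut-off are illustrative assumptions.

```python
import numpy as np
from scipy.fft import dctn
from skimage.filters import gabor

def gabor_dct_features(image, frequencies=(0.1, 0.2, 0.4), keep=8):
    """Filter the image at a few Gabor frequencies, then keep the top-left
    keep x keep 2-D DCT coefficients of each response as a compact feature vector."""
    feats = []
    for f in frequencies:
        real, _ = gabor(image, frequency=f)           # real part of the Gabor response
        coeffs = dctn(real, norm="ortho")             # 2-D DCT for energy compaction
        feats.append(coeffs[:keep, :keep].ravel())    # low-frequency block only
    return np.concatenate(feats)

# Illustrative 64x64 texture (or depth) patch.
rng = np.random.default_rng(0)
patch = rng.random((64, 64))
vec = gabor_dct_features(patch)
print("feature vector length:", vec.shape[0])         # 3 frequencies x 8 x 8 = 192
```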
2270 Assessment of Pollution Reduction
Authors: Katarzyna Strzała-Osuch
Abstract:
Environmental investments, including ecological projects relating to the protection of the atmosphere, are a necessity today. However, investing in the environment should be based on rational management rules. This raises the problem of selecting a method to assess the substances reduced during projects. Therefore, a method allowing for the assessment of decision rationality has to be found. The purpose of this article is to present and systematise pollution reduction assessment methods and to illustrate theoretical analyses with empirical data. The empirical results confirm the theoretical considerations, which showed that the only method for judging pollution reduction that is free of apparent disadvantages is the Eco 99-ratio method. To make decisions on environmental projects, financing institutions should take into account a rationality rule. Therefore the Eco 99-ratio method could be applied to make decisions relating to environmental investments in the area of air protection.
Keywords: Assessment of pollution reduction, costs of environmental protection, efficiency of environmental investments.
2269 Long-term Monitor of Seawater by using TiO2:Ru Sensing Electrode for Hard Clam Cultivation
Authors: Jung-Chuan Chou, Cheng-Wei Chen
Abstract:
The hard clam (Meretrix lusoria) cultivation industry has developed vigorously in recent years in Taiwan, and seawater quality determines the cultivation environment. Variation in pH directly affects the survival rate of Meretrix lusoria. In order to monitor seawater quality, a solid-state sensing electrode of ruthenium-doped titanium dioxide (TiO2:Ru) is developed to measure hydrogen ion concentration in different cultivation solutions. Because the TiO2:Ru sensing electrode has high chemical stability and superior sensing characteristics, it is applied as a pH sensor. Response voltages of the TiO2:Ru sensing electrode are read out by an instrumentation amplifier in different sample solutions. The mean sensitivity and linearity of the TiO2:Ru sensing electrode are 55.20 mV/pH and 0.999 from pH 1 to pH 13, respectively. We expect that the TiO2:Ru sensing electrode can be applied to real-environment measurement; therefore, we collected two sample solutions from different Meretrix lusoria cultivation ponds in Yunlin, Taiwan. The two sample solutions were both measured for 200 seconds after calibration with standard pH buffer solutions (pH 7, pH 8 and pH 9). The mean response voltages of sample 1 and sample 2 are -178.758 mV (standard deviation = 0.427 mV) and -180.206 mV (standard deviation = 0.399 mV), respectively. The response voltages of the two sample solutions correspond to values between pH 8 and pH 9, which fall within the weakly alkaline range suitable for Meretrix lusoria growth. For long-term monitoring, the drifts of the cultivation solutions (sample 1 and sample 2) are 1.16 mV/hour and 1.03 mV/hour, respectively.
Keywords: Co-sputtering system, Hard clam (meretrix lusoria), Ruthenium-doped titanium dioxide, Solid-state sensing electrode.
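A small numeric sketch of converting the reported response voltages to pH using the stated mean sensitivity of 55.20 mV/pH; the calibration point (reference voltage at pH 7) and the sign convention are assumed placeholders, since the abstract does not give them.

```python
SENSITIVITY_MV_PER_PH = 55.20     # mean sensitivity reported in the abstract

def voltage_to_ph(v_mv, v_ref_mv, ph_ref):
    """Linear conversion around a calibration point (v_ref_mv, ph_ref),
    assuming the response voltage decreases as pH increases."""
    return ph_ref - (v_mv - v_ref_mv) / SENSITIVITY_MV_PER_PH

# Assumed calibration point: -120.0 mV at pH 7 (placeholder, not from the paper).
for name, v in [("sample 1", -178.758), ("sample 2", -180.206)]:
    print(name, "-> pH %.2f" % voltage_to_ph(v, v_ref_mv=-120.0, ph_ref=7.0))
```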
2268 Inquiry on the Improvement Teaching Quality in the Classroom with Meta-Teaching Skills
Authors: Shahlan Surat, Saemah Rahman, Saadiah Kummin
Abstract:
When teachers reflect on and evaluate whether their teaching methods actually have an impact on students’ learning, they will adjust their practices accordingly. This inevitably improves their students’ learning and performance. The meta-teaching approach can invigorate and create a passion for teaching. It thus helps to increase commitment to, and love for, the teaching profession. This study was conducted to determine the level of metacognitive thinking of teachers in the process of teaching and learning in the classroom. Teachers’ metacognitive thinking includes the use of metacognitive knowledge, which consists of different types of knowledge: declarative, procedural and conditional. The ability of the teachers to plan, monitor and evaluate the teaching process can also be determined. This study was conducted on 377 graduate teachers in the Klang Valley, Malaysia. The stratified sampling method was selected for the purpose of this study. The metacognitive teaching inventory, consisting of 24 items, is called InKePMG (Teacher Indicators of Effectiveness Meta-Teaching). The results showed that the mean level is high for two components of metacognitive knowledge, declarative knowledge (mean = 4.16) and conditional knowledge (mean = 4.11), whereas the mean of procedural knowledge is 4.00 (moderately high). Similarly, the levels of knowledge in monitoring (mean = 4.11) and evaluating (mean = 4.00) indicate high scores, while planning (mean = 4.00) is moderately high among teachers. In conclusion, this study shows that planning and procedural knowledge are important elements in improving the quality of teachers’ teaching in the classroom. Thus, the researcher recommends that further studies focus on training programs for teachers on metacognitive skills and on developing creative thinking among teachers.
Keywords: Metacognitive thinking skills, procedural knowledge, conditional knowledge, declarative knowledge, meta-teaching and regulation of cognition.
2267 An Intelligent Optimization Model for Multi-objective Order Allocation Planning
Authors: W. K. Wong, Z. X. Guo, P.Y. Mok
Abstract:
This paper presents a multi-objective order allocation planning problem with the consideration of various real-world production features. A novel hybrid intelligent optimization model, integrating a multi-objective memetic optimization process, a Monte Carlo simulation technique and a heuristic pruning technique, is proposed to handle this problem. Experiments based on industrial data are conducted to validate the proposed model. Results show that the proposed model can effectively solve the investigated problem by providing effective production decision-making solutions, which outperform an NSGA-II-based optimization process and an industrial method.
Keywords: Multi-objective order allocation planning, Pareto optimization, Memetic algorithm, Monte Carlo simulation.
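Since the model searches for Pareto-optimal order allocations, a minimal sketch of the Pareto-dominance test used to filter non-dominated solutions is given below (a generic illustration for minimization objectives, not the authors' memetic operators); the objective values are made up.

```python
import numpy as np

def dominates(a, b):
    """True if solution a Pareto-dominates b (all objectives <=, at least one strictly <)."""
    a, b = np.asarray(a), np.asarray(b)
    return np.all(a <= b) and np.any(a < b)

def non_dominated(points):
    """Return the indices of the Pareto front for a set of objective vectors (minimization)."""
    front = []
    for i, p in enumerate(points):
        if not any(dominates(q, p) for j, q in enumerate(points) if j != i):
            front.append(i)
    return front

# Illustrative bi-objective values, e.g. (total tardiness, total cost) of candidate plans.
objs = np.array([[10, 5], [8, 7], [12, 4], [9, 6], [11, 8]])
print("Pareto-front indices:", non_dominated(objs))
```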
2266 Detecting Major Misconceptions about Employment in ICT: A Study of the Myths about ICT Work among Females
Authors: Eneli Kindsiko, Kulno Türk
Abstract:
The purpose of the current article is to reveal the misconceptions about ICT occupations that keep females away from the field. The study focuses on the three phases of one’s career life cycle, pre-university, university and workplace, with the aim of investigating how to attract more females into an ICT-related career. By studying nearly 300 secondary school graduates, 102 university students and 18 female ICT specialists, the study revealed six myths that influence the decision-making process of young girls in pursuing an ICT-related education and career. Furthermore, the discriminating conception of ICT as primarily a man’s world is developed before the university period. Stereotypical barriers should be brought out into public debate so that a remarkable proportion of possible employees (women) do not stay away from tech-related fields. Countries could make a remarkable leap in efficiency by turning their attention to gender-related issues in the labour market structure.
Keywords: ICT, women, education, stereotypes, computers.
2265 Methodologies for Crack Initiation in Welded Joints Applied to Inspection Planning
Authors: Guang Zou, Kian Banisoleiman, Arturo González
Abstract:
Crack initiation and propagation threaten the structural integrity of welded joints, and inspections are normally assigned based on crack propagation models. However, the approach based on crack propagation models may not be applicable for some high-quality welded joints, because the initial flaws in them may be so small that it may take a long time for the flaws to develop into a detectable size. This raises a concern regarding the inspection planning of high-quality welded joints, as there is no generally accepted approach for modeling the whole fatigue process that includes the crack initiation period. In order to address the issue, this paper reviews treatment methods for the crack initiation period and initial crack size in crack propagation models applied to inspection planning. Generally, there are four approaches: 1) neglecting the crack initiation period and fitting a probabilistic distribution for initial crack size based on statistical data; 2) extrapolating the crack propagation stage to a very small fictitious initial crack size, so that the whole fatigue process can be modeled by crack propagation models; 3) assuming a fixed detectable initial crack size and fitting a probabilistic distribution for crack initiation time based on specimen tests; and 4) modeling the crack initiation and propagation stages separately using small crack growth theories and the Paris law or similar models. The conclusion is that, in view of the trade-off between accuracy and computational effort, calibration of a small fictitious initial crack size to S-N curves is the most efficient approach.
Keywords: Crack initiation, fatigue reliability, inspection planning, welded joints.
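For orientation, a minimal sketch of the Paris-law crack-propagation integration that approaches 2) and 4) above build on; the material constants, geometry factor and crack sizes are generic placeholders, not values from the paper.

```python
import numpy as np

def cycles_to_failure(a0_m, ac_m, delta_sigma_mpa, C=1e-11, m=3.0, Y=1.0, da=1e-5):
    """Numerically integrate Paris' law da/dN = C * (dK)^m from initial crack a0 to critical size ac.
    dK = Y * dSigma * sqrt(pi * a) in MPa*sqrt(m); C and m are placeholder material constants."""
    a = a0_m
    cycles = 0.0
    while a < ac_m:
        dK = Y * delta_sigma_mpa * np.sqrt(np.pi * a)
        cycles += da / (C * dK ** m)      # dN = da / (C * dK^m)
        a += da
    return cycles

# Hypothetical case: a 0.1 mm fictitious initial crack growing to 10 mm under a 100 MPa stress range.
print("N = %.2e cycles" % cycles_to_failure(a0_m=1e-4, ac_m=1e-2, delta_sigma_mpa=100.0))
```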
2264 Assuming the Decision of Having One (More) Child: The New Dimensions of the Post Communist Romanian Family
Authors: Raluca-Ioana Horea-Şerban, Marinela Istrate
Abstract:
The first part of the paper analyzes the dynamics of the total fertility rate at both the national and regional levels, pointing out the regional disparities in the distribution of this indicator. At the same time, we also focus on the collapse of the number of live births, on the changes in the fertility rate by birth rank, as well as on the failure to achieve the desired number of children. The second part of the study centres upon a survey applied to urban families with 3 or more offspring. The preliminary analysis highlights the fact that increased fertility (beyond the 3rd rank) is triggered by the parents’ above-average material condition and superior education. The current situation of Romania, which is still passing through a period of relatively rapid demographic changes marked by numerous convulsions, requires a new approach, in line with the recent interpretations appropriate to a new post-transitional demographic regime.
Keywords: Family size intention, fertility rate, regional disparities, third birth rank.
2263 Effect of Soaking Period of Clay on Its California Bearing Ratio Value
Authors: Robert G. Nini
Abstract:
The quality of road pavement is affected mostly by the type of sub-grade, which acts as the road foundation. Road degradation is related to many factors, especially the climatic conditions and the quality and thickness of the base materials. The thickness of this layer depends on its California Bearing Ratio (CBR) test value, which in turn is highly affected by the quantity of water infiltrating under the road after heavy rain. The capacity of the base material to drain out its water is a predominant factor, because any change in moisture content causes a change in sub-grade strength. This paper studies the effect of the soaking period of soil, especially clay, on its CBR value. For this reason, we collected many clayey samples in order to study the effect of the soaking period on their CBR values. On each soil, two groups of experiments were performed: main tests, consisting of the Proctor and CBR tests, and identification tests, such as the Atterberg limits tests. Each soil sample was first subjected to the Proctor test in order to find its optimum moisture content, which was then used to perform the CBR test. Four CBR tests were performed on each soil with different soaking periods. The first CBR test was done without soaking the soil sample; the second with two days of soaking, the third with four days of soaking, and the last with eight days of soaking. By comparing the results of CBR tests performed with different soaking times, a more detailed understanding was gained of the role of water in reducing the CBR of soil. In fact, as the soaking period was extended, the CBR was found to decrease quickly during the first two days and more slowly afterwards. A precise reduction factor of the CBR in relation to the soaking period is given at the end of this paper.
Keywords: California bearing ratio, clay, proctor test, soaking period, sub-grade.
2262 Remaining Useful Life Prediction Using Elliptical Basis Function Network and Markov Chain
Authors: Yi Yu, Lin Ma, Yong Sun, Yuantong Gu
Abstract:
This paper presents a novel method for remaining useful life prediction using the Elliptical Basis Function (EBF) network and a Markov chain. The EBF structure is trained by a modified Expectation-Maximization (EM) algorithm in order to take into account the missing covariate set. No explicit extrapolation is needed for internal covariates, while a Markov chain is constructed to represent the evolution of external covariates in the study. The estimated external covariates and the unknown internal covariates constitute an incomplete covariate set, which is then used and analyzed by the EBF network to provide survival information for the asset. It is shown in the case study that the method slightly underestimates the remaining useful life of an asset, which is a desirable result for early maintenance decisions and resource planning.
Keywords: Elliptical Basis Function Network, Markov Chain, Missing Covariates, Remaining Useful Life.
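A small sketch of how the evolution of an external covariate can be represented by a discrete Markov chain, as the abstract describes; the states and transition matrix are hypothetical, not the study's data.

```python
import numpy as np

# Hypothetical external covariate (e.g. operating load) with three discrete states.
states = ["low", "medium", "high"]
P = np.array([[0.7, 0.2, 0.1],      # row i: transition probabilities out of state i
              [0.3, 0.5, 0.2],
              [0.1, 0.3, 0.6]])

def simulate_chain(P, start, steps, seed=0):
    """Sample a state trajectory of the Markov chain for `steps` transitions."""
    rng = np.random.default_rng(seed)
    path = [start]
    for _ in range(steps):
        path.append(rng.choice(len(P), p=P[path[-1]]))
    return path

traj = simulate_chain(P, start=0, steps=10)
print(" -> ".join(states[s] for s in traj))

# Long-run (stationary) distribution from the dominant left eigenvector of P.
vals, vecs = np.linalg.eig(P.T)
pi = np.real(vecs[:, np.argmax(np.real(vals))])
print("stationary distribution:", np.round(pi / pi.sum(), 3))
```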
2261 Spatial Distribution and Risk Assessment of As, Hg, Co and Cr in Kaveh Industrial City, using Geostatistic and GIS
Authors: Abbas Hani
Abstract:
The concentrations of As, Hg, Co, Cr and Cd were tested for each soil sample, and their spatial patterns were analyzed by the semivariogram approach of geostatistics and geographical information system technology. Multivariate statistical approaches (principal component analysis and cluster analysis) were used to identify heavy metal sources and their spatial patterns. Principal component analysis, coupled with the correlations between heavy metals, showed that the primary inputs of As, Hg and Cd were anthropogenic, while Co and Cr were associated with pedogenic factors. Ordinary kriging was carried out to map the spatial patterns of the heavy metals. The high pollution sources evaluated were related to the usage of urban and industrial wastewater. The results of this study are helpful for the risk assessment of environmental pollution and for decision making on industrial adjustment and the remediation of soil pollution.
Keywords: Geographic Information system, Geostatistics, Kaveh, Multivariate Statistical Analysis.
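A minimal sketch of the empirical semivariogram computation that underlies the kriging step mentioned above; the sample coordinates and concentrations are synthetic placeholders, not the Kaveh data.

```python
import numpy as np
from scipy.spatial.distance import pdist

def empirical_semivariogram(coords, values, n_bins=10):
    """gamma(h) = 0.5 * mean[(z_i - z_j)^2] over point pairs grouped by separation distance h."""
    d = pdist(coords)                                        # pairwise distances
    g = 0.5 * pdist(values[:, None], metric="sqeuclidean")   # half squared value differences
    bins = np.linspace(0, d.max(), n_bins + 1)
    idx = np.digitize(d, bins) - 1
    h = np.array([d[idx == k].mean() for k in range(n_bins) if np.any(idx == k)])
    gamma = np.array([g[idx == k].mean() for k in range(n_bins) if np.any(idx == k)])
    return h, gamma

# Synthetic soil-sample locations (km) and metal concentrations (mg/kg).
rng = np.random.default_rng(0)
coords = rng.uniform(0, 10, (80, 2))
values = 5 + 0.5 * coords[:, 0] + rng.normal(0, 0.3, 80)     # mild spatial trend plus noise
h, gamma = empirical_semivariogram(coords, values)
print(np.round(h, 2), np.round(gamma, 3))
```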
2260 Ensembling Classifiers – An Application to Image Data Classification from Cherenkov Telescope Experiment
Authors: Praveen Boinee, Alessandro De Angelis, Gian Luca Foresti
Abstract:
Ensemble learning algorithms such as AdaBoost and Bagging have been in active research and have shown improvements in classification results for several benchmark data sets, mainly with decision trees as their base classifiers. In this paper, we experiment with applying these meta-learning techniques to classifiers such as random forests, neural networks and support vector machines. The data sets are from MAGIC, a Cherenkov telescope experiment. The task is to classify gamma signals against the overwhelming hadron and muon signals, representing a rare-class classification problem. We compare the individual classifiers with their ensemble counterparts and discuss the results. WEKA, a wonderful tool for machine learning, has been used for the experiments.
Keywords: Ensembles, WEKA, Neural networks [NN], Support Vector Machines [SVM], Random Forests [RF].
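The paper runs its experiments in WEKA; purely as an analogous sketch in Python, the same bagging/boosting wrappers can be set up with scikit-learn. The dataset below is a synthetic imbalanced stand-in, not the MAGIC data.

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import AdaBoostClassifier, BaggingClassifier, RandomForestClassifier
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVC

# Synthetic imbalanced stand-in for the gamma/hadron separation task.
X, y = make_classification(n_samples=2000, n_features=10, weights=[0.9, 0.1], random_state=0)

models = {
    "AdaBoost (trees)": AdaBoostClassifier(n_estimators=100, random_state=0),
    "Bagged SVM": BaggingClassifier(SVC(), n_estimators=10, random_state=0),
    "Random forest": RandomForestClassifier(n_estimators=100, random_state=0),
}
for name, model in models.items():
    scores = cross_val_score(model, X, y, cv=5, scoring="roc_auc")
    print(f"{name:16s} AUC = {scores.mean():.3f} +/- {scores.std():.3f}")
```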
2259 Design Optimization of Ferrocement-Laminated Plate Using Genetic Algorithm
Authors: M. Rokonuzzaman, Z. Gürdal
Abstract:
This paper describes the design optimization of a ferrocement-laminated plate made up of reinforcing steel wire mesh(es) and cement mortar. To improve the design process, the plate is modeled as a multi-layer medium, dividing the ferrocement plate into layers of mortar and ferrocement. The mortar layers are assumed to be isotropic in nature and the ferrocement layers are assumed to be orthotropic. The ferrocement layers are a little stiffer, but much more costly, than the mortar layers due to the presence of the steel wire mesh. The optimization is performed for the minimum weight design of the laminate using a genetic algorithm. The optimum designs are discussed for different plate configurations and loadings, and they are compared with the worst designs obtained at the final generation. The paper provides a procedure for designers in the decision-making process.
Keywords: Buckling, Ferrocement-Laminated Plate, Genetic Algorithm, Plate Theory.
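A generic genetic-algorithm skeleton for a constrained minimum-weight layup search, for orientation only; the layer catalogue, weights, stiffness values and constraint are made-up placeholders, not the paper's plate or buckling model.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical 6-layer laminate: gene 1 = ferrocement (mesh) layer, 0 = plain mortar layer.
N_LAYERS, POP, GENS = 6, 40, 60
W_MORTAR, W_FERRO = 1.0, 1.4          # relative layer weights (placeholders)
K_MORTAR, K_FERRO = 1.0, 2.5          # relative layer stiffness contributions (placeholders)
K_REQUIRED = 8.0                      # required total stiffness (stands in for a buckling constraint)

def fitness(chrom):
    """Penalized weight: total weight plus a large penalty if the stiffness constraint is violated."""
    weight = np.where(chrom == 1, W_FERRO, W_MORTAR).sum()
    stiffness = np.where(chrom == 1, K_FERRO, K_MORTAR).sum()
    penalty = 100.0 * max(0.0, K_REQUIRED - stiffness)
    return weight + penalty

pop = rng.integers(0, 2, (POP, N_LAYERS))
for _ in range(GENS):
    scores = np.array([fitness(c) for c in pop])
    parents = pop[np.argsort(scores)[:POP // 2]]               # truncation selection
    cut = rng.integers(1, N_LAYERS, POP)                        # one-point crossover cuts
    mates = parents[rng.integers(0, len(parents), (POP, 2))]
    children = np.array([np.r_[a[:c], b[c:]] for (a, b), c in zip(mates, cut)])
    flip = rng.random(children.shape) < 0.05                    # bit-flip mutation
    pop = np.where(flip, 1 - children, children)

best = min(pop, key=fitness)
print("best layup:", best, "weight + penalty = %.2f" % fitness(best))
```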
2258 Bayesian Network Based Intelligent Pediatric System
Authors: Jagmohan Mago, Parvinder S. Sandhu, Neeru Chawla
Abstract:
In this paper, a Bayesian Network (BN) based system is presented for providing clinical decision support to healthcare practitioners in rural or remote areas of India for young infants or children up to the age of 5 years. The government is unable to appoint child specialists in rural areas because of the inadequate number of available pediatricians. This leads to a high Infant Mortality Rate (IMR). In such a scenario, an intelligent pediatric system provides a realistic solution. A prototype of the intelligent system has been developed that involves a knowledge component called the Intelligent Pediatric Assistant (IPA) and User Agents (UA) along with their Graphical User Interfaces (GUI). The GUI of the UA provides the interface through which the healthcare practitioner submits signs and symptoms and views the expert opinion suggested by the IPA. Depending upon the observations, the IPA decides the diagnosis and the treatment plan. The UA and IPA form a client-server architecture for knowledge sharing.
Keywords: Bayesian network, intelligent pediatric system.