Search results for: Final MC
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 708

678 The Effect of Material Properties and Volumetric Changes in Phase Transformation on the Final Residual Stress of the Welding Process

Authors: Djarot B. Darmadi

Abstract:

The growing application of the Finite Element Method (FEM) is driven by its benefits of cost saving and environmental friendliness; FEM also allows a deep understanding of particular phenomena to be achieved. This paper observes the role of material property changes and of the volumetric change occurring when Solid State Phase Transformation (SSPT) takes place in residual stress formation during the welding of ferritic steels, through coupled Thermo-Metallurgy-Mechanical (TMM) analysis. The correctness of the FEM residual stress prediction was validated by experiment. From a parametric study of the FEM model, it can be concluded that the material property changes tend to over-predict residual stress in the weld center, whilst the volumetric change tends to underestimate it. The best final result is a compromise of both, obtained by incorporating them together in the model, which gives a better result than a model without SSPT.

Keywords: Residual stress, ferritic steels, SSPT, coupled-TMM.

677 Predicting the Life Cycle of Complex Technical Systems (CTS)

Authors: Khalil A. Yaghi, Samer Barakat

Abstract:

Complex systems are composed of several simple, interacting, independent entities. Interaction between these entities creates a unified behavior at the global level that cannot be predicted by examining the behavior of any single component of the system. In this paper, we consider the welded frame of an automobile trailer as a real example of a Complex Technical System. The purpose of this paper is to introduce a statistical method for predicting the life cycle of complex technical systems. To organize the gathering of primary data for modeling the life cycle of complex technical systems, an automobile trailer frame was used as a prototype in this research. The prototype represents a welded structure of several pieces. The gathered information flows underwent computerized analysis and classification to produce final results and recommendations for improving the trailer structure and its operational conditions.

Keywords: Complex Technical System (CTS), Automobile Trailer Frame, Automobile Service.

676 Empirical Study from Final Exams of Computer Science Courses Demystifying the Notion of 'an Average Software Engineer'

Authors: Alex Elentukh

Abstract:

The paper is based on data collected from final exams administered during five years of teaching a graduate course in software engineering. A visualization instrument with four distinct personas has been used to improve the effectiveness of each class. The study offers a plethora of clues about students' behavioral preferences. Diversity among students (professional background, physical proximity) is too significant to assume a single face of a learner. This is particularly true for a body of online graduate students in computer science. The conclusions of the study (each learner is unique and each class is unique) are extrapolated to demystify the notion of an 'average software engineer'. An immediate direction for an educator is to ensure a course applies to a wide audience of very different individuals. On the other hand, a student should be clear about his or her abilities and preferences, so as to follow the most effective learning path.

Keywords: K.3.2 computer & information science education, learner profiling, adaptive learning, software engineering.

675 Comparative Study of Pasting Properties of High Fibre Plantain Based Flour Intended for Diabetic Food (Fufu)

Authors: C. C. Okafor, E. E. Ugwu

Abstract:

A comparative study on the feasibility of producing instant high-fibre plantain flour for diabetic fufu was carried out by blending soy residue with different plantain (Musa spp.) varieties (Horn, False Horn and French), all sieved at 60 mesh and mixed in a ratio of 60:40; the blends were analyzed for their pasting properties using standard analytical methods. Results show that VIIIS60 had the highest peak viscosity (303.75 RVU), trough value (182.08 RVU) and final viscosity (284.50 RVU), the lowest breakdown viscosity (79.58 RVU), setback value (88.17 RVU), peak time (4.36 min) and pasting temperature (81.18°C), and differed significantly (p < 0.05) from the other samples. VIS60 had the lowest peak viscosity (192.25 RVU), trough value (112.67 RVU) and final viscosity (211.92 RVU), but the highest breakdown viscosity (121.61 RVU), peak time (4.66 min) and pasting temperature (82.35°C), and differed significantly (p < 0.05) from the other samples. VIIS60 had intermediate values for peak viscosity (236.67 RVU), trough value (116.58 RVU), breakdown viscosity (120.08 RVU), setback viscosity (167.92 RVU), peak time (4.39 min) and pasting temperature (81.44°C), and differed significantly (p < 0.05) from the other samples. The high final viscosity and low setback values of the French variety blended with soy residue at 60 mesh particle size recommend this variety and fibre composition as optimum for the production of an instant plantain/soy-residue flour blend for diabetic fufu.

Keywords: Plantain, soy residue, pasting properties, particle size.

674 Effects of Formic Acid on the Chemical State and Morphology of As-synthesized and Annealed ZnO Films

Authors: Chueh-Jung Huang, Chia-Hung Li, Hsueh-Lung Wang, Tsun-Nan Lin

Abstract:

Zinc oxide thin films with various microstructures were grown on substrates using HCOOH-sols. The reaction mechanism of the sol system was investigated by performing an XPS analysis of the as-synthesized films, since the products of hydrolysis and condensation in the sol system contribute to the chemical state of the as-synthesized films. The relation between the chemical structures of the as-synthesized films and the microstructures of the final annealed films was also studied. The Zn 2p3/2, C 1s and O 1s XPS patterns indicate that the hydrolysis reaction in the sol system is strongly influenced by the HCOOH agent. The XRD and FE-SEM results demonstrate that the microstructures of the annealed films are related to the content of hydrolyzed Zn-OH species present, that the content of Zn-OH species in the sol system increases with HCOOH addition, and that the Zn-OH species existing in the sol phase are responsible for the large ZnO crystallites in the final annealed films.

Keywords: zinc oxide, hydrolysis catalyst, zinc acetate source, formic acid.

673 Co-composting Cow Manure with Food Waste: The Influence of Lipids Content

Authors: Neves, L., Ferreira, V., Oliveira, R.

Abstract:

The addition of an oily waste to a co-composting process of dairy cow manure with food waste, and its influence on the final product, was evaluated. Three static composting piles with different substrate concentrations were assessed. Sawdust was added to all composting piles to attain 60% humidity at the beginning of the process. In pile 1, the co-substrates were the solid phase of dairy cow manure, food waste and sawdust as bulking agent. In piles 2 and 3, there was an extra input of oily waste of 7 and 11% of the total volume, respectively, corresponding to 18 and 28% in dry weight. The results showed that the co-composting process was feasible even at the highest fat content. Another effect of the oily waste addition was a requirement for extra humidity, due to the hydrophobic properties of this specific waste, which may imply a reduced need for bulking agent. Moreover, this study shows that composting can be a feasible way of adding value to fatty wastes. The three final composts presented very similar properties, suitable for land application.

Keywords: Cow manure, composting, food waste, lipids content.

672 Application of LSB Based Steganographic Technique for 8-bit Color Images

Authors: Mamta Juneja, Parvinder S. Sandhu, Ekta Walia

Abstract:

Steganography is the process of hiding one file inside another such that others can neither identify the meaning of the embedded object nor even recognize its existence. Current trends favor using digital image files as the cover file to hide another digital file that contains the secret message or information. One of the most common methods of implementation is Least Significant Bit (LSB) insertion, in which the least significant bit of every byte is altered to form the bit string representing the embedded file. Altering the LSB causes only minor changes in color and is thus usually not noticeable to the human eye. While this technique works well for 24-bit color image files, steganography has not been as successful with 8-bit color image files, due to limitations in color variations and the use of a colormap. This paper presents the results of research investigating the combination of image compression and steganography. The technique developed starts with a 24-bit color bitmap file, then compresses the file by organizing and optimizing an 8-bit colormap. After compression, a text message is hidden in the final, compressed image. Results indicate that the final technique has the potential to be useful in the steganographic world.
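
As a rough illustration of the insertion step, the following Python sketch embeds and recovers a message in the least significant bits of raw cover bytes; it is a minimal sketch of plain LSB insertion only, and the paper's colormap-compression stage is not reproduced (all names and data here are illustrative).

```python
def embed_lsb(cover: bytearray, message: bytes) -> bytearray:
    """Write the message bits, MSB first, into the LSB of successive cover bytes."""
    bits = [(byte >> i) & 1 for byte in message for i in range(7, -1, -1)]
    if len(bits) > len(cover):
        raise ValueError("cover too small for message")
    stego = bytearray(cover)
    for i, bit in enumerate(bits):
        stego[i] = (stego[i] & 0xFE) | bit  # overwrite only the least significant bit
    return stego

def extract_lsb(stego: bytearray, n_bytes: int) -> bytes:
    bits = [b & 1 for b in stego[:n_bytes * 8]]
    return bytes(sum(bit << (7 - i) for i, bit in enumerate(bits[k:k + 8]))
                 for k in range(0, len(bits), 8))

cover = bytearray(range(256)) * 2      # stand-in for 8-bit pixel data
stego = embed_lsb(cover, b"hi")
assert extract_lsb(stego, 2) == b"hi"  # message recovered; each byte changed by at most 1
```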

Keywords: Compression, Colormap, Encryption, Steganography and LSB Insertion.

671 A Sentence-to-Sentence Relation Network for Recognizing Textual Entailment

Authors: Isaac K. E. Ampomah, Seong-Bae Park, Sang-Jo Lee

Abstract:

Over the past decade, there have been promising developments in Natural Language Processing (NLP), with several investigations of approaches to Recognizing Textual Entailment (RTE). These include models based on lexical similarity, models based on formal reasoning, and, most recently, deep neural models. In this paper, we present a sentence encoding model that exploits sentence-to-sentence relation information for RTE. In terms of sentence modeling, convolutional neural networks (CNNs) and recurrent neural networks (RNNs) adopt different approaches: RNNs are well suited to sequence modeling, whilst CNNs are suited to the extraction of n-gram features through their filters and can learn ranges of relations via the pooling mechanism. We combine these strengths of RNNs and CNNs in a unified model for the RTE task. Our model combines relation vectors computed from the phrasal representation of each sentence with the final encoded sentence representations. First, each sentence is passed through a convolutional layer to extract a sequence of higher-level phrase representations, from which the first relation vector is computed. Second, the phrasal representation of each sentence from the convolutional layer is fed into a Bidirectional Long Short-Term Memory (Bi-LSTM) network to obtain the final sentence representations, from which a second relation vector is computed. The relation vectors are combined and then used, in the same fashion as an attention mechanism, over the Bi-LSTM outputs to yield the final sentence representations for classification. Experiments on the Stanford Natural Language Inference (SNLI) corpus suggest that this is a promising technique for RTE.
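
A minimal PyTorch sketch of this kind of architecture is given below, under assumed dimensions; the exact layer sizes, pooling and relation operators of the paper are not specified here, so element-wise products of mean-pooled representations stand in for the relation vectors.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class SentenceRelationRTE(nn.Module):
    def __init__(self, vocab=10000, emb=100, conv=64, hidden=64, classes=3):
        super().__init__()
        self.embed = nn.Embedding(vocab, emb)
        self.conv = nn.Conv1d(emb, conv, kernel_size=3, padding=1)  # phrase (n-gram) features
        self.lstm = nn.LSTM(conv, hidden, batch_first=True, bidirectional=True)
        self.att = nn.Linear(conv + 2 * hidden, 2 * hidden)         # relation vector -> query
        self.out = nn.Linear(4 * hidden, classes)

    def encode(self, tokens):
        x = self.embed(tokens).transpose(1, 2)          # (B, emb, T)
        phrases = F.relu(self.conv(x)).transpose(1, 2)  # (B, T, conv)
        states, _ = self.lstm(phrases)                  # (B, T, 2*hidden)
        return phrases, states

    def attend(self, states, query):
        # Relation-derived query used like attention over the Bi-LSTM outputs.
        scores = torch.softmax((states * query.unsqueeze(1)).sum(-1), dim=1)
        return (scores.unsqueeze(-1) * states).sum(1)   # (B, 2*hidden)

    def forward(self, premise, hypothesis):
        p_phr, p_st = self.encode(premise)
        h_phr, h_st = self.encode(hypothesis)
        # One relation vector from phrasal features, a second from Bi-LSTM states.
        rel = torch.cat([p_phr.mean(1) * h_phr.mean(1),
                         p_st.mean(1) * h_st.mean(1)], dim=-1)
        query = self.att(rel)
        final = torch.cat([self.attend(p_st, query), self.attend(h_st, query)], -1)
        return self.out(final)

model = SentenceRelationRTE()
logits = model(torch.randint(0, 10000, (2, 12)), torch.randint(0, 10000, (2, 9)))
```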

Keywords: Deep neural models, natural language inference, recognizing textual entailment, sentence-to-sentence relation.

670 Developing an Advanced Algorithm Capable of Classifying News, Articles and Other Textual Documents Using Text Mining Techniques

Authors: R. B. Knudsen, O. T. Rasmussen, R. A. Alphinas

Abstract:

The purpose of this research is to develop an algorithm capable of classifying news articles from the automobile industry, according to the competitive actions they entail, using Text Mining (TM) methods. The data must be preprocessed appropriately, by preparing pipelines that best fit each algorithm. The pipelines are tested along with nine different classification algorithms spanning regression, support vector machines, and neural networks. Preliminary testing to identify the optimal pipelines and algorithms resulted in the selection of two algorithms with two different pipelines: Logistic Regression (LR) and an Artificial Neural Network (ANN). These algorithms are optimized further by testing several parameters of each. The best result is achieved with the ANN. The final model yields an accuracy of 0.79, a precision of 0.80, a recall of 0.78, and an F1 score of 0.76. By removing three of the classes that created noise, the final algorithm reaches an accuracy of 94%.
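
A minimal scikit-learn sketch of the two selected model types follows; it is illustrative only, since the paper's actual preprocessing steps, features and hyperparameters are not reproduced, and the two-document corpus is a stand-in.

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.neural_network import MLPClassifier
from sklearn.pipeline import Pipeline

texts = ["Firm A cuts prices on sedans", "Firm B launches a new SUV"]
labels = ["pricing", "new_product"]

# Two pipelines, one per algorithm, each with its own preprocessing settings.
lr_pipe = Pipeline([
    ("tfidf", TfidfVectorizer(ngram_range=(1, 2))),
    ("clf", LogisticRegression(max_iter=1000)),
])
ann_pipe = Pipeline([
    ("tfidf", TfidfVectorizer()),
    ("clf", MLPClassifier(hidden_layer_sizes=(64,), max_iter=500)),
])

for pipe in (lr_pipe, ann_pipe):
    pipe.fit(texts, labels)
print(lr_pipe.predict(["Firm C discounts its hatchback"]))
```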

Keywords: Artificial neural network, competitive dynamics, logistic regression, text classification, text mining.

669 Variable Rate Superorthogonal Turbo Code with the OVSF Code Tree

Authors: Insah Bhurtah, P. Clarel Catherine, K. M. Sunjiv Soyjaudah

Abstract:

When using modern Code Division Multiple Access (CDMA) in mobile communications, the system must be able to vary users' transmission rates to allocate bandwidth efficiently. In this work, Orthogonal Variable Spreading Factor (OVSF) codes are used, owing to their variable-length properties, with the same principles applied in a low-rate superorthogonal turbo code. The introduced system is the Variable Rate Superorthogonal Turbo Code (VRSTC), where puncturing is performed not on the encoder's final output but before selecting the output, to achieve higher rates. Due to bandwidth expansion, the codes outperform an ordinary turbo code in the AWGN channel. Simulation results show decreased performance compared to that obtained with Walsh-Hadamard codes. However, with OVSF codes, the VRSTC system keeps the orthogonality of codewords whilst producing variable rate codes, contrary to Walsh-Hadamard codes, where puncturing is usually performed on the final output.
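
A minimal sketch of OVSF code-tree generation by recursive doubling, the property the scheme relies on, is shown below (illustrative only; the superorthogonal turbo encoder itself is not reproduced). Each code c of length N spawns two orthogonal children of length 2N: (c, c) and (c, -c).

```python
def ovsf_layers(depth):
    """Build the OVSF tree layer by layer, starting from the root code [1]."""
    layer = [[1]]
    layers = [layer]
    for _ in range(depth):
        layer = [child for c in layer
                 for child in (c + c, c + [-x for x in c])]
        layers.append(layer)
    return layers

codes = ovsf_layers(3)[-1]  # eight codes of spreading factor 8
# Codes at the same spreading factor are mutually orthogonal.
assert all(sum(a * b for a, b in zip(u, v)) == 0
           for i, u in enumerate(codes) for v in codes[i + 1:])
```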

Keywords: CDMA, MAP Decoding, OVSF, Superorthogonal Turbo Code.

668 Potential of Exopolysaccharides in Yoghurt Production

Authors: Jana Feldmane, Pavels Semjonovs, Inga Ciprovica

Abstract:

Consumer demand for products with low fat or sugar content and low levels of food additives, as well as cost factors, makes exopolysaccharides (EPS) a viable alternative. EPS remain an interesting tool for modulating the sensory properties of yoghurt. This study was designed to evaluate the EPS production potential of commercial yoghurt starter cultures (Yo-Flex starters Harmony 1.0, TWIST 1.0 and YF-L902; Chr. Hansen, Denmark) and their influence on the apparent viscosity of yoghurt samples. The production of intracellularly synthesized EPS by the different commercial yoghurt starters varies roughly from 144.08 to 440.81 mg/l. The EPS-producing starters showed large variations in concentration and, presumably, composition. TWIST 1.0 produced greater amounts of EPS in MRS medium and in yoghurt samples, but no significant contribution to texture development or to the apparent viscosity of the final product was found. The YF-L902 and Harmony 1.0 starters differed considerably in EPS yields, but not in the apparent viscosities (p > 0.05) of the final yoghurts. No correlation between EPS concentration and the viscosity of yoghurt samples was established in the study.

Keywords: Exopolysaccharides, yoghurt starters, apparent viscosity.

667 Sustainable Energy Supply in Social Housing

Authors: Rolf Katzenbach, Frithjof Clauss, Jie Zheng

Abstract:

Final energy use can be divided mainly into four sectors: commercial, industrial, residential, and transportation. The trend in final energy consumption by sector is the most straightforward indicator of progress in reducing energy consumption and the associated environmental impacts of the different end-use sectors. The average share of end-use energy for the residential sector worldwide was nearly 20% until 2011; in Germany, the proportion is higher, between 25% and 30%. However, the residential sector remains less studied than the other three sectors, as do its impacts on climate and environment. The reasons span a wide range of fields, including the diversity of residential construction (different building designs and materials), living and energy-use behavioral patterns, climatic conditions and their variation, as well as social obstacles, market trends and governmental financial support.

This paper presents an extensive and in-depth analysis of projects researched and operated by the authors in the field of energy efficiency, primarily from the perspectives of both technical potential and energy-saving awareness, in the residential sector and especially in social housing buildings.

Keywords: Energy Efficiency, Renewable Energy, Retro-commissioning, Social Housing, Sustainability.

666 Effect of Flour Concentration and Retrogradation Treatment on Physical Properties of Instant Sinlek Brown Rice

Authors: Supat Chaiyakul, Direk Sukkasem, Patnachapa Natthapanpaisith

Abstract:

Sinlek rice flour beverage, an instant product, is a dietary supplement for people with dysphagia (difficulty swallowing). It is also consumed by individuals who need supplements to maintain their caloric intake. The product provides protein, fat, iron, and a high concentration of carbohydrate from rice flour. However, the application of native flour is limited due to its high viscosity. Starch modification by controlled starch retrogradation was used in this study. The research examines the effects of rice flour concentration and retrogradation treatment on the physical properties of instant Sinlek brown rice. Native rice flour, gelatinized rice flour, and flour gels retrograded at 4 °C for 3 and 7 days were investigated. From the statistical results, significant differences between native and retrograded flour were observed. The concentration of rice flour was the main factor influencing swelling power, solubility, and pasting properties. With the increase in rice flour content from 10 to 15%, swelling power, peak viscosity, trough, and final viscosity decreased, while solubility, pasting temperature, peak time, breakdown, and setback increased. The peak time, pasting temperature, peak viscosity, trough, and final viscosity decreased as the storage period increased from 3 to 7 days. The retrograded rice flour powders had lower pasting temperature, peak viscosity, breakdown, and final viscosity than the gelatinized and native flour powders. Reducing starch viscosity by gelatinization and controlled retrogradation could allow increased quantities of rice flour in instant rice beverages, and could increase the energy and nutrient densities of rice beverages without affecting the viscosity of the product.

Keywords: Instant rice, pasting properties, pregelatinization, retrogradation.

665 Flocculation on the Treatment of Olive Oil Mill Wastewater: Pretreatment

Authors: G. Hodaifa, J. A. Páez, C. Agabo, E. Ramos, J. C. Gutiérrez, A. Rosal

Abstract:

Currently, the continuous two-phase decanter process used for olive oil production is the most widespread internationally. The wastewaters generated by this industry (OMW) are a real environmental problem because of their high organic load. Among the treatments proposed for these wastewaters, advanced oxidation technologies (Fenton, ozone, photo-Fenton, etc.) are the most favourable, but their direct application is somewhat expensive. Therefore, the application of a previous stage based on a flocculation-sedimentation operation is of high importance. In this research, five commercial flocculants (three cationic and two anionic) were used to achieve the separation of phases (clarified liquid and sludge). For each flocculant, different concentrations (0-1000 mg/L) were studied. In these experiments, the sludge volume formed and the final water quality were determined. The final removal percentages of total phenols (11.3-25.1%), COD (5.6-20.4%), total carbon (2.3-26.5%), total organic carbon (1.50-23.8%), total nitrogen (1.45-24.8%), and turbidity (27.9-61.4%) were determined. The variation in the electric conductivity reduction percentage (1-8%) was also determined. Finally, the flocculants with the highest removal percentages were identified (QG2001 and Flocudex CS49).

Keywords: Flocculants, flocculation, olive oil mill wastewater, water quality.

664 A Two-Stage Multi-Agent System to Predict the Unsmoothed Monthly Sunspot Numbers

Authors: Mak Kaboudan

Abstract:

A multi-agent system is developed here to predict monthly details of the upcoming peak of the 24th solar magnetic cycle. While studies typically predict the timing and magnitude of cycle peaks using annual data, this one utilizes the unsmoothed monthly sunspot number instead. Monthly numbers display more pronounced fluctuations during periods of strong solar magnetic activity than annual sunspot numbers. Because strong magnetic activity may cause significant economic damage, predicting monthly variations should provide different, and perhaps helpful, information for decision-making purposes. The multi-agent system developed here operates in two stages. In the first, it produces twelve predictions of the monthly numbers. In the second, it uses those predictions to deliver a final forecast. Acting as expert agents, genetic programming and neural networks produce the twelve fits and forecasts as well as the final forecast. According to the results obtained, the next peak is predicted to be 156 and is expected to occur in October 2011, with an average of 136 for that year.
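
A minimal sketch of the two-stage idea follows; it is illustrative only, with scikit-learn regressors standing in for the paper's genetic-programming and neural-network expert agents, and random stand-in data in place of the sunspot series.

```python
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)
X = rng.uniform(size=(120, 6))         # lagged monthly features (stand-in data)
y = rng.uniform(50, 200, size=120)     # monthly sunspot numbers (stand-in data)

# Stage 1: twelve expert agents each produce a fit/forecast of the series.
experts = [MLPRegressor(hidden_layer_sizes=(16,), max_iter=2000, random_state=i)
           for i in range(12)]
stage1 = np.column_stack([e.fit(X, y).predict(X) for e in experts])

# Stage 2: a combiner turns the twelve expert predictions into a final forecast.
combiner = LinearRegression().fit(stage1, y)
final_forecast = combiner.predict(stage1)
```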

Keywords: Computational techniques, discrete wavelet transformations, solar cycle prediction, sunspot numbers.

663 Effect of Type of Pile and Its Installation Method on Pile Bearing Capacity by Physical Modeling in Frustum Confining Vessel

Authors: Seyed Abolhasan Naeini, M. Mortezaee

Abstract:

Various factors, such as the method of installation, the pile type, the pile material and the pile shape, can affect the final bearing capacity of a pile executed in soil; among them, the method of installation is of special importance. Physical modeling is among the best options for the laboratory study of pile behavior. Therefore, the current paper first presents and reviews the frustum confining vessel (FCV) as a suitable tool for the physical modeling of deep foundations. Then, by describing loading tests of two open-ended and closed-ended steel piles, each performed with two installation methods, "with displacement" and "without displacement", the effect of end conditions and installation method on the final bearing capacity of the pile is investigated. The soil used is silty sand from Firuzkuh, Iran. The results show that, in general, the without-displacement installation method yields a larger bearing capacity for both piles, and that for a given installation method, the closed-ended pile shows a slightly higher bearing capacity.

Keywords: Physical modeling, frustum confining vessel, pile, bearing capacity, installation method.

662 Capability Prediction of Machining Processes Based on Uncertainty Analysis

Authors: Hamed Afrasiab, Saeed Khodaygan

Abstract:

Prediction of machining process capability at the design stage plays a key role in achieving precise design and manufacturing of mechanical products. Inaccuracies in the machining process lead to errors in the position and orientation of machined features on the part, and strongly affect process capability and the final quality of the product. In this paper, an efficient systematic approach is given for investigating machining errors, predicting the manufacturing errors of parts, and predicting the capability of the corresponding machining processes. A mathematical formulation of fixture locator modeling is presented to establish the relationship between part errors and their sources. Based on this method, the final machining errors of the part can be accurately estimated by relating them to the combined dimensional and geometric tolerances of the workpiece-fixture system. The method is developed for uncertainty analysis based on both worst-case and statistical approaches. Its application is illustrated through an example, and the computational results are compared with Monte Carlo simulation results.
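
A minimal sketch of the statistical (Monte Carlo) and worst-case sides of such an analysis is given below, under an assumed linear error model; the sensitivities, tolerances and capability definition are illustrative, not the authors' formulation.

```python
import numpy as np

rng = np.random.default_rng(42)
N = 100_000
tols = np.array([0.02, 0.03, 0.01])  # locator tolerances in mm (assumed)
sens = np.array([0.8, -0.5, 1.2])    # sensitivities mapping locator errors to feature error

# Statistical approach: sample locator errors (std = tol/3) and propagate linearly.
errors = rng.normal(0.0, tols / 3.0, size=(N, 3)) @ sens
sigma = errors.std()

# Worst-case approach: all contributions at their tolerance limits simultaneously.
worst = np.abs(sens) @ tols

tol = 0.05                 # design tolerance on the machined feature (assumed)
cp = 2 * tol / (6 * sigma)  # capability index for symmetric +/- tol limits
print(f"worst case {worst:.4f} mm, Monte Carlo sigma {sigma:.4f} mm, Cp {cp:.2f}")
```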

Keywords: Process capability, machining error, dimensional and geometrical tolerances, uncertainty analysis.

661 Main Control Factors of Fluid Loss in Drilling and Completion in Shunbei Oilfield by Unmanned Intervention Algorithm

Authors: Peng Zhang, Lihui Zheng, Xiangchun Wang, Xiaopan Kou

Abstract:

Quantitative research on the main control factors of lost circulation has so far involved few factors and single data sources. Using an Unmanned Intervention Algorithm to find the main control factors of lost circulation allows all measurable parameters to be adopted. The degree of lost circulation is characterized by the loss rate as the objective function. Geological, engineering and fluid data are used as layers, and 27 factors, such as wellhead coordinates and Weight on Bit (WOB), are used as dimensions. Data classification is implemented to determine the independent variables of the function. The mathematical equation relating the loss rate to the 27 influencing factors is established by multiple regression, and the undetermined coefficients of the equation are solved by the undetermined coefficient method. Only three factors have t-test values greater than the test value of 40, and the F-test value is 96.557%, indicating that the correlation of the model is good. Funnel viscosity, final shear force and drilling time were selected as the main control factors by the elimination method, the contribution rate method and the functional method. The calculated values for the two wells used for verification differ from the actual values by -3.036 m3/h and -2.374 m3/h, with errors of 7.21% and 6.35%. The influence of engineering factors on the loss rate is greater than that of funnel viscosity and final shear force, and the influence of these three factors is smaller than that of geological factors. The best combination of funnel viscosity, final shear force and drilling time is obtained by quantitative calculation. The minimum loss rate of lost circulation wells in the Shunbei area is 10 m3/h. It can be seen that man-made main control factors can only slow the leakage, not fundamentally eliminate it. This is in line with the characteristics of karst caves and fractures in the Shunbei fault-solution oil and gas reservoir.
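
A minimal sketch of the regression-and-screening step follows (statsmodels assumed; three synthetic predictors stand in for the paper's 27 candidate factors, and all coefficients and data are invented for illustration).

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(1)
df = pd.DataFrame({
    "funnel_viscosity": rng.uniform(40, 80, 200),
    "final_shear": rng.uniform(5, 25, 200),
    "drilling_time": rng.uniform(1, 30, 200),
})
# Synthetic loss rate as objective function (coefficients invented).
df["loss_rate"] = (0.3 * df.funnel_viscosity - 0.5 * df.final_shear
                   + 0.8 * df.drilling_time + rng.normal(0, 2, 200))

X = sm.add_constant(df[["funnel_viscosity", "final_shear", "drilling_time"]])
model = sm.OLS(df["loss_rate"], X).fit()
print(model.tvalues)  # screen candidate factors by their t-statistics
print(model.fvalue)   # overall model significance
```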

Keywords: Drilling fluid, loss rate, main controlling factors, Unmanned Intervention Algorithm.

660 Investigation of the Capability of RELAP5 to Solve Complex Fuel Geometry

Authors: D. Abdelrazek, M. NaguibAly, A. A. Badawi, Asmaa G. Abo Elnour, A. A. El-Kafas

Abstract:

This work is developed within IAEA Coordinated Research Program 1496, “Innovative methods in research reactor analysis: Benchmark against experimental data on neutronics and thermal-hydraulic computational methods and tools for operation and safety analysis of research reactors”.

The study investigates the capability of the RELAP5/Mod3.4 code to handle complex fuel geometry. Its results are compared to those of PARET, a code commonly used in the thermal-hydraulic analysis of research reactors and belonging to the MTR-PC group of codes.

The WWR-SM reactor at the Institute of Nuclear Physics (INP) in the Republic of Uzbekistan is simulated using both PARET and RELAP5 at steady state. Results from the two codes are compared.

The RELAP5 code succeeded in solving the complex fuel geometry, while the PARET code needed additional calculations to obtain the final result. Although the final results from PARET are more accurate, the small differences between the two sets of results make the RELAP5 code a recommended choice for complex fuel assemblies.

Keywords: Complex fuel geometry, PARET, RELAP5, WWR-SM reactor.

659 Roles and Responsibilities for the Success of an IT Project in an Organization

Authors: Vahhab Attar Olyaee, Fouad Attar Olyaee

Abstract:

Many IT projects fail because of a purely technical approach, a focus on the final product, and a lack of proper attention to strategic alignment. Project management models quite often take a technical management view [4], [8], [13], [14]. These models focus greatly on the finalization of the project product and its delivery to the customer. However, many project problems are due to a lack of attention to the needs and capabilities of the organization, or to disregarding how the product will be deployed and used in the organization. In the current research, we therefore present a solution aimed at raising the value of the project in an organization, so that the project outputs are properly deployed. A comprehensive model is presented that takes into account all processes, from the initial step of project definition to the deployment of the final outputs in the organization, together with the definition of all roles and responsibilities needed to put the model into practice. Taking into account the opinions of experts and project managers, and to prove the performance of the model, project problems were identified and then categorized and analyzed on the basis of the model. The analysis makes clear that ignoring the proper definition of the project, lacking a proper understanding of the expected value, and failing to supervise the emerging value during production and installation are among the most important factors that bring a project to failure.

Keywords: IT Governance, Project Model, Roles and Responsibilities of Projects.

658 Urban Flood Control and Management - An Integrated Approach

Authors: Ranjan Sarukkalige, Joseph Sanjaya Ma

Abstract:

Flood management is one of the important fields in urban storm water management. Floods are driven by large storm events or by improper planning of the area. This study treats flood protection in four stages: planning, flood event, response and evaluation. Flood protection is most effective when considered in the planning/design and evaluation stages, since both stages shape the land development of the area. Structural adjustments are often more reliable than non-structural adjustments in providing flood protection; however, structural adjustments are constrained by numerous factors, such as political constraints and cost. It is therefore important to balance both kinds of adjustment against the situation. The technical decisions proposed have to be approved by the decision-makers who have the power to decide on the final solution, and cost is the biggest factor in determining the final decision. This study therefore recommends that flood protection be integrated and enforced more in the early stages (planning and design) as part of the storm water management plan. Factors compromising the proposed technical decisions should be reduced as far as possible to avoid a reduction in the expected performance of the proposed adjustments.

Keywords: Urban flood, flood protection, water management, storm water, cost.

657 Opinion Mining Framework in the Education Domain

Authors: A. M. H. Elyasir, K. S. M. Anbananthen

Abstract:

The internet is growing larger and becoming the most popular platform for people to share their opinions on different interests. We choose the education domain, specifically comparing some Malaysian universities against each other. This comparison produces a benchmark based on different criteria shared by online users in various online resources, including Twitter, Facebook and web pages. The comparison is accomplished using an opinion mining framework that extracts and processes the unstructured text and classifies the result as positive, negative or neutral (polarity). The framework has three main stages: opinion collection (extraction), unstructured text processing, and polarity classification. The extraction stage includes web crawling, HTML parsing, sentence segmentation for punctuation classification, and Part of Speech (POS) tagging. The second stage processes the unstructured text with stemming and stop-word removal, and finally prepares the raw text for classification using Named Entity Recognition (NER). The last phase classifies the polarity and presents the overall result of the comparison among the Malaysian universities. The final result is useful for those who are interested in studying in Malaysia; our final output declares clear winners based on public opinion across the web.
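
A minimal NLTK sketch of the text-processing stage follows; it is illustrative only, and the toy lexicon scoring at the end merely stands in for the framework's polarity classifier.

```python
import nltk
from nltk.corpus import stopwords
from nltk.stem import PorterStemmer

nltk.download("punkt")
nltk.download("stopwords")
nltk.download("averaged_perceptron_tagger")

text = "The lecturers are excellent but the campus parking is terrible."
tokens = nltk.word_tokenize(text)  # sentence/word segmentation
tagged = nltk.pos_tag(tokens)      # POS tagging
stop = set(stopwords.words("english"))
stemmer = PorterStemmer()
cleaned = [stemmer.stem(w.lower()) for w, _ in tagged
           if w.isalpha() and w.lower() not in stop]
print(cleaned)                     # stemmed, stop-word-free tokens

# Toy polarity step: count hits against small positive/negative lexicons.
pos, neg = {"excellent", "good", "great"}, {"terrible", "bad", "poor"}
words = [w.lower() for w in tokens]
score = sum(w in pos for w in words) - sum(w in neg for w in words)
print("positive" if score > 0 else "negative" if score < 0 else "neutral")
```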

Keywords: Entity Recognition, Education Domain, Opinion Mining, Unstructured Text.

656 Highly Scalable, Reversible and Embedded Image Compression System

Authors: Federico Pérez González, Iñaki Goiricelaia Ordorika, Pedro Iriondo Bengoa

Abstract:

A new method for low-complexity image coding is presented that permits different settings and great scalability in the generation of the final bit stream. The coding presents a continuous-tone still image compression system that combines lossy and lossless compression, making use of finite-arithmetic reversible transforms. Both the color-space transformation and the wavelet transformation are reversible. The transformed coefficients are coded by means of a coding system based on a subdivision into smaller components (CFDS), similar to bit-importance codification. The subcomponents so obtained are reordered by means of a highly configurable alignment system, depending on the application, which makes it possible to reconfigure the elements of the image and obtain different levels of importance from which the bit stream will be generated. The subcomponents of each level of importance are coded using a variable-length entropy coding system (VBLm) that permits the generation of an embedded bit stream. This bit stream is itself a bit stream coding a compressed still image. However, the use of a packing system on the bit stream after the VBLm allows a final, highly scalable bit stream to be built from a basic image level and one or several enhancement levels.
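
A minimal sketch of the kind of finite-arithmetic reversible transform such a system builds on is shown below: one level of the integer LeGall 5/3 lifting wavelet. It is illustrative only; the CFDS subdivision and VBLm entropy coder are not reproduced, and circular boundary handling is assumed for brevity.

```python
import numpy as np

def lift_53_forward(x):
    """One level of the reversible 5/3 lifting transform on an even-length 1-D signal."""
    x = np.asarray(x, dtype=np.int64)
    even, odd = x[0::2].copy(), x[1::2].copy()
    odd -= (even + np.roll(even, -1)) // 2     # predict step (integer arithmetic)
    even += (odd + np.roll(odd, 1) + 2) // 4   # update step (integer arithmetic)
    return even, odd                           # low-pass, high-pass coefficients

def lift_53_inverse(even, odd):
    even = even - (odd + np.roll(odd, 1) + 2) // 4
    odd = odd + (even + np.roll(even, -1)) // 2
    x = np.empty(even.size + odd.size, dtype=np.int64)
    x[0::2], x[1::2] = even, odd
    return x

sig = np.array([10, 12, 9, 7, 14, 15, 8, 6])
lo, hi = lift_53_forward(sig)
assert np.array_equal(lift_53_inverse(lo, hi), sig)  # perfect reconstruction
```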

Keywords: Image compression, wavelet transform, highly scalable, reversible transform, embedded, subcomponents.

655 Reversible, Embedded and Highly Scalable Image Compression System

Authors: Federico Pérez González, Iñaki Goirizelaia Ordorika, Pedro Iriondo Bengoa

Abstract:

In this work, a new method for low-complexity image coding is presented that permits different settings and great scalability in the generation of the final bit stream. The coding presents a continuous-tone still image compression system that combines lossy and lossless compression, making use of finite-arithmetic reversible transforms. Both the color-space transformation and the wavelet transformation are reversible. The transformed coefficients are coded by means of a coding system based on a subdivision into smaller components (CFDS), similar to bit-importance codification. The subcomponents so obtained are reordered by means of a highly configurable alignment system, depending on the application, which makes it possible to reconfigure the elements of the image and obtain different importance levels from which the bit stream will be generated. The subcomponents of each importance level are coded using a variable-length entropy coding system (VBLm) that permits the generation of an embedded bit stream. This bit stream is itself a bit stream coding a compressed still image. However, the use of a packing system on the bit stream after the VBLm allows a final, highly scalable bit stream to be built from a basic image level and one or several improvement levels.

Keywords: Image compression, wavelet transform, highly scalable, reversible transform, embedded, subcomponents.

654 Hybrid Authentication System Using QR Code with OTP

Authors: Salim Istyaq

Abstract:

As we know, the number of Internet users is increasing drastically. People now use different online services provided by banks, colleges/schools, hospitals, online utility and bill payment services, and online shopping sites. To access online services, text-based authentication systems are in use. Text-based authentication faces drawbacks in usability and security that bring trouble to users. The core element of computational trust is identity. The aim of this paper is to make the system harder for imposters and more reliable for legitimate users by using a graphical authentication approach. We encode the options in graphical QR format, and an acknowledgment is sent to the user's mobile for final verification. The methodology depends on the encryption option and on final verification by confirming a set of passphrases for legitimate users; the system yields its result only once the whole process completes successfully. All processes are chained serially: the output of the first process is the input of the second, and so on. The system is a combination of recognition-based and pure recall-based techniques. The presented scheme is useful for devices like PDAs, iPods and phones, which are more handy and convenient to use than traditional desktop computer systems.
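
A minimal sketch of the QR-plus-OTP pairing follows; the pyotp and qrcode packages are assumed to be available, and the secret, account name and file name are illustrative, not the paper's implementation.

```python
import pyotp
import qrcode

secret = pyotp.random_base32()  # per-user shared secret generated at enrollment
totp = pyotp.TOTP(secret)

# Encode the provisioning URI as a QR code shown to the user at enrollment.
uri = totp.provisioning_uri(name="user@example.com", issuer_name="DemoService")
qrcode.make(uri).save("enroll.png")

# Later, at login, the server verifies the one-time password the user submits.
submitted = totp.now()          # stands in for the code typed by the user
print("access granted" if totp.verify(submitted) else "access denied")
```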

Keywords: Graphical Password, OTP, QR Codes, Recognition based graphical user authentication, usability and security.

653 Modeling of Processes Running in Radical Clusters Formed by Ionizing Radiation with the Help of Continuous Petri Nets and Oxygen Effect

Authors: J. Barilla, M. Lokajíček, H. Pisaková, P. Simr

Abstract:

The final biological effect of ionizing particles may be strongly influenced by some chemical substances present in cells, mainly in the case of low-LET radiation. The influence of oxygen may be particularly important because oxygen is always present in living cells. The corresponding processes run mainly in the chemical stage of the radiobiological mechanism.

The radical clusters formed by the densely ionizing ends of primary or secondary charged particles are mainly responsible for the final biological effect. The damage then depends on the radical concentration at the time the cluster meets a DNA molecule. It may be strongly influenced by oxygen present in the cell, as oxygen may act in different directions: at low concentrations, the interaction with hydrogen radicals prevails, while at higher concentrations, additional efficient oxygen radicals may be formed.

The basic radical concentration in individual clusters diminishes under two parallel processes: chemical reactions and diffusion of the corresponding clusters. This simultaneous evolution may be modeled and analyzed well with the help of continuous Petri nets. The influence of other substances present in cells during irradiation may be studied, too. Some results concerning the impact of oxygen content will be presented.
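
A minimal sketch of the competing reaction/diffusion evolution as a system of ordinary differential equations is given below; all rate constants are invented for illustration, and a continuous Petri net of this model would carry the same dynamics on its places and transitions.

```python
import numpy as np
from scipy.integrate import solve_ivp

k_rec, k_ox, D = 1.0e-2, 5.0e-3, 2.0e-3  # recombination, O2 reaction, diffusion (assumed)

def rates(t, y):
    radicals, oxygen = y
    sigma2 = 1.0 + 4.0 * D * t           # cluster broadening dilutes the concentration
    c = radicals / sigma2
    return [-k_rec * c * radicals - k_ox * oxygen * radicals,  # radical decay
            -k_ox * oxygen * radicals]                         # oxygen consumption

sol = solve_ivp(rates, (0.0, 100.0), [100.0, 10.0], dense_output=True)
print(sol.y[:, -1])  # remaining radical and oxygen amounts at the end time
```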

Keywords: DSB formation, chemical stage, Petri nets, radiobiological mechanism.

652 Can Exams Be Shortened? Using a New Empirical Approach to Test in Finance Courses

Authors: Eric S. Lee, Connie Bygrave, Jordan Mahar, Naina Garg, Suzanne Cottreau

Abstract:

Marking exams is universally detested by lecturers. Final exams in many higher education courses often last 3.0 hrs. Do exams really need to be so long? Can we justifiably reduce the number of questions on them? Surprisingly few have researched these questions, arguably because of the complexity and difficulty of using traditional methods. To answer these questions empirically, we used a new approach based on three key elements: Use of an unusual variation of a true experimental design, equivalence hypothesis testing, and an expanded set of six psychometric criteria to be met by any shortened exam if it is to replace a current 3.0-hr exam (reliability, validity, justifiability, number of exam questions, correspondence, and equivalence). We compared student performance on each official 3.0-hr exam with that on five shortened exams having proportionately fewer questions (2.5, 2.0, 1.5, 1.0, and 0.5 hours) in a series of four experiments conducted in two classes in each of two finance courses (224 students in total). We found strong evidence that, in these courses, shortening of final exams to 2.0 hrs was warranted on all six psychometric criteria. Shortening these exams by one hour should result in a substantial one-third reduction in lecturer time and effort spent marking, lower student stress, and more time for students to prepare for other exams. Our approach provides a relatively simple, easy-to-use methodology that lecturers can use to examine the effect of shortening their own exams.
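
A minimal sketch of the equivalence-testing element, two one-sided tests (TOST) on exam scores, follows; the equivalence margin and the normally distributed stand-in data are assumptions, not the study's data.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(7)
full = rng.normal(72, 10, 112)   # scores on the 3.0-hr exam (stand-in data)
short = rng.normal(71, 10, 112)  # scores on the shortened exam (stand-in data)
margin = 5.0                     # equivalence margin in percentage points (assumed)

diff = short.mean() - full.mean()
se = np.sqrt(short.var(ddof=1) / short.size + full.var(ddof=1) / full.size)
dof = short.size + full.size - 2

# Two one-sided tests: reject both nulls to conclude the means are equivalent.
p_lower = 1 - stats.t.cdf((diff + margin) / se, dof)  # H0: diff <= -margin
p_upper = stats.t.cdf((diff - margin) / se, dof)      # H0: diff >= +margin
print(f"TOST p = {max(p_lower, p_upper):.4f}")        # < 0.05 => equivalent
```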

Keywords: Exam length, psychometric criteria, synthetic experimental designs, test length.

651 A Two-Stage Expert System for Diagnosis of Leukemia Based on Type-2 Fuzzy Logic

Authors: Ali Akbar Sadat Asl

Abstract:

Diagnosis and decision-making about diseases in the medical field face inherent uncertainty, which can affect the whole process of treatment. Decisions are made on the basis of expert knowledge and the way an expert interprets the patient's condition, and different experts may interpret the same condition differently. Fuzzy logic can provide mathematical modeling for many concepts, variables, and systems that are unclear and ambiguous, as well as a framework for reasoning, inference, control, and decision-making under uncertainty. In systems with high uncertainty and high complexity, fuzzy logic is a suitable modeling method. In this paper, we use type-2 fuzzy logic to model the uncertainty in the diagnosis of leukemia. The proposed system uses an indirect-direct approach and consists of two stages. In the first stage, the state of the blood test is inferred; here, we use an indirect approach in which the rules are extracted automatically by a clustering approach. In the second stage, the signs of leukemia, the duration of the disease until its progression, and the output of the first stage are combined to obtain the final diagnosis of the system; here, the system uses a direct approach and the final diagnosis is determined by the expert. The obtained results show that the type-2 fuzzy expert system can diagnose leukemia with an average accuracy of about 97%.
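
A minimal sketch of the interval type-2 ingredient is given below: a Gaussian membership function with an uncertain mean, whose lower and upper memberships bound the footprint of uncertainty. The parameters and the white-blood-cell example are illustrative, not the paper's rule base.

```python
import numpy as np

def it2_gaussian(x, m1, m2, sigma):
    """Lower/upper membership of x for a Gaussian MF with mean uncertain in [m1, m2]."""
    g = lambda m: np.exp(-0.5 * ((x - m) / sigma) ** 2)
    if m1 <= x <= m2:
        upper = 1.0                      # inside the mean interval: full membership
    else:
        upper = g(m1) if x < m1 else g(m2)
    lower = min(g(m1), g(m2))            # the more distant mean bounds from below
    return lower, upper

# e.g., membership of a white-blood-cell count in a "high" fuzzy set (stand-in numbers)
lo, up = it2_gaussian(x=14.0, m1=15.0, m2=18.0, sigma=2.0)
print(f"membership interval [{lo:.3f}, {up:.3f}]")
```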

Keywords: Expert system, leukemia, medical diagnosis, type-2 fuzzy logic.

650 Achieving Design-Stage Elemental Cost Planning Accuracy: Case Study of New Zealand

Authors: Johnson Adafin, James O. B. Rotimi, Suzanne Wilkinson, Abimbola O. Windapo

Abstract:

An aspect of client expenditure management that requires attention is the level of accuracy achievable in design-stage elemental cost planning. This has been a major concern for construction clients and practitioners in New Zealand (NZ). Pre-tender estimating inaccuracies are significantly influenced by the level of risk information available to estimators. Proper cost planning activities should ensure the production of a project's likely construction costs (initial and final), and subsequent cost control activities should prevent the unpleasant consequences of cost overruns, disputes and project abandonment. If risks were properly identified and priced at the design stage, the observed variance between design-stage elemental cost plans (ECPs) and final tender sums (FTS) (initial contract sums) could be reduced. This study investigates the variations between design-stage ECPs and FTS of construction projects, with a view to identifying the risk factors responsible for the observed variance. Data were sourced through interviews, and risk factors were identified using thematic analysis. Access was obtained to project files from the records of the study participants (consultant quantity surveyors), and document analysis was employed to complement the interview responses. The findings revealed discrepancies between ECPs and FTS in the region of -14% to +16%. It is argued that the identified risk factors were responsible for the observed variability. The values obtained from the analysis would enable greater accuracy in the forecast of FTS by quantity surveyors. Further, whilst inherent risks in construction project development are observed globally, these findings have important ramifications for construction projects by expanding existing knowledge on what is needed for reasonable budgetary performance and the successful delivery of construction projects. The findings contribute significantly by providing quantitative confirmation of the theoretical conclusions generated in the literature from around the world, thereby adding to and consolidating existing knowledge.

Keywords: Accuracy, design-stage, elemental cost plan, final tender sum, New Zealand.

649 Low Energy Technology for Leachate Valorisation

Authors: Jesús M. Martín, Francisco Corona, Dolores Hidalgo

Abstract:

Landfills present long-term threats to soil, air, groundwater and surface water due to the formation of greenhouse gases (methane and carbon dioxide) and leachate from decomposing garbage. The composition of leachate differs from site to site and also within a landfill, and it alters with time (from weeks to years), since the landfilled waste is biologically highly active and its composition varies. The composition of the leachate depends mainly on factors such as the characteristics of the waste, the moisture content, climatic conditions, the degree of compaction and the age of the landfill. Therefore, leachate composition cannot be generalized, and traditional treatment models should be adapted to each case. Although leachate composition is highly variable, what different leachates have in common are hazardous constituents and their potential eco-toxicological effects on human health and on terrestrial ecosystems. Since leachates have distinct compositions, each landfill or dumping site represents a different type of risk to its environment. Nevertheless, leachates always exhibit high organic concentration, conductivity, heavy metals and ammonia nitrogen. Leachate could affect the current and future quality of water bodies due to uncontrolled infiltration. Therefore, the control and treatment of leachate is one of the biggest issues in the design and management of urban solid waste treatment plants and landfills. This work presents a treatment model carried out "in situ" using a cost-effective novel technology that combines solar evaporation/condensation with forward osmosis. The plant is powered by renewable energies (solar energy, biomass and residual heat), which minimizes the carbon footprint of the process. The final effluent quality is very high, allowing reuse (preferred) or discharge into watercourses. In the particular case of this work, the final effluents will be reused for cleaning and gardening purposes. A minor semi-solid residual stream is also generated in the process. Due to its special composition (rich in metals and inorganic elements), this stream will be valorized in ceramic industries to improve the characteristics of their final products.

Keywords: Forward osmosis, landfills, leachate valorization, solar evaporation.
