Search results for: Statistical tool.
1380 SBTAR: An Enhancing Method for Automated Test Tools
Authors: Noppakit Nawalikit, Pattarasinee Bhattarakosol
Abstract:
Because software testing has become an important part of software development for improving software quality, many automation tools have been created to help test software functionality. These tools, however, raise usability issues: the result log they generate often contains information the tester cannot use to communicate efficiently, or it can only be opened with a specific application. This paper introduces SBTAR, a new method that improves the usability of automated test tools with respect to the result log. The approach uses the capabilities of IBM Rational Robot to create a customized function that generates a new result log format containing useful information that is faster and easier to understand than the original log produced by the tool. The new log is also more flexible, since it can be read with Microsoft Word or WordPad.
Keywords: Software automation testing, automated test tool, IBM Rational Robot.
1379 Automated Detection of Alzheimer Disease Using Region Growing Technique and Artificial Neural Network
Authors: B. Al-Naami, N. Gharaibeh, A. AlRazzaq Kheshman
Abstract:
Alzheimer disease (AD) is the loss of mental functions such as thinking, memory, and reasoning that is severe enough to interfere with a person's daily functioning. AD symptoms depend on which part of the brain is affected by infection or damage. MRI is the most suitable biomedical imaging modality for detecting the presence of AD. This paper therefore proposes a fusion method to distinguish normal MRIs from AD MRIs. In this combined method, around 27 MRIs collected from Jordanian hospitals are analyzed using low-pass morphological filters; statistical features are extracted from the intensity histogram and summarized with descriptive box plots. The artificial neural network (ANN) is also applied to test the performance of this approach. Finally, the result of a t-test at the 95% confidence level is compared with the classification accuracy of the ANN (100%). The robustness of the developed method makes it effective for diagnosing and determining the type of AD image.
Keywords: Alzheimer disease, brain MRI analysis, morphological filter, box plot, intensity histogram, ANN.
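To make the statistical step concrete, here is a minimal sketch of histogram-based descriptive statistics and a two-sample t-test on per-image intensity features; the feature values, group sizes and names are illustrative placeholders, not the study's data.

```python
# Minimal sketch of the statistical comparison step described above: histogram-based
# features from (already filtered) MRI intensity arrays, compared with a two-sample
# t-test. Array contents here are synthetic placeholders, not the study's data.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
normal_features = rng.normal(loc=100, scale=10, size=14)   # e.g. mean intensity per normal MRI
ad_features = rng.normal(loc=115, scale=12, size=13)       # e.g. mean intensity per AD MRI

# Descriptive statistics that would feed a box plot
for name, x in [("normal", normal_features), ("AD", ad_features)]:
    q1, med, q3 = np.percentile(x, [25, 50, 75])
    print(f"{name}: Q1={q1:.1f}, median={med:.1f}, Q3={q3:.1f}")

# Two-sample t-test at the 95% confidence level
t_stat, p_value = stats.ttest_ind(normal_features, ad_features, equal_var=False)
print(f"t = {t_stat:.2f}, p = {p_value:.4f}, significant at 5%: {p_value < 0.05}")
```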
1378 Improvement of Photoluminescence Uniformity of Porous Silicon by Using Stirring Anodization Process
Authors: Jia-Chuan Lin, Meng-Kai Hsu, Hsi-Ting Hou, Sin-Hong Liu
Abstract:
An electrolyte stirring method for the anodization etching process used to manufacture porous silicon (PS) is reported in this work. Two experimental setups, natural air stirring (PS-ASM) and electrolyte stirring (PS-ESM), are employed to clarify the influence of stirring mechanisms on the electrochemical etching process. Compared to traditional fabrication without any stirring apparatus (PS-TM), a large plateau region of the PS surface structure is obtained from samples prepared with both stirring methods, as measured with a 3D profiler. Moreover, the light emission response is also improved by both proposed stirring methods, because the circulating flow in the electrolyte effectively enhances the carrier distribution during electrochemical etching. According to a statistical analysis of photoluminescence (PL) intensity, lower standard deviations are obtained for PS samples fabricated with the studied stirring methods, i.e. the uniformity of PL intensity is effectively improved. The calculated standard deviations of PL intensity are 93.2, 74.5 and 64 for PS-TM, PS-ASM and PS-ESM, respectively.
Keywords: Porous silicon, photoluminescence, uniformity, carrier stirring method.
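A minimal sketch of the uniformity metric quoted above, i.e. the standard deviation of PL intensity over a scanned area; the intensity maps are simulated so that their spreads roughly match the reported values, they are not measured data.

```python
# Sketch of the uniformity metric used above: the standard deviation of the
# photoluminescence intensity measured over a sample surface. The intensity
# maps are synthetic placeholders for the measured PL scans.
import numpy as np

rng = np.random.default_rng(1)
pl_tm  = rng.normal(1000, 93.2, size=(64, 64))    # PS-TM: no stirring
pl_asm = rng.normal(1000, 74.5, size=(64, 64))    # PS-ASM: air stirring
pl_esm = rng.normal(1000, 64.0, size=(64, 64))    # PS-ESM: electrolyte stirring

for label, pl_map in [("PS-TM", pl_tm), ("PS-ASM", pl_asm), ("PS-ESM", pl_esm)]:
    # Lower standard deviation means more uniform light emission across the surface
    print(f"{label}: std of PL intensity = {pl_map.std(ddof=1):.1f}")
```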
1377 Cryptographic Attack on Lucas Based Cryptosystems Using Chinese Remainder Theorem
Authors: Tze Jin Wong, Lee Feng Koo, Pang Hung Yiu
Abstract:
Lenstra’s attack uses the Chinese remainder theorem as a tool and requires a faulty signature to be successful. This paper reports on the security response of the fourth and sixth order Lucas based (LUC4,6) cryptosystem under Lenstra’s attack, compared to two other Lucas based cryptosystems, LUC and LUC3. All the Lucas based cryptosystems were exposed mathematically to Lenstra’s attack using the Chinese remainder theorem and Dickson polynomials. The results show that the possibility of a successful Lenstra’s attack is lower against the LUC4,6 cryptosystem than against the LUC3 and LUC cryptosystems. The study concludes that the LUC4,6 cryptosystem is more secure than the LUC and LUC3 cryptosystems in withstanding Lenstra’s attack.
Keywords: Lucas sequence, Dickson polynomial, faulty signature, corresponding signature, congruence.
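For readers unfamiliar with the attack's main tool, the following is a generic sketch of Chinese Remainder Theorem recombination of the kind used in Lenstra-style fault attacks; the moduli and residues are toy numbers, not the paper's LUC parameters.

```python
# Generic Chinese Remainder Theorem helper of the kind used in Lenstra-style
# fault attacks: recombine residues modulo coprime factors into a single value.
# The moduli/residues below are toy numbers, not cryptographic parameters.
def crt(residues, moduli):
    """Solve x = r_i (mod m_i) for pairwise coprime moduli, iteratively."""
    x, m = 0, 1
    for r_i, m_i in zip(residues, moduli):
        # Inverse of the accumulated modulus m modulo m_i (Python 3.8+ pow)
        inv = pow(m, -1, m_i)
        x = x + m * ((r_i - x) * inv % m_i)
        m *= m_i
    return x % m

# Example: a value reconstructed from its residues mod p and mod q
p, q = 61, 53
secret = 1234
print(crt([secret % p, secret % q], [p, q]))  # -> 1234
```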
1376 Oncogene Identification Using Filter-Based Approaches between Various Cancer Types in Lung
Authors: Michael Netzer, Michael Seger, Mahesh Visvanathan, Bernhard Pfeifer, Gerald H. Lushington, Christian Baumgartner
Abstract:
Lung cancer accounts for the most cancer-related deaths in both men and women. The identification of cancer-associated genes and their related pathways is essential and offers an important possibility for the prevention of many types of cancer. In this work, two filter approaches, namely information gain and the biomarker identifier (BMI), are used for the identification of different types of small-cell and non-small-cell lung cancer. A new method to determine the BMI thresholds is proposed to prioritize genes (i.e., primary, secondary and tertiary) using a k-means clustering approach. Sets of key genes were identified that can be found in several pathways. It turned out that the modified BMI is well suited for microarray data, and BMI is therefore proposed as a powerful tool for the search for new and so far undiscovered genes related to cancer.
Keywords: Lung cancer, microarrays, data mining, feature selection.
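A minimal sketch of the information gain filter criterion used above, computed for a single discretized gene against class labels; the tiny label and expression vectors are illustrative only.

```python
# Sketch of the information gain filter criterion for one (discretized) gene
# against class labels. The tiny label/expression vectors are illustrative only.
import numpy as np

def entropy(labels):
    _, counts = np.unique(labels, return_counts=True)
    p = counts / counts.sum()
    return -np.sum(p * np.log2(p))

def information_gain(feature, labels):
    """IG(class; feature) = H(class) - H(class | feature)."""
    h_class = entropy(labels)
    h_cond = 0.0
    for value in np.unique(feature):
        mask = feature == value
        h_cond += mask.mean() * entropy(labels[mask])
    return h_class - h_cond

labels = np.array(["SCLC", "SCLC", "NSCLC", "NSCLC", "NSCLC", "SCLC"])
gene_a = np.array([1, 1, 0, 0, 0, 1])   # perfectly separates the two classes
gene_b = np.array([1, 0, 1, 0, 1, 0])   # much weaker split
print(information_gain(gene_a, labels), information_gain(gene_b, labels))
```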
1375 An Efficient Adaptive Thresholding Technique for Wavelet Based Image Denoising
Authors: D. Gnanadurai, V. Sadasivam
Abstract:
This framework describes a computationally more efficient and adaptive threshold estimation method for image denoising in the wavelet domain, based on Generalized Gaussian Distribution (GGD) modeling of subband coefficients. In the proposed method, the threshold is estimated by analysing statistical parameters of the wavelet subband coefficients such as the standard deviation, arithmetic mean, and geometric mean. The noisy image is first decomposed into several levels to obtain the different frequency bands. Soft thresholding is then used to remove the noisy coefficients, with the optimum threshold value fixed by the proposed method. Experimental results on several test images show that this method yields significantly superior image quality and a better Peak Signal to Noise Ratio (PSNR). To prove the efficiency of the method in image denoising, it is compared with various denoising methods such as the Wiener filter, the average filter, VisuShrink, and BayesShrink.
Keywords: Wavelet transform, Gaussian noise, image denoising, filter banks, thresholding.
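A small sketch of statistics-driven soft thresholding of one wavelet detail subband; the coefficients are simulated, and the threshold rule shown (noise variance over subband signal deviation, a BayesShrink-style choice) is a common variant that may differ from the exact estimator proposed in the paper.

```python
# Sketch of statistics-driven soft thresholding of one wavelet detail subband.
# Coefficients are simulated here; the threshold rule (noise variance over
# subband signal deviation, BayesShrink-style) is a common choice and may
# differ from the exact estimator proposed in the paper.
import numpy as np

rng = np.random.default_rng(2)
clean = rng.laplace(scale=5.0, size=(128, 128))           # stand-in detail subband
noisy = clean + rng.normal(scale=10.0, size=clean.shape)  # additive Gaussian noise

# Robust noise estimate from the subband (median absolute deviation rule)
sigma_noise = np.median(np.abs(noisy)) / 0.6745
# Subband statistics used to pick the threshold
sigma_signal = np.sqrt(max(noisy.var() - sigma_noise**2, 1e-12))
threshold = sigma_noise**2 / sigma_signal

# Soft thresholding: shrink every coefficient toward zero by the threshold
denoised = np.sign(noisy) * np.maximum(np.abs(noisy) - threshold, 0.0)
print(f"sigma_noise = {sigma_noise:.2f}, threshold = {threshold:.2f}")
```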
1374 On the Interactive Search with Web Documents
Authors: Mario Kubek, Herwig Unger
Abstract:
Due to the large amount of information in the World Wide Web (WWW, web) and the lengthy and usually linearly ordered result lists of web search engines, which do not indicate semantic relationships between their entries, the search for topically similar and related documents can become a tedious task. In particular, the process of formulating queries with proper terms representing specific information needs requires much effort from the user. This problem becomes even bigger when the user's knowledge of a subject and its technical terms is not sufficient to do so. This article presents the new interactive search application DocAnalyser, which addresses this problem by enabling users to find similar and related web documents based on automatic query formulation and state-of-the-art search word extraction. Additionally, this tool can be used to track topics across semantically connected web documents.
Keywords: DocAnalyser, interactive web search, search word extraction, query formulation, source topic detection, topic tracking.
1373 Using “Eckel” Model to Measure Income Smoothing Practices: The Case of French Companies
Authors: Feddaoui Amina
Abstract:
Income smoothing represents an attempt by a company's management to reduce variations in earnings through the manipulation of accounting principles. In this study, we aimed to measure income smoothing practices in a sample of 30 French joint stock companies during the period 2007-2009. We used the dummy variables method and the Eckel model to measure income smoothing practices, and a binomial test in the SPSS program to confirm or refute our hypothesis. The study concludes that there are no statistically significant indicators of income smoothing practices in the studied sample of French companies during the period 2007-2009; the income series of the studied sample is therefore characterized by stability and non-volatility, without any intervention of management through accounting manipulation. However, this type of accounting manipulation should be taken into account, and efforts should be made by control bodies to apply the Eckel model and generalize its use at the global level.
Keywords: Income smoothing, Eckel model, French companies.
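A minimal sketch of the Eckel index as it is commonly defined, the coefficient of variation of income changes divided by that of sales changes, with a ratio below one flagging a smoother; the income and sales series are illustrative, not data from the studied sample.

```python
# Sketch of the Eckel income smoothing index as commonly defined:
# CV(change in income) / CV(change in sales); a ratio below 1 flags a smoother.
# The income and sales series are illustrative, not data from the studied sample.
import numpy as np

def coefficient_of_variation(series):
    changes = np.diff(series)
    return np.std(changes, ddof=1) / abs(np.mean(changes))

income = np.array([10.0, 10.5, 11.0, 11.4, 12.0])    # smooth income path
sales  = np.array([100.0, 92.0, 118.0, 105.0, 130.0])

eckel = coefficient_of_variation(income) / coefficient_of_variation(sales)
print(f"Eckel index = {eckel:.2f} -> {'smoother' if eckel < 1 else 'non-smoother'}")
```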
1372 A Linearization and Decomposition Based Approach to Minimize the Non-Productive Time in Transfer Lines
Authors: Hany Osman, M. F. Baki
Abstract:
We address the balancing problem of transfer lines in this paper to find the optimal line balancing that minimizes the non-productive time. We focus on the tool change time and the face orientation change time, both of which influence the makespan. We consider machine capacity limitations and technological constraints associated with the manufacturing process of auto cylinder heads. The problem is represented by a mixed integer programming model that aims at distributing the design features to workstations and sequencing the machining processes at minimum non-productive time. The proposed model is solved by an algorithm built on linearization schemes and the Benders decomposition approach. The experiments show the efficiency of the algorithm in reaching the exact solution of small and medium problem instances in reasonable time.
Keywords: Transfer line balancing, Benders decomposition, linearization.
1371 Influence of Maximum Fatigue Load on Probabilistic Aspect of Fatigue Crack Propagation Life at Specified Grown Crack in Magnesium Alloys
Authors: Seon Soon Choi
Abstract:
The principal purpose of this paper is to find the influence of the maximum fatigue load on the probabilistic aspect of fatigue crack propagation life at a specified grown crack in magnesium alloys. Fatigue crack propagation experiments are carried out in laboratory air under different maximum fatigue loads to obtain fatigue crack propagation data for the statistical analysis. In order to analyze the probabilistic aspect of the fatigue crack propagation life, a goodness-of-fit test for the probability distribution of the fatigue crack propagation life at a specified grown crack is carried out with the Anderson-Darling test. The best-fitting probability distribution of the fatigue crack propagation life is also verified under the different maximum fatigue load conditions.
Keywords: Fatigue crack propagation life, magnesium alloys, maximum fatigue load, probability.
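A short sketch of the goodness-of-fit step using the Anderson-Darling test from SciPy; the propagation lives are simulated placeholders, not the experimental data.

```python
# Sketch of the Anderson-Darling goodness-of-fit step: check whether fatigue
# crack propagation lives at a fixed crack length plausibly follow a candidate
# distribution. The lives below are simulated placeholders, not test data.
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)
lives = rng.lognormal(mean=11.0, sigma=0.15, size=30)  # cycles to reach the specified crack

# Anderson-Darling test for (log)normality of the propagation life
result = stats.anderson(np.log(lives), dist='norm')
print(f"A^2 statistic = {result.statistic:.3f}")
for crit, sig in zip(result.critical_values, result.significance_level):
    verdict = "reject" if result.statistic > crit else "do not reject"
    print(f"  at {sig:g}% significance: critical value {crit:.3f} -> {verdict} normality of log-life")
```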
1370 A New Quantile Based Fuzzy Time Series Forecasting Model
Authors: Tahseen A. Jilani, Aqil S. Burney, C. Ardil
Abstract:
Time series models have been used to make predictions of academic enrollments, weather, road accident casualties, stock prices, etc. Based on the concepts of quantile regression models, we have developed a simple time-variant, quantile-based fuzzy time series forecasting method. The proposed method bases the forecast on a prediction of the future trend of the data. In place of the actual quantiles of the data at each point, we convert the statistical concept into a fuzzy one by using fuzzy quantiles derived from a fuzzy membership function ensemble. We give a fuzzy metric to use the trend forecast and calculate the future value. The proposed model is applied to TAIFEX forecasting. It is shown that the proposed method works best compared to other models with respect to model complexity and forecasting accuracy.
Keywords: Quantile regression, fuzzy time series, fuzzy logical relationship groups, heuristic trend prediction.
1369 Tuberculosis Modelling Using Bio-PEPA Approach
Authors: Dalila Hamami, Baghdad Atmani
Abstract:
Modelling is a widely used tool to facilitate the evaluation of disease management. The interest of epidemiological models lies in their ability to explore hypothetical scenarios and provide decision makers with evidence to anticipate the consequences of disease incursion and impact of intervention strategies.
All models are, by nature, simplifications of more complex systems. Models that involve diseases can be classified into different categories depending on how they treat the variability, time, space, and structure of the population. Approaches range from simple deterministic mathematical models to complex, spatially explicit stochastic simulations.
Thus, epidemiological modelling is now a necessity for epidemiological investigations, surveillance, testing hypotheses and generating follow-up activities necessary to perform complete and appropriate analysis.
The state of the art presented in the following allows us to position our work with respect to the most appropriate approaches for epidemiological study.
Keywords: Bio-PEPA, Cellular automata, Epidemiological modelling, multi agent system, ordinary differential equations, PEPA, Process Algebra, Tuberculosis.
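Of the approaches surveyed above, the ordinary differential equation route is the simplest to illustrate; below is a minimal deterministic SIR-type sketch, not the Bio-PEPA tuberculosis model itself, with made-up rates and population sizes.

```python
# Minimal deterministic compartmental (SIR-type) sketch of the ODE modelling
# approach mentioned above. This is not the Bio-PEPA tuberculosis model itself;
# rates and population sizes are made-up illustrative values.
import numpy as np
from scipy.integrate import odeint

def sir(y, t, beta, gamma):
    s, i, r = y
    n = s + i + r
    ds = -beta * s * i / n            # new infections
    di = beta * s * i / n - gamma * i
    dr = gamma * i                    # recoveries
    return [ds, di, dr]

beta, gamma = 0.3, 0.1                # transmission and recovery rates (illustrative)
y0 = [990.0, 10.0, 0.0]               # initial susceptible, infected, recovered
t = np.linspace(0, 160, 161)          # days

s, i, r = odeint(sir, y0, t, args=(beta, gamma)).T
print(f"Peak infected: {i.max():.0f} individuals on day {t[i.argmax()]:.0f}")
```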
1368 Identifying Network Subgraph-Associated Essential Genes in Molecular Networks
Authors: Efendi Zaenudin, Chien-Hung Huang, Ka-Lok Ng
Abstract:
Essential genes play an important role in the survival of an organism. It has been shown that cancer-associated essential genes are genes necessary for cancer cell proliferation, and these genes are potential therapeutic targets. It has also been demonstrated that mutations of the cancer-associated essential genes give rise to resistance to immunotherapy in patients with tumors. In the present study, we focus on studying the biological effects of the essential genes from a network perspective. We hypothesize that one can analyze a biological molecular network by decomposing it into both three-node and four-node digraphs (subgraphs). These network subgraphs encode the regulatory interaction information among the network’s genetic elements. In this study, the frequency of occurrence of the subgraph-associated essential genes in a molecular network was quantified by using the statistical parameter, the odds ratio. Biological effects of subgraph-associated essential genes are discussed. In summary, the subgraph approach provides a systematic method for analyzing molecular networks, and it can capture useful biological information for biomedical research.
Keywords: Biological molecular networks, essential genes, graph theory, network subgraphs.
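A minimal sketch of the odds-ratio quantification described above, applied to a hypothetical 2x2 table of essential versus non-essential genes inside and outside a given subgraph type; the counts are illustrative.

```python
# Sketch of the odds-ratio quantification described above, applied to a
# hypothetical 2x2 table: essential vs non-essential genes that do or do not
# appear in a given three- or four-node subgraph type. Counts are illustrative.
import math

in_subgraph_essential     = 40
in_subgraph_nonessential  = 60
out_subgraph_essential    = 100
out_subgraph_nonessential = 800

odds_ratio = (in_subgraph_essential * out_subgraph_nonessential) / \
             (in_subgraph_nonessential * out_subgraph_essential)

# 95% confidence interval on the log odds ratio (Woolf method)
se = math.sqrt(sum(1.0 / c for c in (in_subgraph_essential, in_subgraph_nonessential,
                                     out_subgraph_essential, out_subgraph_nonessential)))
low, high = (math.exp(math.log(odds_ratio) + k * 1.96 * se) for k in (-1, 1))
print(f"OR = {odds_ratio:.2f}, 95% CI [{low:.2f}, {high:.2f}]")
```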
1367 Roughness and Hardness of 60/40 Cu-Zn Alloy
Authors: Pavana Manvikar, G. K. Purohit
Abstract:
The functional performance of machined components often depends on surface topography, hardness, and the nature of the stress and strain induced on the surface. Invariably, the surfaces of metallic components obtained by turning, milling, etc., consist of irregularities such as machining marks, which are responsible for the above. Surface finishing/coating processes used to produce improved surface quality and texture are classified as chip-removal and chip-less processes. Burnishing is a chip-less cold working process carried out to improve surface finish, hardness, and resistance to fatigue and corrosion that is not obtainable by other surface coating and surface treatment processes. It is a very simple but effective method which improves surface characteristics and is reported to introduce compressive stresses.
Of late, considerable attention has been paid to post-machining finishing operations such as burnishing. During burnishing, the micro-irregularities start to deform plastically: initially the crests are gradually flattened and zones of reduced deformation are formed. When all the crests are deformed, the valleys between the micro-irregularities start moving in the direction of the newly formed surface. The grain structure is then condensed, producing a smoother and harder surface with superior load-carrying and wear-resistant capabilities.
Burnishing can be performed on a lathe with a highly polished ball or roller type tool which is traversed under force over a rotating or stationary workpiece. Often, several passes are used to obtain a workpiece surface with the desired finish and hardness.
This paper presents the findings of an experimental investigation on the effect of ball burnishing parameters, namely burnishing speed, feed, force and number of passes, on the surface roughness (Ra) and micro-hardness (Hv) of a 60/40 copper/zinc alloy, using a 2-level fractional factorial design of experiments (DoE). Mathematical models were developed to predict the surface roughness and hardness produced by burnishing in terms of the above process parameters. A ball-type tool, designed and constructed from a high-chrome steel material (HRC = 63 and Ra = 0.012 µm), was used for burnishing fine-turned cylindrical bars (0.68-0.78 µm and 145 Hv). The fitted models are given by:
Ra = 0.305 - 0.005X1 - 0.0175X2 + 0.0525X4 + 0.0125X1X4 - 0.02X2X4 - 0.0375X3X4
Hv = 160.625 - 2.375X1 + 5.125X2 + 1.875X3 + 4.375X4 - 1.625X1X4 + 4.375X2X4 - 2.375X3X4
High surface microhardness (175 HV) was obtained at 400 rpm, 2 passes, 0.05 mm/rev and 15 kgf, and high surface finish (0.20 µm) was achieved at 30 kgf, 0.1 mm/rev, 112 rpm and a single pass. In other words, surface finish improved by 350% and microhardness improved by 21% compared to the as-machined condition.
Keywords: Ball burnishing, surface roughness, micro-hardness.
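To show how the fitted models above can be used, the sketch below evaluates them at coded factor levels; the mapping of the physical settings to the -1/+1 coded levels of X1-X4 (assumed to correspond to speed, feed, force and number of passes, in that order) is an assumption, since the abstract does not state it.

```python
# Evaluation of the two regression models quoted above at coded factor levels.
# In a 2-level design the factors X1..X4 take coded values -1/+1; the mapping
# of physical settings (speed, feed, force, passes) to +/-1 is assumed here,
# since the abstract does not state which level is which.
def surface_roughness(x1, x2, x3, x4):
    return (0.305 - 0.005*x1 - 0.0175*x2 + 0.0525*x4
            + 0.0125*x1*x4 - 0.02*x2*x4 - 0.0375*x3*x4)

def micro_hardness(x1, x2, x3, x4):
    return (160.625 - 2.375*x1 + 5.125*x2 + 1.875*x3 + 4.375*x4
            - 1.625*x1*x4 + 4.375*x2*x4 - 2.375*x3*x4)

# All 16 corner points of the 2^4 coded design
levels = (-1, +1)
best = min((surface_roughness(a, b, c, d), (a, b, c, d))
           for a in levels for b in levels for c in levels for d in levels)
print(f"Lowest predicted Ra = {best[0]:.3f} um at coded levels {best[1]}")
print(f"Predicted Hv at all +1 levels = {micro_hardness(1, 1, 1, 1):.1f} HV")
```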
1366 Modeling and Simulations of Complex Low-Dimensional Systems: Testing the Efficiency of Parallelization
Authors: Ryszard Matysiak, Grzegorz Kamieniarz
Abstract:
The deterministic quantum transfer-matrix (QTM) technique and its mathematical background are presented. This important tool in computational physics can be applied to a class of real physical low-dimensional magnetic systems described by the Heisenberg Hamiltonian, which includes macroscopic molecular-based spin chains, small magnetic clusters embedded in supramolecules, and other interesting compounds. Using QTM, the spin degrees of freedom are accurately taken into account, yielding the thermodynamical functions at finite temperatures. In order to test the application of the susceptibility calculations in a parallel environment, the speed-up and efficiency of parallelization are analyzed on our platform, an SGI Origin 3800 with p = 128 processor units. Using Message Passing Interface (MPI) system libraries, we find a code efficiency of 94% for p = 128, which makes our application highly scalable.
Keywords: Deterministic simulations, low-dimensional magnets, modeling of complex systems, parallelization.
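A small sketch of the speed-up and parallel-efficiency figures quoted above, S(p) = T(1)/T(p) and E(p) = S(p)/p; the timings are hypothetical, chosen only so that p = 128 reproduces the reported 94% efficiency.

```python
# Sketch of the parallel speed-up and efficiency metrics quoted above:
# S(p) = T(1) / T(p) and E(p) = S(p) / p. Timings are illustrative, chosen so
# that p = 128 reproduces the reported 94% efficiency.
def speedup(t_serial, t_parallel):
    return t_serial / t_parallel

def efficiency(t_serial, t_parallel, p):
    return speedup(t_serial, t_parallel) / p

t1 = 12800.0                                   # hypothetical serial wall-clock time (s)
timings = {16: 820.0, 64: 212.0, 128: 106.4}   # hypothetical parallel times (s)

for p, tp in timings.items():
    print(f"p={p:4d}: S={speedup(t1, tp):7.1f}, E={efficiency(t1, tp, p)*100:5.1f}%")
```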
1365 Transformer Diagnosis Based on Coupled Circuits Method Modelling
Authors: Labar Hocine, Rekik Badri, Bounaya Kamel, Kelaiaia Mounia Samira
Abstract:
The diagnostic goal for transformers in service is to detect winding or core faults. Transformers are valuable equipment which make a major contribution to the supply security of a power system. Consequently, it is of great importance to minimize the frequency and duration of unwanted outages of power transformers. Frequency Response Analysis (FRA) has been found to be a useful tool for the reliable detection of incipient mechanical faults in a transformer, by finding winding or core defects. In the first part of this article, the authors propose the coupled circuits method, because it gives the most exhaustive possible modelling of transformers. The second part of this work covers the application of FRA at low frequency in order to improve and simplify the reading of the response. This study can be useful as base data for other transformers of the same categories intended for the distribution grid.
Keywords: Diagnostic, Coupled Circuit Method, FRA, Transformer Faults
1364 Video-Based Tracking of Laparoscopic Instruments Using an Orthogonal Webcams System
Authors: Fernando Pérez, Humberto Sossa, Rigoberto Martínez, Daniel Lorias, Arturo Minor
Abstract:
This paper presents a system for tracking the movement of laparoscopic instruments based on an orthogonal system of webcams and video image processing. The movements are captured with two webcams placed orthogonally inside the physical trainer. In the images, the instruments are detected using color markers placed on the distal tip of each instrument. The 3D position of the instrument tip within the workspace is obtained by the linear triangulation method. Preliminary results showed linearity and repeatability in the motion tracking, with a resolution of 0.616 mm in each axis; the accuracy of the system showed a 3D instrument positioning error of 1.009 ± 0.101 mm. This tool is a portable and low-cost alternative to traditional tracking devices and a trustworthy method for the objective evaluation of the surgeon’s surgical skills.
Keywords: Laparoscopic Surgery, Orthogonal Vision, Tracking Instruments, Triangulation.
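A sketch of the linear triangulation step from two calibrated views; the pinhole projection matrices for the two orthogonally mounted cameras are invented placeholders, not the trainer's calibration.

```python
# Sketch of linear (DLT) triangulation from two calibrated views, as used to
# recover the 3D tip position from the two orthogonal webcams. The projection
# matrices below are invented placeholders, not the trainer's calibration.
import numpy as np

def triangulate(P1, P2, uv1, uv2):
    """Linear triangulation: solve A X = 0 for the homogeneous 3D point X."""
    u1, v1 = uv1
    u2, v2 = uv2
    A = np.vstack([
        u1 * P1[2] - P1[0],
        v1 * P1[2] - P1[1],
        u2 * P2[2] - P2[0],
        v2 * P2[2] - P2[1],
    ])
    _, _, vt = np.linalg.svd(A)
    X = vt[-1]
    return X[:3] / X[3]

# Two pinhole cameras looking along orthogonal axes (toy calibration)
K = np.array([[800.0, 0, 320], [0, 800.0, 240], [0, 0, 1]])
P1 = K @ np.hstack([np.eye(3), [[0], [0], [500]]])        # camera 1: looks along +Z
R2 = np.array([[0.0, 0, 1], [0, 1, 0], [-1, 0, 0]])       # 90 deg rotation about Y
P2 = K @ np.hstack([R2, [[0], [0], [500]]])               # camera 2: orthogonal view

point = np.array([10.0, -20.0, 30.0, 1.0])                # ground-truth tip (mm)
uv1 = (P1 @ point)[:2] / (P1 @ point)[2]
uv2 = (P2 @ point)[:2] / (P2 @ point)[2]
print(np.round(triangulate(P1, P2, uv1, uv2), 3))         # ~ [10, -20, 30]
```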
1363 Finite Element Simulation of Deep Drawing Process to Minimize Earing
Authors: Pawan S. Nagda, Purnank S. Bhatt, Mit K. Shah
Abstract:
The earing defect in the drawing process is highly undesirable, not only because it adds an additional trimming operation but also because the uneven material flow demands extra care. The objective of this work is to study the earing problem in the deep drawing of a circular cup and to optimize the blank shape to reduce earing. A finite element model is developed for 3-D numerical simulation of the cup forming process in ABAQUS. Extra-deep-drawing (EDD) steel sheet is used for the simulation, with its material properties and the tool design parameters as inputs. Earing was observed in the simulated cup and measured at various angles with respect to the rolling direction. To reduce the earing defect, the initial blank shape was modified with the help of the anisotropy coefficient. The modified blanks showed a notable reduction in earing.
Keywords: Finite element simulation, deep drawing, earing, anisotropy.
1362 Comparative Study of Affricate Initial Consonants in Chinese and Slovak
Authors: Maria Istvanova
Abstract:
The purpose of this comparative study of the affricate consonants in Chinese and Slovak is to increase awareness of the main distinguishing features between the two languages with respect to this particular group of consonants. We determine the main difficulties of Slovak learners in acquiring correct pronunciation of affricate initial consonants in Chinese, based on an understanding of the distinguishing features of Chinese and Slovak affricates combined with experimental measurement of voice onset time (VOT) values. The software tool Praat is used for the analysis of the recorded language samples. The language samples contain recordings of a Chinese native speaker and Slovak students of Chinese with different language proficiency levels. Based on the results of the analysis in Praat, we identify erroneous pronunciation and provide clarification of its cause.
Keywords: Chinese, comparative study, initial consonants, pronunciation, Slovak
1361 A Multi-Agent Simulation of Serious Games to Predict Their Impact on E-Learning Processes
Authors: Ibtissem Daoudi, Raoudha Chebil, Wided Lejouad Chaari
Abstract:
Serious games constitute a recent and attractive approach that is expected to replace classical, often tedious, courses. However, choosing a serious game adapted to a specific learning environment remains a challenging task that makes teachers unwilling to adopt this concept. To fill this gap, we present, in this paper, a multi-agent-based simulator that predicts the impact of integrating a serious game into a learning environment, given several game and player characteristics. As results, the presented tool gives the intensities of several emotional aspects characterizing learners' reactions to the adoption of the serious game. The simulator is tested to predict the effect of basing a coding course on the serious game "CodeCombat". The obtained results are compared with feedback from using the same serious game in a real learning process.
Keywords: Emotion, learning process, multi-agent simulation, serious games.
1360 Performance of Phytogreen Zone for BOD5 and SS Removal for Refurbishment of Conventional Oxidation Pond in an Integrated Phytogreen System
Authors: A. R. Abdul Syukor, A. W. Zularisam, Z. Ideris, M. S. Mohd Ismid, H. M. Nakmal, S. Sulaiman, A. H. Hasmanie, M. R. Siti Norsita, M. Nasrullah
Abstract:
In this study, the effectiveness of integrated aquatic plants in a phytogreen zone was studied, and a statistical analysis of the proposed integrated phytogreen system approach is discussed. The effectiveness of using aquatic plants such as Typha angustifolia sp., Lepironia articulata sp., Limnocharis flava sp., Monochoria vaginalis sp., Pistia stratiotes sp., and Eichhornia crassipes sp. in the conventional oxidation pond process, in order to comply with Standard A of the Malaysia Environmental Quality Act 1974 (Act 127) and the Environmental Quality (Sewage) Regulation 2009 for effluent discharge into inland water near residential areas, was successfully shown. It was concluded that the integrated phytogreen system developed in this study has great potential for the refurbishment of wastewater treatment in conventional oxidation ponds.
Keywords: Phytoremediation, integrated phytogreen system, sewage treatment plant, oxidation pond, aquatic plants.
1359 Hybrid Neural Network Methods for Lithology Identification in the Algerian Sahara
Authors: S. Chikhi, M. Batouche, H. Shout
Abstract:
In this paper, we combine a probabilistic neural method with radial basis functions in order to construct the lithofacies of the wells DF01, DF02 and DF03 situated in the Triassic province of Algeria (Sahara). Lithofacies identification is a crucial problem in reservoir characterization. Our objective is to facilitate the experts' work in the geological domain and to allow them to quickly obtain the structure and nature of the terrain around the drilling site. This study intends to design a tool that supports automatic deduction from numerical data. We used a probabilistic formalism to enhance the classification process initiated by a Self-Organizing Map procedure. Our system derives the lithofacies of the reservoir wells concerned from well-log data, in a form that is easy for a geology expert to read; the expert identifies the potential for oil production at a given source and so forms the basis for estimating the financial returns and economic benefits.
Keywords: Classification, Lithofacies, Probabilistic formalism, Reservoir characterization, Well-log data.
1358 ICCFMS - Enhancing a Competitive Advantage for Thailand’s IT Entrepreneurs
Authors: T. Niracharapa, W. Angkana
Abstract:
Since information and communication technology (ICT) plays a critical role in enhancing national competitiveness, it is a driving force for social and economic growth and prosperity. The ASEAN Economic Community (AEC) will integrate ASEAN countries under a new mechanism and measures that will improve economic performance in the global economy. Government policies may support or impede such harmonization. This study investigated and analyzed the status of Thai IT entrepreneurs and defined key strategies to enhance their competitive advantage. Data were collected through in-depth interviews, questionnaires, focus groups, seminars and fieldwork on information technology, excluding communication. SWOT analysis was used as the analytical tool. The results of this study can be used to enable the government to guide policy, measures and strategies for creating a competitive advantage for Thailand’s IT entrepreneurs in the global market.
Keywords: AEC, ASEAN, competitive advantage, IT entrepreneurs.
1357 Tag Impersonation Attack on Ultra-Lightweight Radio Frequency Identification Authentication Scheme
Authors: Reham Al-Zahrani, Noura Aleisa
Abstract:
The proliferation of Radio Frequency Identification (RFID) technology has raised concerns about system security, particularly regarding tag impersonation attacks. In RFID systems, an appropriate authentication protocol must resist both active and passive attacks. A tag impersonation occurs when an adversary's tag is used to fool an authenticating reader into believing it is a legitimate tag. The paper thoroughly analyses the security of the Efficient, Secure, and Practical Ultra-Lightweight RFID Authentication Scheme (ESRAS). It examines the protocol within the context of RFID systems and focuses specifically on its vulnerability to tag impersonation attacks. The Scyther tool is utilized to assess the protocol's security, providing a comprehensive evaluation of ESRAS's effectiveness in preventing unauthorized tag impersonation.
Keywords: RFID, radio frequency identification, impersonation attack, authentication, ultra-lightweight protocols, security.
1356 VISUAL JESS: An Expandable Visual Generator of Object-Oriented Expert Systems
Authors: Amel Grissa-Touzi, Habib Ounally, Aissa Boulila
Abstract:
The utility of expert system generators has been widely recognized in many applications. Several generators based on the object paradigm have recently been proposed. Object-oriented expert system generators (GSEOO) offer languages that are often complex and difficult to use. In this paper, we propose an extension of the expert system generator JESS which permits friendlier use of this expert system. The new tool, called VISUAL JESS, brings two main improvements to JESS. The first improvement concerns ease of use while keeping the syntactic and semantic aspects of the JESS programming language transparent. The second improvement permits easy access to and modification of the JESS knowledge base. VISUAL JESS is implemented so that it is extensible and portable.
Keywords: Expert system generator, object-oriented programming, class, object, inheritance, polymorphism.
1355 Exchange Traded Products on the Warsaw Stock Exchange
Authors: Piotr Prewysz-Kwinto
Abstract:
The dynamic development of the financial market is accompanied by the emergence of new products on stock exchanges which offer entirely new possibilities for investing money. Currently, the most innovative financial instruments offered to investors are exchange traded products (ETP). They can be defined as financial instruments whose price depends on the value of an underlying instrument. Thus, they offer investors the possibility of making a profit from a change in the value of the underlying instrument without having to buy it. Currently, the Warsaw Stock Exchange offers many types of ETPs. They are investment products with full or partial capital protection, products without capital protection, as well as leveraged products, issued on underlying instruments such as indices, sector indices, commodity indices, prices of energy commodities, precious metals, agricultural produce, and prices of shares of domestic and foreign companies. This paper presents the mechanism of functioning of the ETPs available on the Warsaw Stock Exchange and the results of an analysis of statistical data on these financial instruments.
Keywords: Exchange traded products, financial market, investment, stock exchange.
1354 The Effects of Eight Weeks of Interval Endurance Training on hs-CRP Levels and Anthropometric Parameters in Overweight Men
Authors: S. Khoshemehry, M. J. Pourvaghar
Abstract:
Inflammatory markers are known as the main predictors of cardiovascular diseases. This study aimed at determining the effect of 8 weeks of interval endurance training on hs-CRP levels and some anthropometric parameters in overweight men. Following a call for participation in the research project in Kashan, 73 volunteers took part and constituted the statistical population of the study. Then, 28 overweight young men aged 22 to 25 years were randomly assigned to an experimental group and a control group (n=14 each). Anthropometric measurements and blood samples were collected before and after the program to measure hs-CRP. The interval endurance program was performed at 60 to 75% of maximum heart rate in 2 sessions per week for 8 weeks. The Kolmogorov-Smirnov test was used to check whether the samples come from the same distribution, and the t-test was used to assess the difference between the two groups, considered statistically significant at the 0.05 level. The results indicated that there was a significant difference in the hs-CRP, weight, BMI and W/H ratio of overweight men at post-test in the exercise group (P≤0.05) but not in the control group. The interval endurance training program decreases hs-CRP levels and anthropometric parameters.
Keywords: Interval endurance training program, hs-CRP, overweight, anthropometric.
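A minimal sketch of the two-group comparison described above, an independent-samples t-test on post-test hs-CRP values preceded by a normality check; all values are simulated placeholders, not the study's measurements.

```python
# Sketch of the statistical comparison described above: an independent-samples
# t-test on post-test hs-CRP values of the exercise and control groups.
# Values are simulated placeholders, not the study's measurements.
import numpy as np
from scipy import stats

rng = np.random.default_rng(4)
hs_crp_exercise = rng.normal(loc=1.8, scale=0.4, size=14)  # mg/L after 8 weeks of training
hs_crp_control  = rng.normal(loc=2.6, scale=0.5, size=14)  # mg/L, no training

# Normality check (analogous to the Kolmogorov-Smirnov step), then the group comparison
for name, x in [("exercise", hs_crp_exercise), ("control", hs_crp_control)]:
    ks_stat, ks_p = stats.kstest((x - x.mean()) / x.std(ddof=1), 'norm')
    print(f"{name}: KS p = {ks_p:.3f}")

t_stat, p_value = stats.ttest_ind(hs_crp_exercise, hs_crp_control)
print(f"t = {t_stat:.2f}, p = {p_value:.4f}, significant at 0.05: {p_value < 0.05}")
```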
1353 Coverage Probability Analysis of WiMAX Network under Additive White Gaussian Noise and Predicted Empirical Path Loss Model
Authors: Chaudhuri Manoj Kumar Swain, Susmita Das
Abstract:
This paper explores a detailed procedure for predicting a path loss (PL) model and its application in estimating the coverage probability of a WiMAX network. For this, a hybrid approach is followed in predicting an empirical PL model of a 2.65 GHz WiMAX network deployed in a suburban environment. Data collection, statistical analysis, and regression analysis are the phases of operation incorporated in this approach, and the importance of each of these phases is discussed. The procedure of collecting data such as the received signal strength indicator (RSSI) through an experimental setup is demonstrated. From the collected data set, empirical PL and RSSI models are predicted with regression techniques. Furthermore, with the aid of the predicted PL model, essential parameters such as the PL exponent as well as the coverage probability of the network are evaluated. This research work may significantly assist the deployment and optimisation of any cellular network.
Keywords: WiMAX, RSSI, path loss, coverage probability, regression analysis.
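A minimal sketch of the regression phase: fitting a log-distance path loss model to simulated RSSI-derived samples and using it for a simple cell-edge coverage estimate; the measurements and link-budget numbers are placeholders.

```python
# Sketch of the regression phase described above: fit a log-distance path loss
# model PL(d) = PL(d0) + 10*n*log10(d/d0) to path loss samples derived from
# RSSI, then use it for a simple cell-edge coverage estimate. The "measured"
# samples and link-budget numbers are simulated placeholders.
import numpy as np
from math import erf, sqrt

rng = np.random.default_rng(5)
d0 = 100.0                                    # reference distance (m)
distances = np.linspace(100, 2000, 60)        # measurement distances (m)
true_n, pl_d0 = 3.2, 70.0                     # hidden ground truth for the simulation
path_loss = (pl_d0 + 10 * true_n * np.log10(distances / d0)
             + rng.normal(scale=4.0, size=distances.size))   # shadowing (dB)

# Linear least squares on x = 10*log10(d/d0): the slope is the PL exponent n
x = 10 * np.log10(distances / d0)
n_hat, pl_d0_hat = np.polyfit(x, path_loss, 1)
print(f"Estimated PL exponent n = {n_hat:.2f}, PL(d0) = {pl_d0_hat:.1f} dB")

# Coverage probability at the cell edge under log-normal shadowing
tx_power, sensitivity, sigma = 43.0, -90.0, 4.0          # dBm, dBm, dB
edge_pl = pl_d0_hat + n_hat * 10 * np.log10(2000 / d0)
margin = tx_power - edge_pl - sensitivity
print(f"Edge coverage probability = {0.5 * (1 + erf(margin / (sigma * sqrt(2)))):.2f}")
```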
1352 Assessment of Hargreaves Equation for Estimating Monthly Reference Evapotranspiration in the South of Iran
Authors: Ali Dehgan Moroozeh, B. Farhadi Bansouleh
Abstract:
Evapotranspiration is one of the most important components of the hydrological cycle. Reference evapotranspiration (ETo) is an important variable in water and energy balances on the earth’s surface, and knowledge of the distribution of ET is a key factor in hydrology, climatology, agronomy and ecology studies. Many researchers have proposed relationships, expressed as functions of climatic factors, to estimate potential evapotranspiration and thereby prevent plant water stress or water loss. The FAO Penman method (PM) has been recommended as a standard method. This method requires many data which are not available in every area of the world, so other methods should be evaluated for such conditions. When sufficient or reliable data to solve the PM equation are not available, the Hargreaves equation can be used. The Hargreaves equation (HG) requires only daily mean, maximum and minimum air temperature and extraterrestrial radiation. In this study, the Hargreaves method (HG) was evaluated at 12 stations in the northwest region of Iran. The results of the HG and modified HG (M.HG) methods were compared with the results of the PM method. Statistical analysis of this comparison showed that the calibration process had a significant effect on the efficiency of the Hargreaves method.
Keywords: Evapotranspiration, Hargreaves equation, FAO Penman method.
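A short sketch of the Hargreaves estimate in its standard form, ETo = 0.0023 · 0.408 · Ra · (Tmean + 17.8) · sqrt(Tmax − Tmin) with Ra in MJ m⁻² day⁻¹; the inputs are illustrative, not the stations' data.

```python
# Sketch of the Hargreaves reference evapotranspiration estimate in its standard
# form: ETo = 0.0023 * 0.408 * Ra * (Tmean + 17.8) * sqrt(Tmax - Tmin), with
# Ra in MJ m^-2 day^-1 and ETo in mm day^-1. Inputs are illustrative values,
# not data from the studied stations.
import math

def hargreaves_eto(t_max, t_min, ra):
    """Daily reference evapotranspiration (mm/day) from temperature extremes
    and extraterrestrial radiation."""
    t_mean = (t_max + t_min) / 2.0
    return 0.0023 * 0.408 * ra * (t_mean + 17.8) * math.sqrt(t_max - t_min)

# Example: a warm summer day
t_max, t_min = 38.0, 22.0        # deg C
ra = 40.0                        # extraterrestrial radiation, MJ m^-2 day^-1
print(f"ETo = {hargreaves_eto(t_max, t_min, ra):.2f} mm/day")
```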
1351 Stating Best Commercialization Method: An Unanswered Question from Scholars and Practitioners
Authors: Saheed A. Gbadegeshin
Abstract:
A commercialization method is a means of making inventions available on the market for final consumption. It is described as an important tool for keeping business enterprises sustainable and improving national economic growth. Thus, there are several scholarly publications on it, either presenting or testing different methods of commercialization. However, young entrepreneurs, technologists and scientists would like to know the best method to commercialize their innovations. Then, this question arises: What is the best commercialization method? To answer the question, a systematic literature review was conducted, and practitioners were interviewed. The literature results revealed that there are many methods but that new methods are needed to improve commercialization, especially during these times of economic crisis and political uncertainty. Similarly, the empirical results showed that there are several methods, but the best method is the one that reduces costs, reduces the risks associated with uncertainty, and improves customer participation and acceptability. Therefore, it was concluded that a new commercialization method is essential for today's high technologies, and such a method was presented.
Keywords: Commercialization method, high technology, lean start-up methodology, technology, knowledge.