Search results for: software development models
24048 A Study on Characteristics of Hedonic Price Models in Korea Based on Meta-Regression Analysis
Authors: Minseo Jo
Abstract:
The purpose of this paper is to examine the factors in hedonic price models that have a significant impact on determining apartment prices. Many variables are employed in hedonic price models, and their effectiveness varies according to the researchers and the regions being analysed. To account for these varying conditions, meta-regression analysis was selected for this study. Four meta-independent variables were drawn from 65 hedonic price models for the analysis: the factors that influence apartment prices, the regions studied (divided into two groups), the years in which the research was performed, and the coefficients of the functions employed. The covariance between the four meta-variables and the p-values of the coefficients, and between the four meta-variables and the number of data points used in the 65 hedonic price models, has been analyzed in this study. The six factors that are most important in deciding apartment prices are the positioning of the apartments, the noise around the apartments, orientation (points of the compass) and views from the apartments, proximity to public transportation, the companies that constructed the apartments, and the social environment (such as schools).
Keywords: hedonic price model, housing price, meta-regression analysis, characteristics
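A meta-regression of this kind boils down to a weighted regression of the coefficients reported by the individual studies on study-level characteristics. Below is a minimal sketch under that reading, with entirely hypothetical study data (region group, study year, sample size); it illustrates the technique, not the authors' actual procedure.

```python
# Minimal meta-regression sketch (hypothetical data, not the paper's dataset).
# Each row is one hedonic price model from the literature; we regress the
# reported coefficient on study-level characteristics, weighting by sample size.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n_studies = 65
region = rng.integers(0, 2, n_studies)          # two region groups
year = rng.integers(2000, 2020, n_studies)      # year the study was performed
n_obs = rng.integers(100, 5000, n_studies)      # data points used in each model
coef = 0.3 + 0.1 * region - 0.005 * (year - 2000) + rng.normal(0, 0.05, n_studies)

X = sm.add_constant(np.column_stack([region, year - 2000, np.log(n_obs)]))
meta_model = sm.WLS(coef, X, weights=n_obs).fit()   # larger studies weigh more
print(meta_model.summary())
```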
Procedia PDF Downloads 402
24047 Features of Formation and Development of Possessory Risk Management Systems of Organization in the Russian Economy
Authors: Mikhail V. Khachaturyan, Inga A. Koryagina, Maria Nikishova
Abstract:
The study investigates the impact of the ongoing financial crisis, which started in the second half of 2014, on the marketing budgets of fast-moving consumer goods (FMCG) companies. In these conditions, special importance is given to efficient possessory risk management systems. The main objective of establishing and developing possessory risk management systems for FMCG companies in a crisis is to analyze data relating to the external environment and consumer behavior in a crisis. Another important objective for the possessory risk management systems of FMCG companies is to develop measures and mechanisms to maintain and stimulate sales. In this regard, analysis of the risks and threats which consumers define as the main factors affecting their level of consumption becomes important. It is obvious that in crisis conditions, effective risk management systems responsible for developing and implementing strategies for stimulating consumer demand, as well as for the identification, analysis, assessment and management of other risks to economic security, will be the key to the sustainability of a company. In terms of the financial and economic crisis, the problem of forming and developing possessory risk management systems becomes critical not only in the context of the management models of FMCG companies, but for all companies operating in other sectors of the Russian economy. This study attempts to analyze the specifics of the formation and development of company possessory risk management systems. In the modern economy, the risk of a reduction in consumer activity is of special importance among all types of owner's risks. This type of risk is common not only to the consumer goods trade. The study of declining consumer activity is especially important for Russia, as the domestic market of consumer goods is still in the development stage despite its significant growth. In this regard, it is especially important to form and develop possessory risk management systems for FMCG companies. The authors offer their own interpretation of the process of forming and developing possessory risk management systems within the owner's management models of FMCG companies, as well as in the Russian economy in general. The proposed methods and mechanisms for analyzing the formation and development of possessory risk management systems in FMCG companies, and the results obtained, can be helpful for researchers interested in the problems of consumer goods market development in Russia and overseas.
Keywords: FMCG companies, marketing budget, risk management, owner, Russian economy, organization, formation, development, system
Procedia PDF Downloads 377
24046 Pressure Gradient Prediction of Oil-Water Two Phase Flow through Horizontal Pipe
Authors: Ahmed I. Raheem
Abstract:
In this thesis, stratified and stratified wavy flow regimes have been investigated numerically for oil (1.57 mPa s viscosity and 780 kg/m³ density) and water two-phase flow in small and large horizontal steel pipes with diameters between 0.0254 and 0.508 m, using ANSYS Fluent software. The volume of fluid (VOF) approach for two-phase flow was combined with a two-equation turbulence model of the k-ε family (Realizable k-ε).
Keywords: CFD, two-phase flow, pressure gradient, volume of fluid, large diameter, horizontal pipe, oil-water stratified and stratified wavy flow
Procedia PDF Downloads 433
24045 Exploring Factors Affecting Electricity Production in Malaysia
Authors: Endang Jati Mat Sahid, Hussain Ali Bekhet
Abstract:
The ability to supply reliable and secure electricity has been one of the crucial components of economic development for any country. Forecasting electricity production is therefore very important for accurate investment planning of power generation plants. In this study, we aim to examine and analyze the factors that affect electricity generation. Multiple regression models were used to find the relationship between various variables and electricity production; the models simultaneously determine the effects of the variables on electricity generation. Several variables influencing electricity generation, namely natural gas (NG), coal (CO), fuel oil (FO), renewable energy (RE), gross domestic product (GDP) and fuel prices (FP), were examined for Malaysia. The results demonstrate that NG, CO, and FO were the main factors influencing electricity generation growth. The study then identifies a number of policy implications arising from the empirical results.
Keywords: energy policy, energy security, electricity production, Malaysia, regression model
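The multiple regression described here can be sketched in a few lines. The snippet below uses hypothetical annual data (the variable names follow the abbreviations above, but the series and coefficients are invented for illustration, not the study's dataset).

```python
# Minimal multiple-regression sketch (hypothetical data, not the study's dataset).
# Electricity production is regressed simultaneously on fuel inputs and GDP.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
years = np.arange(1990, 2020)
df = pd.DataFrame({
    "NG": rng.uniform(10, 50, years.size),     # natural gas input
    "CO": rng.uniform(5, 40, years.size),      # coal input
    "FO": rng.uniform(1, 10, years.size),      # fuel oil input
    "GDP": np.linspace(100, 400, years.size),  # gross domestic product
})
df["ELEC"] = (2.0 * df.NG + 1.5 * df.CO + 0.8 * df.FO + 0.1 * df.GDP
              + rng.normal(0, 5, years.size))

model = smf.ols("ELEC ~ NG + CO + FO + GDP", data=df).fit()
print(model.params)  # estimated effect of each factor on electricity production
```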
Procedia PDF Downloads 164
24044 Convolutional Neural Networks Architecture Analysis for Image Captioning
Authors: Jun Seung Woo, Shin Dong Ho
Abstract:
Image captioning models with attention mechanisms have developed significantly compared to previous models, but they are still unsatisfactory in recognizing images. We perform an extensive search over seven Convolutional Neural Network (CNN) architectures to analyze the behavior of different models for image captioning. We compared the seven CNN architectures, varying the batch size, on a public benchmark: the MS-COCO dataset. In our experimental results, DenseNet and InceptionV3 achieved about 14% loss and about 160 s of training time per epoch, the most satisfactory result among the seven CNN architectures after training for 50 epochs on a GPU.
Keywords: deep learning, image captioning, CNN architectures, densenet, inceptionV3
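A first step in this kind of architecture search is simply instantiating the candidate encoders and comparing their sizes and feature dimensions. The sketch below does that with Keras for three of the seven architectures (assumed framework; weights=None avoids downloading ImageNet weights just for the comparison).

```python
# Sketch: instantiate candidate CNN encoders and compare their sizes, as a
# starting point for an architecture search over image-captioning encoders.
from tensorflow.keras import applications

candidates = {
    "DenseNet121": applications.DenseNet121,
    "InceptionV3": applications.InceptionV3,
    "ResNet50": applications.ResNet50,
}

for name, builder in candidates.items():
    encoder = builder(include_top=False, weights=None, pooling="avg",
                      input_shape=(299, 299, 3))
    print(f"{name}: {encoder.count_params():,} parameters, "
          f"feature dim {encoder.output_shape[-1]}")
```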
Procedia PDF Downloads 133
24043 Models and Metamodels for Computer-Assisted Natural Language Grammar Learning
Authors: Evgeny Pyshkin, Maxim Mozgovoy, Vladislav Volkov
Abstract:
The paper follows a discourse on computer-assisted language learning. We examine problems of foreign language teaching and learning and introduce a metamodel that can be used to define learning models of language grammar structures in order to support teacher/student interaction. Special attention is paid to the concept of a virtual language lab. Our approach to language education aims to encourage learners to experiment with a language and to learn by discovering patterns of grammatically correct structures created and managed by a language expert.
Keywords: computer-assisted instruction, language learning, natural language grammar models, HCI
Procedia PDF Downloads 519
24042 Discover a New Technique for Cancer Recognition by Analysis and Determination of Fractal Dimension Images in Matlab Software
Authors: Saeedeh Shahbazkhany
Abstract:
Cancer is a serious disease: if it is not diagnosed early, therapy can be difficult, whereas it is much more treatable when diagnosed in its early stages. Early diagnostic procedures are therefore very important. In this paper, we introduce a new method that only requires pictures of healthy cells and cancer cells. Where cancer is suspected, we take a picture of the cells or tissue in that area, and then take some pictures of the surrounding tissues. The fractal dimensions of the images are then calculated and compared, and cancer can be easily detected from the comparison. The method is implemented in Matlab software.
Keywords: Matlab software, fractal dimension, cancer, surrounding tissues, cells or tissue, new method
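The fractal dimension comparison at the heart of this method is typically computed with a box-counting estimator. The paper works in Matlab; the sketch below illustrates the same technique in Python on a synthetic binary image (the box-size ladder and the test image are arbitrary choices).

```python
# Minimal box-counting estimate of the fractal dimension of a binary image
# (a standard estimator, shown here on synthetic data for illustration).
import numpy as np

def box_count(img: np.ndarray, size: int) -> int:
    """Count boxes of side `size` that contain at least one foreground pixel."""
    h, w = img.shape
    count = 0
    for i in range(0, h, size):
        for j in range(0, w, size):
            if img[i:i + size, j:j + size].any():
                count += 1
    return count

def fractal_dimension(img: np.ndarray) -> float:
    sizes = [2, 4, 8, 16, 32]
    counts = [box_count(img, s) for s in sizes]
    # Slope of log(count) vs log(1/size) estimates the box-counting dimension.
    slope, _ = np.polyfit(np.log(1.0 / np.array(sizes)), np.log(counts), 1)
    return slope

# Example: a filled square should have dimension close to 2.
img = np.zeros((128, 128), dtype=bool)
img[32:96, 32:96] = True
print(f"estimated dimension: {fractal_dimension(img):.2f}")
```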
Procedia PDF Downloads 354
24041 Automatic Calibration of Agent-Based Models Using Deep Neural Networks
Authors: Sima Najafzadehkhoei, George Vega Yon
Abstract:
This paper presents an approach for calibrating Agent-Based Models (ABMs) efficiently, utilizing Convolutional Neural Networks (CNNs) and Long Short-Term Memory (LSTM) networks. These machine learning techniques are applied to Susceptible-Infected-Recovered (SIR) models, which are a core framework in the study of epidemiology. Our method recovers parameter values from observed trajectory curves, enhancing the accuracy of predictions compared to traditional calibration techniques. Through the use of simulated data, we train the models to predict epidemiological parameters more accurately. Two primary approaches were explored: one where the numbers of susceptible, infected, and recovered individuals are fully known, and another using only the number of infected individuals. Our method shows promise for application in other ABMs where calibration is computationally intensive and expensive.
Keywords: ABM, calibration, CNN, LSTM, epidemiology
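The training-data step can be pictured as follows: sample parameters, run the epidemic model, and pair each trajectory with the parameters that produced it. This is a minimal sketch with a simple deterministic SIR stand-in for the ABM (population size, parameter ranges, and step counts are invented for illustration).

```python
# Sketch: generate synthetic SIR trajectories for training a neural calibrator
# (hypothetical setup; the paper's actual ABM and network details may differ).
import numpy as np

def simulate_sir(beta: float, gamma: float, n: int = 1000,
                 i0: int = 10, steps: int = 100) -> np.ndarray:
    """Discrete-time SIR model; returns an array of shape (steps, 3)."""
    s, i, r = n - i0, i0, 0
    out = np.empty((steps, 3))
    for t in range(steps):
        new_inf = beta * s * i / n
        new_rec = gamma * i
        s, i, r = s - new_inf, i + new_inf - new_rec, r + new_rec
        out[t] = (s, i, r)
    return out

rng = np.random.default_rng(42)
params = rng.uniform([0.1, 0.05], [0.5, 0.2], size=(500, 2))  # (beta, gamma)
X = np.stack([simulate_sir(b, g) for b, g in params])  # inputs: trajectories
y = params                                             # targets: parameters
print(X.shape, y.shape)  # (500, 100, 3) -> ready for a CNN/LSTM regressor
```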
Procedia PDF Downloads 24
24040 A Parallel Cellular Automaton Model of Tumor Growth for Multicore and GPU Programming
Authors: Manuel I. Capel, Antonio Tomeu, Alberto Salguero
Abstract:
Tumor growth from a transformed cancer cell up to a clinically apparent mass spans a range of spatial and temporal magnitudes. Through computer simulations, Cellular Automata (CA) can accurately describe the complexity of tumor development. With appropriate CA-based software tools, tumor development prognosis can be made without making patients undergo unpleasant medical examinations or painful invasive procedures. In silico testing mainly refers to computational biology research studies applied to clinical actions in medicine. Establishing sound computer-based models of cellular behavior certainly reduces costs and saves precious time with respect to carrying out experiments in vitro at labs or in vivo with living cells and organisms. These models aim to produce scientifically relevant results compared to traditional in vitro testing, which is slow, expensive, and does not generally have acceptable reproducibility under the same conditions. For speeding up computer simulations of cellular models, the specific literature shows recent proposals based on the CA approach that include advanced techniques, such as the clever use of efficient supporting data structures when modeling with deterministic and stochastic cellular automata. Multiparadigm and multiscale simulation of tumor dynamics is just beginning to be developed by the concerned research community. Stochastic cellular automata (SCA), whose parallel programming implementations can yield high computational performance, are of much interest and worth exploring up to their computational limits. There have been some optimization-based approaches to advancing multiparadigm models of tumor growth, which mainly pursue improving the performance of these models by guaranteeing efficient memory accesses, or by considering the dynamic evolution of the memory space (grids, trees, etc.) that holds crucial data in simulations. In our opinion, the different optimizations mentioned above are not decisive enough to achieve the high-performance computing power that cell-behavior simulation programs actually need. The possibility of using multicore and GPU parallelism as a promising multiplatform framework to develop new programming techniques to speed up the computation time of simulations has only started to be explored in the last few years. This paper presents a model that incorporates parallel processing, identifying the synchronization necessary for speeding up tumor growth simulations implemented in Java and C++ programming environments. The speed-up achieved by specific parallel syntactic constructs, such as executors (thread pools) in Java, is studied. The new parallel tumor growth model is tested using implementations in the Java and C++ languages on two different platforms: an Intel Core i-X chipset and an HPC cluster of processors at our university. The parallelization of the Polesczuk and Enderling model (commonly used by researchers in mathematical oncology) proposed here is analyzed with respect to performance gain. We intend to apply the model and overall parallelization technique presented here to solid tumors of specific affiliation such as prostate, breast, or colon. Our final objective is to set up a multiparadigm model capable of modelling angiogenesis, or the growth inhibition induced by chemotaxis, as well as the effect of therapies based on the presence of cytotoxic/cytostatic drugs.
Keywords: cellular automaton, tumor growth model, simulation, multicore and manycore programming, parallel programming, high performance computing, speed up
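The synchronization pattern studied here (partition the grid among a pool of workers, wait for every partition to finish, then swap generations) can be sketched compactly. The sketch below uses Python's ThreadPoolExecutor as a rough analog of Java executors, with an invented two-state growth rule; it illustrates the pattern only, since in CPython a process pool would be needed for true CPU parallelism.

```python
# Sketch of the domain-decomposition pattern for a parallel CA update step
# (hypothetical rule; the paper's Java/C++ implementations differ in detail).
from concurrent.futures import ThreadPoolExecutor
import numpy as np

def update_band(grid: np.ndarray, new_grid: np.ndarray, r0: int, r1: int) -> None:
    """Update rows r0..r1: a cell turns tumorous if it has 2+ tumor neighbors."""
    h, w = grid.shape
    for i in range(r0, r1):
        for j in range(w):
            neigh = grid[max(0, i-1):i+2, max(0, j-1):j+2].sum() - grid[i, j]
            new_grid[i, j] = 1 if (grid[i, j] or neigh >= 2) else 0

def step(grid: np.ndarray, pool: ThreadPoolExecutor, n_workers: int) -> np.ndarray:
    new_grid = np.zeros_like(grid)
    bands = np.array_split(np.arange(grid.shape[0]), n_workers)
    # Synchronization point: all band updates must finish before the swap.
    futures = [pool.submit(update_band, grid, new_grid, b[0], b[-1] + 1)
               for b in bands]
    for f in futures:
        f.result()
    return new_grid

grid = np.zeros((200, 200), dtype=np.uint8)
grid[100:102, 100:102] = 1  # seed a small tumor mass
with ThreadPoolExecutor(max_workers=4) as pool:
    for _ in range(20):
        grid = step(grid, pool, 4)
print("tumor cells:", int(grid.sum()))
```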
Procedia PDF Downloads 244
24039 Brain Tumor Detection and Classification Using Pre-Trained Deep Learning Models
Authors: Aditya Karade, Sharada Falane, Dhananjay Deshmukh, Vijaykumar Mantri
Abstract:
Brain tumours pose a significant challenge in healthcare due to their complex nature and impact on patient outcomes. The application of deep learning (DL) algorithms in medical imaging has shown promise for accurate and efficient brain tumour detection. This paper explores the performance of various pre-trained DL models (ResNet50, Xception, InceptionV3, EfficientNetB0, DenseNet121, NASNetMobile, VGG19, VGG16, and MobileNet) on a brain tumour dataset sourced from Figshare. The dataset consists of MRI scans categorized into different types of brain tumour, including meningioma, pituitary, glioma, and no tumour. The study involves a comprehensive evaluation of these models' accuracy and effectiveness in classifying brain tumour images. Data preprocessing, augmentation, and fine-tuning techniques are employed to optimize model performance. Among the evaluated deep learning models for brain tumour detection, ResNet50 emerges as the top performer with an accuracy of 98.86%. Following closely is Xception, exhibiting a strong accuracy of 97.33%. These models showcase robust capabilities in accurately classifying brain tumour images. On the other end of the spectrum, VGG16 trails with the lowest accuracy at 89.02%.
Keywords: brain tumour, MRI image, detecting and classifying tumour, pre-trained models, transfer learning, image segmentation, data augmentation
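The transfer-learning setup described can be sketched with a frozen pre-trained backbone plus a small classification head for the four classes. The snippet below is a minimal Keras illustration; the hyperparameters and data pipeline are placeholders, not the paper's exact configuration.

```python
# Minimal transfer-learning sketch for 4-class brain-MRI classification with a
# pre-trained ResNet50 backbone (illustrative settings only).
import tensorflow as tf
from tensorflow.keras import layers, models

base = tf.keras.applications.ResNet50(
    weights="imagenet", include_top=False, input_shape=(224, 224, 3))
base.trainable = False  # freeze the backbone first; fine-tune later

model = models.Sequential([
    base,
    layers.GlobalAveragePooling2D(),
    layers.Dropout(0.3),
    layers.Dense(4, activation="softmax"),  # meningioma/pituitary/glioma/none
])
model.compile(optimizer=tf.keras.optimizers.Adam(1e-4),
              loss="sparse_categorical_crossentropy", metrics=["accuracy"])
model.summary()
# model.fit(train_ds, validation_data=val_ds, epochs=10)  # datasets assumed
```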
Procedia PDF Downloads 74
24038 The Confounding Role of Graft-versus-Host Disease in Animal Models of Cancer Immunotherapy: A Systematic Review
Authors: Hami Ashraf, Mohammad Heydarnejad
Abstract:
Introduction: The landscape of cancer treatment has been revolutionized by immunotherapy, offering novel therapeutic avenues for diverse cancer types. Animal models play a pivotal role in the development and elucidation of these therapeutic modalities. Nevertheless, the manifestation of Graft-versus-Host Disease (GVHD) in such models poses significant challenges, muddling the interpretation of experimental data within the ambit of cancer immunotherapy. This study is dedicated to scrutinizing the role of GVHD as a confounding factor in animal models used for cancer immunotherapy, alongside proposing viable strategies to mitigate this complication. Method: Employing a systematic review framework, this study undertakes a comprehensive survey of the literature in academic journals indexed in the PubMed, Embase, and Web of Science databases, as well as conference proceedings, to collate pertinent research that delves into the impact of GVHD on animal models in cancer immunotherapy. The acquired studies undergo rigorous analysis and synthesis, aiming to assess the influence of GVHD on experimental results while identifying strategies to alleviate its confounding effects. Results: Findings indicate that GVHD incidence significantly skews the reliability and applicability of experimental outcomes, occasionally leading to erroneous interpretations. The literature surveyed also sheds light on various methodologies under exploration to counteract the GVHD dilemma, thereby bolstering experimental integrity in this domain. Conclusion: GVHD's presence critically affects both the interpretation and validity of experimental findings, underscoring the imperative for strategies to curtail its confounding impacts. Current research endeavors are oriented towards devising solutions to this issue, aiming to augment the dependability and pertinence of experimental results. It is incumbent upon researchers to diligently consider and adjust for GVHD's effects, thereby enhancing the translational potential of animal model findings to clinical applications and propelling progress in the arena of cancer immunotherapy.
Keywords: graft-versus-host disease, cancer immunotherapy, animal models, preclinical model
Procedia PDF Downloads 51
24037 The DAQ Debugger for iFDAQ of the COMPASS Experiment
Authors: Y. Bai, M. Bodlak, V. Frolov, S. Huber, V. Jary, I. Konorov, D. Levit, J. Novy, D. Steffen, O. Subrt, M. Virius
Abstract:
In general, state-of-the-art Data Acquisition Systems (DAQ) in high energy physics experiments must satisfy high requirements in terms of reliability, efficiency and data rate capability. This paper presents the development and deployment of a debugging tool named DAQ Debugger for the intelligent, FPGA-based Data Acquisition System (iFDAQ) of the COMPASS experiment at CERN. Utilizing a hardware event builder, the iFDAQ is designed to be able to read out data at the experiment's average maximum rate of 1.5 GB/s. In complex software such as the iFDAQ, with thousands of lines of code, the debugging process is absolutely essential to reveal all software issues. Unfortunately, conventional debugging of the iFDAQ is not possible during real data taking. The DAQ Debugger is a tool for identifying a problem, isolating its source, and then either correcting the problem or determining a way to work around it. It provides a layer for easy integration into any process and has no impact on process performance. Based on the handling of system signals, the DAQ Debugger represents an alternative to the conventional debuggers provided by most integrated development environments. Whenever a problem occurs, it generates reports containing all the information needed for deeper investigation and analysis. The DAQ Debugger was fully incorporated into all processes in the iFDAQ during the 2016 run. It helped to reveal remaining software issues and significantly improved the stability of the system in comparison with the previous run. In the paper, we present the DAQ Debugger from several perspectives and discuss it in detail.
Keywords: DAQ Debugger, data acquisition system, FPGA, system signals, Qt framework
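The core idea (hooking system signals so a long-running process emits a diagnostic report instead of dying silently) can be illustrated in a few lines. The DAQ Debugger itself is part of the iFDAQ codebase; the Python sketch below is only an analog of the technique, Unix-only, with an invented log file name.

```python
# Sketch of signal-based crash/diagnostic reporting in the spirit of the
# DAQ Debugger (illustrative analog only; the real tool is integrated in C++).
import faulthandler
import os
import signal
import traceback
from datetime import datetime

REPORT = open("crash_report.log", "a")
faulthandler.enable(file=REPORT)  # dump tracebacks on fatal signals (SIGSEGV, ...)

def report_handler(signum, frame):
    """Non-fatal diagnostic snapshot, triggered externally via SIGUSR1."""
    REPORT.write(f"\n--- report at {datetime.now().isoformat()} "
                 f"(signal {signum}) ---\n")
    traceback.print_stack(frame, file=REPORT)
    REPORT.flush()  # keep the report intact even if the process dies later

signal.signal(signal.SIGUSR1, report_handler)
print(f"send `kill -USR1 {os.getpid()}` to capture a snapshot")
signal.pause()  # stands in for the long-running DAQ process loop
```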
Procedia PDF Downloads 284
24036 Impact of the Hayne Royal Commission on the Operating Model of Australian Financial Advice Firms
Authors: Mohammad Abu-Taleb
Abstract:
The final report of the Royal Commission into Australian financial services misconduct, released in February 2019, has had a significant impact on the financial advice industry. The recommendations released in the Commissioner's final report include changes to ongoing fee arrangements, a new disciplinary system for financial advisers, and mandatory reporting of compliance concerns. This thesis aims to explore the impact of the Royal Commission's recommendations on the operating model of financial advice firms in terms of advice products, processes, delivery models, and customer segments. The research also seeks to investigate whether the Royal Commission's outcome has accelerated the use of enhanced technology solutions within the operating model of financial advice firms, and to identify the key challenges confronting financial advice firms whilst implementing the Commissioner's recommendations across their operating models. To achieve the objectives of this thesis, a qualitative research design was adopted through semi-structured in-depth interviews with 24 financial advisers and managers who are engaged in the operation of financial advice services. The study used the thematic analysis approach to interpret the qualitative data collected from the interviews. The findings of this thesis reveal that customer-centric operating models will become more prominent across the financial advice industry in response to the Commissioner's final report, and that the Royal Commission's outcome has accelerated the use of advice technology solutions within the operating model of financial advice firms. In addition, financial advice firms have begun, more than before, to use simpler and more automated web-based advice services, which enable financial advisers to provide simple advice at a greater scale and to accelerate the use of robo-advice models and digital delivery to mass customers in the long term. Furthermore, the study identifies process and technology changes, along with technical and interpersonal skills development, as the key challenges encountered by financial advice firms whilst implementing the Commissioner's recommendations across their operating models.
Keywords: hayne royal commission, financial planning advice, operating model, advice products, advice processes, delivery models, customer segments, digital advice solutions
Procedia PDF Downloads 88
24035 Designing the Maturity Model of Smart Digital Transformation through the Foundation Data Method
Authors: Mohammad Reza Fazeli
Abstract:
Nowadays, the fourth industrial revolution, known as the digital transformation of industries, is seen as one of the most significant subjects in the history of structural change, leading to the high-tech and strategic dominance of organizations. Despite these benefits, the undefined and non-transparent nature of the after-effects of investing in digital transformation has deterred many organizations from attempting this area of the industry. One of the important frameworks for understanding digital transformation in any organization is the digital transformation maturity model. This model comprises two main parts: digital transformation maturity dimensions and digital transformation maturity stages. Mediating factors between digital maturity and organizational performance at the individual level (e.g., motivations, attitudes) and at the organizational level (e.g., organizational culture) should be considered. For successful technology adoption processes, organizational development and human resources must go hand in hand and be supported by a sound communication strategy. Maturity models are developed to help organizations by providing broad guidance and a roadmap for improvement. However, a systematic review and analysis of the literature showed that none of the 18 maturity models in the field of digital transformation fully meets all the criteria of appropriateness, completeness, clarity, and objectivity. A maturity assessment framework potentially helps systematize assessment processes that create opportunities for change in processes and organizations enabled by digital initiatives, as well as long-term improvements at the project portfolio level. Cultural characteristics reflecting digital culture are not systematically integrated, and specific digital maturity models for the service sector are less clearly presented. It is also clearly evident that research on the maturity of digital transformation as a holistic concept is scarce and needs more attention in future research.
Keywords: digital transformation, organizational performance, maturity models, maturity assessment
Procedia PDF Downloads 107
24034 Empirical Study From Final Exams of Graduate Courses in Computer Science to Demystify the Notion of an Average Software Engineer and Offer a Direction to Address Diversity of Professional Backgrounds of a Student Body
Authors: Alex Elentukh
Abstract:
The paper is based on data collected from final exams administered during five years of teaching a graduate course in software engineering. A visualization instrument with four distinct personas has been used to improve the effectiveness of each class. The study offers a plethora of clues about students' behavioral preferences. Diversity among students (in professional background and physical proximity) is too significant to assume a single face of a learner. This is particularly true for a body of online graduate students in computer science. The conclusions of the study (each learner is unique, and each class is unique) are extrapolated to demystify the notion of an 'average software engineer.' An immediate direction for an educator is to ensure a course applies to a wide audience of very different individuals. On the other hand, a student should be clear about his or her abilities and preferences in order to follow the most effective learning path.
Keywords: K.3.2 computer and information science education, learner profiling, adaptive learning, software engineering
Procedia PDF Downloads 103
24033 Continuous Improvement of Teaching Quality through Course Evaluation by the Students
Authors: Valerie Follonier, Henrike Hamelmann, Jean-Michel Jullien
Abstract:
The Distance Learning University in Switzerland (UniDistance) offers bachelor's and master's courses as well as further education programs. The professors and their assistants work at traditional Swiss universities and give their courses at UniDistance following a blended learning and flipped classroom approach. A standardized course evaluation by the students has been established as a component of a quality improvement process. The students' feedback enables the stakeholders to identify areas of improvement, initiate professional development for the teaching teams and thus continuously augment the quality of instruction. This paper describes the evaluation process, the tools involved and how the approach involving all stakeholders helps form a culture of quality in teaching. Additionally, it presents the first evaluation results following the new process. Two software tools have been developed to support all stakeholders in the process of the semi-annual formative evaluation. The first tool is used to create the survey and assign it to the relevant courses and students. The second tool presents the results of the evaluation to the stakeholders, providing specific features for the teaching teams, the dean, the directorate and EDUDL+ (Educational development unit distance learning). The survey items were selected in accordance with the e-learning strategy of the institution and are formulated to support the professional development of the teaching teams. By reviewing the results, the teaching teams become aware of the opinion of the students and are asked to write a feedback report for the attention of their dean. The dean reviews the results of the faculty and writes a general report about the situation of the faculty and the improvements intended. Finally, EDUDL+ writes a final report summarising the evaluation results. A mechanism of adjustable warnings makes it possible to generate quality indicators for each module. These are summarised for each faculty, and globally for the whole institution, in order to increase the vigilance of those responsible. The quality process involves changing the indicators regularly to focus on different areas each semester, to facilitate the professional development of the teaching teams and to progressively augment the overall teaching quality of the institution.
Keywords: continuous improvement process, course evaluation, distance learning, software tools, teaching quality
Procedia PDF Downloads 259
24032 The Use of Network Tool for Brain Signal Data Analysis: A Case Study with Blind and Sighted Individuals
Authors: Cleiton Pons Ferreira, Diana Francisca Adamatti
Abstract:
Advances in computer technology have made it possible to obtain information for research in biology and neuroscience. To make sense of the data from these studies, networks have long been used to represent important biological processes, and the use of these tools has shifted from purely illustrative and didactic to more analytic, even including interaction analysis and hypothesis formulation. Many studies have involved this application, but not directly for the interpretation of data obtained from brain functions, calling for new development perspectives in neuroinformatics that draw on tool models already disseminated by bioinformatics. This study includes an analysis of neurological data from electroencephalogram (EEG) signals, using Cytoscape, an open-source software tool for visualizing complex networks in biological databases. The data were obtained from a comparative case study developed in research at the University of Rio Grande (FURG), using EEG signals from a Brain-Computer Interface (BCI) with 32 electrodes recorded from a blind and a sighted individual during the execution of an activity that stimulated spatial ability. This study intends to present results that lead to better ways of using and adapting techniques that support the treatment of brain signal data, in order to elevate understanding and learning in neuroscience.
Keywords: neuroinformatics, bioinformatics, network tools, brain mapping
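A common route from multi-channel EEG to a Cytoscape-ready network is to correlate the channels and keep only the strong couplings as edges. The sketch below follows that recipe with random stand-in data (the injected shared signal and the 0.5 threshold are arbitrary illustration choices, not the study's pipeline).

```python
# Sketch: build a channel-connectivity network from EEG data and export it in
# GraphML, a format Cytoscape can import (synthetic data, not the BCI study's).
import numpy as np
import networkx as nx

rng = np.random.default_rng(7)
eeg = rng.normal(size=(32, 5000))        # 32 channels x 5000 samples
common = rng.normal(size=5000)
eeg[:16] += 1.5 * common                 # inject a shared component into half
corr = np.corrcoef(eeg)                  # channel-by-channel correlation

G = nx.Graph()
G.add_nodes_from(range(32))
for i in range(32):
    for j in range(i + 1, 32):
        if abs(corr[i, j]) > 0.5:        # keep only strong couplings as edges
            G.add_edge(i, j, weight=float(corr[i, j]))

nx.write_graphml(G, "eeg_network.graphml")  # importable into Cytoscape
print(G.number_of_nodes(), "nodes,", G.number_of_edges(), "edges")
```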
Procedia PDF Downloads 182
24031 Logical-Probabilistic Modeling of the Reliability of Complex Systems
Authors: Sergo Tsiramua, Sulkhan Sulkhanishvili, Elisabed Asabashvili, Lazare Kvirtia
Abstract:
The paper presents logical-probabilistic methods, models and algorithms for the reliability assessment of complex systems, based on which a web application for structural analysis and reliability assessment of systems was created. The reliability assessment process includes the following stages, which are reflected in the application: 1) construction of a graphical scheme of the structural reliability of the system; 2) transformation of the graphical scheme into a logical representation and modeling of the shortest paths of successful functioning of the system; 3) description of the system operability condition with a logical function in disjunctive normal form (DNF); 4) transformation of the DNF into orthogonal disjunctive normal form (ODNF) using the orthogonalization algorithm; 5) replacement of logical elements with probabilistic elements in the ODNF, obtaining a reliability estimation polynomial and quantifying reliability; 6) calculation of the weights of elements. Using the logical-probabilistic methods, models and algorithms discussed in the paper, special software was created, by means of which a quantitative assessment of the reliability of systems with a complex structure is produced. As a result, structural analysis of systems and the research and design of optimally structured systems are carried out.
Keywords: complex systems, logical-probabilistic methods, orthogonalization algorithm, reliability, weight of element
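Stage 5 ends in a polynomial giving the probability that at least one shortest path of successful functioning works. For independent elements, the same number can be obtained by inclusion-exclusion over the minimal path sets; the sketch below does this for a hypothetical two-path system (it mirrors the result of the ODNF route, not the paper's own code).

```python
# Sketch: quantify system reliability from minimal path sets by
# inclusion-exclusion (independent elements assumed; example is hypothetical).
from itertools import combinations

def system_reliability(path_sets: list, p: dict) -> float:
    """P(at least one minimal path works)."""
    total = 0.0
    for k in range(1, len(path_sets) + 1):
        for combo in combinations(path_sets, k):
            union = frozenset().union(*combo)
            prob = 1.0
            for elem in union:
                prob *= p[elem]              # every element in the union works
            total += (-1) ** (k + 1) * prob  # inclusion-exclusion sign
    return total

# Example system with two minimal paths {A,B} and {C,D}.
paths = [frozenset("AB"), frozenset("CD")]
probs = {"A": 0.9, "B": 0.9, "C": 0.8, "D": 0.8}
print(round(system_reliability(paths, probs), 4))  # 0.81 + 0.64 - 0.5184 = 0.9316
```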
Procedia PDF Downloads 73
24030 Development of a Wind Resource Assessment Framework Using Weather Research and Forecasting (WRF) Model, Python Scripting and Geographic Information Systems
Authors: Jerome T. Tolentino, Ma. Victoria Rejuso, Jara Kaye Villanueva, Loureal Camille Inocencio, Ma. Rosario Concepcion O. Ang
Abstract:
Wind energy is rapidly emerging as the primary source of electricity in the Philippines, although developing an accurate wind resource model is difficult. In this study, the Weather Research and Forecasting (WRF) Model, an open-source mesoscale Numerical Weather Prediction (NWP) model, was used to produce a one-year atmospheric simulation with 4 km resolution over the Ilocos Region of the Philippines. The annual mean wind speed data were extracted from the WRF output (netCDF) using a Python-based graphical user interface. Lastly, the wind resource assessment was produced using GIS software. The results of the study showed that Python scripts are more flexible than other post-processing tools in dealing with netCDF files. Using the WRF Model, Python, and Geographic Information Systems, a reliable wind resource map is produced.
Keywords: wind resource assessment, weather research and forecasting (WRF) model, python, GIS software
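Extracting the annual mean wind speed from a WRF netCDF file takes only a few lines of Python. The sketch below assumes WRF's standard 10 m wind components U10 and V10 and an invented file name; the study's GUI wraps logic of this kind.

```python
# Minimal sketch of extracting annual mean 10 m wind speed from a WRF netCDF
# output (file name is a placeholder; confirm variable names in your output).
import numpy as np
from netCDF4 import Dataset

with Dataset("wrfout_d01_annual.nc") as ds:
    u10 = ds.variables["U10"][:]          # (time, south_north, west_east)
    v10 = ds.variables["V10"][:]
    speed = np.sqrt(u10 ** 2 + v10 ** 2)  # wind speed at each grid cell
    mean_speed = speed.mean(axis=0)       # average over the time dimension

print("grid shape:", mean_speed.shape)
print("domain-wide annual mean wind speed: %.2f m/s" % mean_speed.mean())
# mean_speed can now be exported (e.g., as GeoTIFF) for mapping in GIS software.
```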
Procedia PDF Downloads 442
24029 Performance Demonstration of Extendable NSPO Space-Borne GPS Receiver
Authors: Hung-Yuan Chang, Wen-Lung Chiang, Kuo-Liang Wu, Chen-Tsung Lin
Abstract:
National Space Organization (NSPO) completed the development of a space-borne GPS receiver in 2014, including design, manufacture, comprehensive functional testing, environmental qualification testing and so on. The receiver's main performance figures include 8 m positioning accuracy, 0.05 m/s velocity accuracy, a cold-start time of at most 90 seconds, and operation in high-dynamics scenarios of up to 15 g. The receiver will be integrated into the autonomous NSPO-built FORMOSAT-7 satellite, scheduled to be launched in 2019 to execute pre-defined scientific missions. The flight model of this receiver, manufactured in early 2015, will undergo comprehensive functional tests and environmental acceptance tests, etc., which are expected to be completed by the end of 2015. The space-borne GPS receiver is a pure software design in which all GPS baseband signal processing is executed by a digital signal processor (DSP), of which currently only 50% of the throughput is used. In response to the booming global navigation satellite systems, NSPO will gradually expand this receiver into a multi-mode, multi-band, high-precision navigation receiver, and even a science payload, such as the reflectometry receiver of a global navigation satellite system. The fundamental purpose of this extension study is to port software algorithms with heavy computational load, such as signal acquisition and correlation, and reused code to the FPGA, leaving the processor responsible for operational control, the navigation solution, orbit propagation and so on. Because FPGA technology is developing and evolving rapidly, the new FPGA-based system architecture should be able to achieve the goal of being a multi-mode, multi-band, high-precision navigation receiver or a scientific receiver. Finally, the test results show that the new system architecture not only retains the original overall performance but also sets aside more resources for future expansion. This paper explains the detailed DSP/FPGA architecture, development, test results, and the goals of the next development stage of this receiver.
Keywords: space-borne, GPS receiver, DSP, FPGA, multi-mode multi-band
Procedia PDF Downloads 369
24028 The Planner's Pentangle: A Proposal for a 21st-Century Model of Planning for Sustainable Development
Authors: Sonia Hirt
Abstract:
The Planner's Triangle, an oft-cited model that visually defined planning as the search for sustainability balancing the three basic priorities of equity, economy, and environment, has influenced planning theory and practice for a quarter of a century. In this essay, we argue that the triangle requires updating and expansion. Even if planners keep sustainability as their core aspiration at the center of their imaginary geometry, the triangle's vertices have to be rethought. Planners should move on to a 21st-century concept. We propose a Planner's Pentangle with five basic priorities as the vertices of a new conceptual polygon. These five priorities are Wellbeing, Equity, Economy, Environment, and Esthetics (WE⁴). The WE⁴ concept more accurately and fully represents planning's history. This is especially true in the United States, where public art and public health played pivotal roles in the establishment of the profession in the late 19th and early 20th centuries. It also more accurately represents planning's future. Both health/wellness and aesthetic concerns are becoming increasingly important in the 21st century. The pentangle can become an effective tool for understanding and visualizing planning's history and present. Planning has a long history of representing urban presents and futures as conceptual models in visual form. Such models can play an important role in understanding and shaping practice. For over two decades, one such model, the Planner's Triangle, stood apart as the expression of planning's pursuit of sustainability. But if the model is outdated and insufficiently robust, it can diminish our understanding of planning practice, as well as the appreciation of the profession among non-planners. Thus, we argue for a new conceptual model of what planners do.
Keywords: sustainable development, planning for sustainable development, planner's triangle, planner's pentangle, planning and health, planning and art, planning history
Procedia PDF Downloads 141
24027 Low Cost LiDAR-GNSS-UAV Technology Development for PT Garam's Three Dimensional Stockpile Modeling Needs
Authors: Mohkammad Nur Cahyadi, Imam Wahyu Farid, Ronny Mardianto, Agung Budi Cahyono, Eko Yuli Handoko, Daud Wahyu Imani, Arizal Bawazir, Luki Adi Triawan
Abstract:
Unmanned aerial vehicle (UAV) technology offers advantages in cost efficiency and data retrieval time. Technologies such as UAV, GNSS, and LiDAR can be combined into a single system in which each covers the others' deficiencies. This integrated system aims to increase the accuracy of calculating the volume of the salt stockpiles of PT Garam (a salt company). UAV applications are used to obtain geometric data and capture textures that characterize the structure of objects. This study uses the Taror 650 Iron Man drone with four propellers, which can fly for 15 minutes. The imagery can be classified based on the number of image acquisitions processed in the software, utilizing photogrammetry and Structure-from-Motion point cloud principles. LiDAR can perform data acquisition that enables the creation of point clouds, three-dimensional models, digital surface models, contours, and orthomosaics with high accuracy. One drawback of LiDAR is that its coordinate data are given in a local reference frame. Therefore, the researchers use GNSS, LiDAR, and drone multi-sensor technology to map the salt stockpiles on open land and in warehouses, a survey carried out by PT Garam twice a year, where the previous process used terrestrial methods and manual calculations with sacks. LiDAR needs to be combined with a UAV to overcome data acquisition limitations, because on its own it only passes along the right and left sides of the object, especially when applied to a salt stockpile. The UAV, carrying the integrated 200-gram LiDAR system, is flown to assist data acquisition with wide coverage, so that the flight angle can be optimal during the flight process. Using LiDAR for low-cost mapping surveys makes it easier for surveyors and academics to obtain fairly accurate data at a more economical price. As a survey tool, LiDAR is available at a low price, around 999 USD, and this device can produce detailed data. Therefore, to minimize the operational costs of using LiDAR, surveyors can use low-cost LiDAR, GNSS, and UAV at a price of around 638 USD. The data generated by this sensor take the form of a three-dimensional visualization of an object's shape. This study aims to combine low-cost GPS measurements with low-cost LiDAR, processed using free software. The low-cost GPS generates latitude and longitude coordinates, yielding X, Y, and Z values that help georeference the detected object. The LiDAR data also capture objects, including the heights of the entire environment at that location. The resulting data are calibrated with pitch, roll, and yaw to obtain the vertical heights of the existing contours. The study conducted an experimental process on the roof of a building within a radius of approximately 30 meters.
Keywords: LiDAR, unmanned aerial vehicle, low-cost GNSS, contour
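The end product, stockpile volume from a georeferenced point cloud, is commonly computed by rasterizing the points into a height grid and summing cell area times height above a base plane. The sketch below applies that recipe to a synthetic cone (cell size, base plane, and data are invented; a real survey would also filter outliers and use a measured base surface).

```python
# Sketch: stockpile volume from a point cloud via a gridded height model
# (synthetic cone data; illustrative only, not PT Garam's survey pipeline).
import numpy as np

def stockpile_volume(points, cell=0.25, base_z=0.0):
    """points: (N, 3) array of x, y, z in meters; returns volume in m^3."""
    x, y, z = points.T
    ix = ((x - x.min()) / cell).astype(int)
    iy = ((y - y.min()) / cell).astype(int)
    dem = np.full((ix.max() + 1, iy.max() + 1), -np.inf)
    np.maximum.at(dem, (ix, iy), z)            # keep the highest return per cell
    heights = np.clip(dem - base_z, 0, None)   # empty cells (-inf) clip to zero
    return float(heights.sum() * cell * cell)

# Synthetic cone-shaped pile: radius 5 m, height 3 m.
rng = np.random.default_rng(3)
r = 5 * np.sqrt(rng.uniform(0, 1, 50000))
theta = rng.uniform(0, 2 * np.pi, 50000)
pts = np.column_stack([r * np.cos(theta), r * np.sin(theta), 3 * (1 - r / 5)])
print("grid estimate: %.1f m^3, cone formula: %.1f m^3"
      % (stockpile_volume(pts), np.pi * 5**2 * 3 / 3))
```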
Procedia PDF Downloads 95
24026 The School Based Support Program: An Evaluation of a Comprehensive School Reform Initiative in the State of Qatar
Authors: Abdullah Abu-Tineh, Youmen Chaaban
Abstract:
This study examines the development of a professional development (PD) model for teacher growth and learning that is embedded in the school context. The School-Based Support Program (SBSP), designed for the Qatari context, targets the practices, knowledge and skills of both school leadership and teachers in an attempt to improve student learning outcomes. Key aspects of the model include the development of learning communities among teachers, strong leadership that supports school improvement activities, and the use of research-based PD to improve teacher practices and student achievement. This paper further presents findings from an evaluation of the PD program. Based on an adaptation of Guskey's evaluation of PD models, 100 teachers at the participating schools were selected for classroom observations, and 40 took part in in-depth interviews to examine changed classroom practices. The impact of the PD program on student learning was also examined: teachers' practices and their students' achievement in English, Arabic, mathematics and science were measured at the beginning and at the end of the intervention.
Keywords: initiative, professional development, school-based support program (SBSP), school reform
Procedia PDF Downloads 496
24025 Induction Heating Process Design Using Comsol® Multiphysics Software Version 4.2a
Authors: K. Djellabi, M. E. H. Latreche
Abstract:
Induction heating computer simulation is a powerful tool for process design and optimization, induction coil design, equipment selection, as well as education and business presentations. The authors share their extensive experience in the practical use of computer simulation for different induction heating and heat treating processes. This paper deals with the mathematical modeling and numerical simulation of induction heating furnaces with axisymmetric geometries. For the numerical solution, we propose finite element methods (FEM) combined with boundary element techniques for the electromagnetic model, using COMSOL® Multiphysics software. Some numerical results for an industrial furnace operating at high frequency are shown.
Keywords: numerical methods, induction furnaces, induction heating, finite element method, Comsol multiphysics software
Procedia PDF Downloads 450
24024 Evaluation and Compression of Different Language Transformer Models for Semantic Textual Similarity Binary Task Using Minority Language Resources
Authors: Ma. Gracia Corazon Cayanan, Kai Yuen Cheong, Li Sha
Abstract:
Training a language model for a minority language has been a challenging task. The lack of available corpora with which to train and fine-tune state-of-the-art language models is still a challenge in the area of Natural Language Processing (NLP). Moreover, the need for high computational resources and bulk data limits the attainment of this task. In this paper, we present the following contributions: (1) we introduce and use a translation pair set of Tagalog and English (TL-EN) to pre-train a language model for a minority language resource; (2) we fine-tune and evaluate top-ranking pre-trained semantic textual similarity binary task (STSB) models on both TL-EN and STS dataset pairs; (3) we reduce the size of the model to offset the need for high computational resources. Based on our results, the models that were pre-trained on translation pairs and STS pairs perform well on the STSB task. Also, reducing a model to a smaller dimension has no negative effect on its performance, but rather yields a notable increase in similarity scores. Moreover, pre-training on a similar dataset has a tremendous effect on a model's performance scores.
Keywords: semantic matching, semantic textual similarity binary task, low resource minority language, fine-tuning, dimension reduction, transformer models
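Scoring a Tagalog-English pair for the binary similarity task reduces to encoding both sentences and thresholding their cosine similarity. The sketch below uses the sentence-transformers library with a generic multilingual checkpoint as a stand-in for the paper's TL-EN pre-trained model; the 0.5 cut-off is an arbitrary illustration choice.

```python
# Sketch of an STS binary-task scoring loop with sentence-transformers
# (generic public checkpoint, not the paper's fine-tuned model).
from sentence_transformers import SentenceTransformer, util

model = SentenceTransformer("paraphrase-multilingual-MiniLM-L12-v2")

pairs = [
    ("Kumain na ako.", "I have already eaten."),
    ("Kumain na ako.", "The weather is cold today."),
]
for tl, en in pairs:
    emb = model.encode([tl, en], convert_to_tensor=True)
    score = util.cos_sim(emb[0], emb[1]).item()
    label = "similar" if score > 0.5 else "not similar"  # binary decision
    print(f"{score:.3f}  {label}: {tl!r} / {en!r}")
```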
Procedia PDF Downloads 211
24023 A Comparative Analysis of ARIMA and Threshold Autoregressive Models on Exchange Rate
Authors: Diteboho Xaba, Kolentino Mpeta, Tlotliso Qejoe
Abstract:
This paper assesses the in-sample forecasting of South African exchange rates, comparing a linear ARIMA model with a SETAR model. The study uses monthly adjusted South African exchange rate data with 420 observations. The Akaike information criterion (AIC) and the Schwarz information criterion (SIC) are used for model selection. Mean absolute error (MAE), root mean squared error (RMSE) and mean absolute percentage error (MAPE) are the error metrics used to evaluate the forecasting capability of the models. The Diebold-Mariano (DM) test is employed to check forecast accuracy and distinguish the forecasting performance of the two models (ARIMA and SETAR). The results indicate that both models perform well when modelling and forecasting the exchange rates, but SETAR seems to outperform ARIMA.
Keywords: ARIMA, error metrics, model selection, SETAR
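The three error metrics are one-liners, and an ARIMA in-sample fit is available off the shelf. The sketch below uses a synthetic random-walk series as a stand-in for the 420 monthly observations (the (1,1,1) order is an arbitrary choice; a SETAR fit would need a threshold-AR implementation and is omitted).

```python
# Sketch: in-sample ARIMA fit plus the MAE/RMSE/MAPE metrics from the abstract
# (synthetic data, illustrative only).
import numpy as np
from statsmodels.tsa.arima.model import ARIMA

def mae(y, f):  return np.mean(np.abs(y - f))
def rmse(y, f): return np.sqrt(np.mean((y - f) ** 2))
def mape(y, f): return 100 * np.mean(np.abs((y - f) / y))

rng = np.random.default_rng(0)
y = 15 + np.cumsum(rng.normal(0, 0.1, 420))    # random-walk-like series

fit = ARIMA(y, order=(1, 1, 1)).fit()
pred = fit.predict(start=1, end=len(y) - 1)    # in-sample one-step forecasts
actual = y[1:]
print(f"AIC={fit.aic:.1f}  MAE={mae(actual, pred):.4f}  "
      f"RMSE={rmse(actual, pred):.4f}  MAPE={mape(actual, pred):.3f}%")
```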
Procedia PDF Downloads 244
24022 Towards A New Maturity Model for Information System
Authors: Ossama Matrane
Abstract:
The information system has become a strategic lever for enterprises: it contributes effectively to aligning business processes with enterprise strategies and is regarded as a source of gains in productivity and effectiveness. Many organizations are therefore currently involved in implementing sustainable information systems, and a large number of studies have been conducted over the last decade to define the success factors of information systems. Thus, many studies on maturity models have been carried out, some of which refer to the maturity model of the information system. In this article, we report on the development of a maturity model specifically designed for the information system. This model is built on three components derived from the Maturity Model for Information Security Management, the OPM3 project management maturity model, and the COBIT processes for IT governance. Our proposed model defines three maturity stages for building a strong information system that supports the objectives of organizations. It provides a very practical structure with which to assess and improve information system implementation.
Keywords: information system, maturity models, information security management, OPM3, IT governance
Procedia PDF Downloads 447
24021 Toward Automatic Chest CT Image Segmentation
Authors: Angely Sim Jia Wun, Sasa Arsovski
Abstract:
Numerous studies have been conducted on the segmentation of medical images, and segmenting the lungs is one of the common research topics among them. Our research stemmed from the lack of solutions for automatic bone, airway, and vessel segmentation, despite the existence of multiple lung segmentation techniques. Consequently, currently available software tools for medical image segmentation do not provide automatic lung, bone, airway, and vessel segmentation. This paper presents segmentation techniques along with an interactive software tool architecture for segmenting bone, lung, airway, and vessel tissues. Additionally, we propose a method for creating binary masks from automatically generated segments. The key contribution of our approach is the technique for automatic image thresholding using adjustable Hounsfield values and binary mask extraction. The generated binary masks can be successfully used as a training dataset for deep-learning solutions in medical image segmentation. In this paper, we also examine the current software tools used for medical image segmentation, discuss our approach, and identify its advantages.
Keywords: lung segmentation, binary masks, U-Net, medical software tools
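Thresholding a CT volume on Hounsfield-unit windows and keeping the result as a binary mask is the core of the approach described. The sketch below shows the idea on a random stand-in volume; the HU windows are typical textbook ranges assumed for illustration, and the paper's adjustable values may differ.

```python
# Sketch: Hounsfield-unit thresholding to produce binary masks from a CT
# volume (random stand-in data; HU windows are assumed textbook ranges).
import numpy as np

def hu_mask(ct_volume: np.ndarray, lo: float, hi: float) -> np.ndarray:
    """Return a binary mask of voxels whose HU value lies in [lo, hi]."""
    return (ct_volume >= lo) & (ct_volume <= hi)

ct = np.random.default_rng(5).integers(-1024, 2000, size=(32, 256, 256))

masks = {
    "lung": hu_mask(ct, -950, -500),   # air-filled lung parenchyma
    "bone": hu_mask(ct, 300, 1900),    # cortical/trabecular bone
    "vessel": hu_mask(ct, 0, 100),     # soft tissue / contrast-enhanced vessels
}
for name, m in masks.items():
    print(f"{name}: {m.mean():.1%} of voxels")  # masks can be saved for training
```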
Procedia PDF Downloads 98
24020 Comparative Study of Ecological City Criteria in Traditional Iranian Cities
Authors: Zahra Yazdani Paraii, Zohreh Yazdani Paraei
Abstract:
Many urban designers and planners have been involved in the design of environmentally friendly or nature-adaptable urban development models due to the increase in urban populations over the recent century, limited natural resources, climate change, and shortages of water and food. The ecological city is one of the latest models proposed to accomplish this goal. In this work, the established indicators of the ecological city regarding energy, water, land use and transportation are used to assess the functioning of traditional Iranian settlements. The results of the investigation show that the specifications and functions of the traditional settlements of Iran fit well into the ecological city model. It is found that the inhabitants of the old cities and villages in Iran had founded ecological cities based on their knowledge of the environment and its natural opportunities and limitations.
Keywords: ecological city, traditional city, urban design, environment
Procedia PDF Downloads 253
24019 Analysis of Multilayer Neural Network Modeling and Long Short-Term Memory
Authors: Danilo López, Nelson Vera, Luis Pedraza
Abstract:
This paper analyzes fundamental ideas and concepts related to neural networks, providing the reader with a theoretical explanation of the operation of Long Short-Term Memory (LSTM) networks, which are classified as deep learning systems, and explicitly presenting the mathematical development of the backward-pass equations of the LSTM network model. This mathematical modeling, together with the associated software development, will provide the necessary tools to develop an intelligent system capable of predicting the behavior of licensed users in wireless cognitive radio networks.
Keywords: neural networks, multilayer perceptron, long short-term memory, recurrent neural network, mathematical analysis
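The backward-pass derivation starts from the standard LSTM gate equations, which are compact enough to state in code. Below is a minimal numpy forward step using those equations (random weights, purely illustrative; the paper derives the corresponding gradients analytically).

```python
# Minimal numpy forward pass of a single LSTM cell, following the standard
# gate equations that the backward-pass derivation starts from.
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def lstm_step(x, h_prev, c_prev, W, U, b):
    """One time step. W: (4H, D), U: (4H, H), b: (4H,); gates stacked i,f,o,g."""
    H = h_prev.size
    z = W @ x + U @ h_prev + b
    i = sigmoid(z[0:H])            # input gate
    f = sigmoid(z[H:2*H])          # forget gate
    o = sigmoid(z[2*H:3*H])        # output gate
    g = np.tanh(z[3*H:4*H])        # candidate cell state
    c = f * c_prev + i * g         # new cell state
    h = o * np.tanh(c)             # new hidden state
    return h, c

rng = np.random.default_rng(0)
D, H = 8, 16
W, U, b = rng.normal(size=(4*H, D)), rng.normal(size=(4*H, H)), np.zeros(4*H)
h = c = np.zeros(H)
for t in range(5):                 # unroll over a short input sequence
    h, c = lstm_step(rng.normal(size=D), h, c, W, U, b)
print("h[:4] =", h[:4])
```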
Procedia PDF Downloads 420