Search results for: continuous data
24190 R Data Science for Technology Management
Authors: Sunghae Jun
Abstract:
Technology management (TM) is an important issue for a company seeking to improve its competitiveness. Among the many activities of TM, technology analysis (TA) is an important factor, because most decisions for the management of technology are made on the basis of TA results. TA analyzes the developed results of a target technology using statistics or the Delphi method. TA based on Delphi depends on experts' domain knowledge; in comparison, TA by statistics and machine learning algorithms uses objective data such as patents or papers instead of expert knowledge. Many quantitative TA methods based on statistics and machine learning have been studied, and these have been used for technology forecasting, technological innovation, and management of technology. They applied diverse computing tools and analytical methods case by case, so it is not easy to select suitable software and a suitable statistical method for a given TA task. In this paper, we therefore propose a methodology for quantitative TA using the statistical computing software R and data science to construct a general framework of TA. Through a case study, we also show how our methodology is applied in the field. This research contributes to R&D planning and technology valuation in TM areas.
Keywords: technology management, R system, R data science, statistics, machine learning
Procedia PDF Downloads 458
24189 Mixture Statistical Modeling for Predicting Mortality in Human Immunodeficiency Virus (HIV) and Tuberculosis (TB) Infection Patients
Authors: Mohd Asrul Affendi Bi Abdullah, Nyi Nyi Naing
Abstract:
The purpose of this study was to compare the negative binomial death rate (NBDR) and zero-inflated negative binomial death rate (ZINBDR) models for patients who died with (HIV+TB+) and (HIV+TB−). HIV and TB are serious worldwide problems in developing countries. The data were analyzed by applying NBDR and ZINBDR to determine which model is preferable. The ZINBDR model is able to account for the disproportionately large number of zeros within the data and is shown to be a consistently better fit than the NBDR model. Hence, the ZINBDR model is a superior fit to the data and provides additional information regarding the death mechanisms of HIV+TB patients. The ZINBDR model is shown to be a useful tool for analyzing death rates by age category.
Keywords: zero inflated negative binomial death rate, HIV and TB, AIC and BIC, death rate
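A minimal sketch, not the authors' code, of how an NB vs. zero-inflated NB comparison of death counts could be run and judged by AIC/BIC (the criteria named in the keywords). The input file and column names (deaths, age_group, tb_status) are assumptions for illustration.

```python
import pandas as pd
import statsmodels.api as sm
from statsmodels.discrete.count_model import ZeroInflatedNegativeBinomialP

df = pd.read_csv("hiv_tb_mortality.csv")  # hypothetical dataset
X = sm.add_constant(
    pd.get_dummies(df[["age_group", "tb_status"]], drop_first=True).astype(float)
)
y = df["deaths"]

nb = sm.NegativeBinomial(y, X).fit(disp=False)                    # NBDR analogue
zinb = ZeroInflatedNegativeBinomialP(y, X, exog_infl=X).fit(
    disp=False, maxiter=200
)                                                                 # ZINBDR analogue

# Lower AIC/BIC indicates the better-fitting model, mirroring the comparison above.
print(f"NB:   AIC={nb.aic:.1f}  BIC={nb.bic:.1f}")
print(f"ZINB: AIC={zinb.aic:.1f}  BIC={zinb.bic:.1f}")
```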
Procedia PDF Downloads 433
24188 Efficient Reuse of Exome Sequencing Data for Copy Number Variation Calling
Authors: Chen Wang, Jared Evans, Yan Asmann
Abstract:
With the quick evolution of next-generation sequencing techniques, whole-exome or exome-panel data have become a cost-effective way to detect small exonic mutations, but there has been a growing desire to accurately detect copy number variations (CNVs) as well. To address these research and clinical needs, we developed a sequencing coverage pattern-based method for copy number detection, data integrity checks, CNV calling, and visualization reporting. The developed methodology includes complete automation to increase usability, genome content-coverage bias correction, CNV segmentation, data quality reports, and publication-quality images. Poor-quality outlier samples are identified and removed automatically. Multiple experimental batches are routinely detected and reduced to a clean subset of samples before analysis. Algorithm improvements were also made to improve both somatic CNV detection and germline CNV detection in trio families. Additionally, a set of utilities is included to help users produce CNV plots for genes of interest. We demonstrate the somatic CNV enhancements by accurately detecting CNVs in exome-wide data from The Cancer Genome Atlas cancer samples and in a lymphoma case study with paired tumor and normal samples. We also show efficient reuse of existing exome sequencing data for improved germline CNV calling in a trio family from phase III of the 1000 Genomes Project, detecting CNVs with various modes of inheritance. The performance of the developed method is evaluated by comparing CNV calling results with results from orthogonal copy number platforms. Through our case studies, reusing exome sequencing data for calling CNVs offers several noticeable benefits, including better quality control for exome sequencing data, improved joint analysis with single nucleotide variant calls, and novel genomic discovery from under-utilized existing whole-exome and custom exome-panel data.
Keywords: bioinformatics, computational genetics, copy number variations, data reuse, exome sequencing, next generation sequencing
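As a hedged illustration of the coverage-pattern idea (a generic sketch, not the authors' pipeline): per-exon read depth can be normalized against a reference panel and the log2 ratios thresholded; real pipelines add segmentation and bias correction on top of this.

```python
import numpy as np

def log2_ratios(sample_depth: np.ndarray, panel_depth: np.ndarray) -> np.ndarray:
    """Per-exon log2(sample / panel median), after library-size normalization."""
    sample = sample_depth / sample_depth.sum()
    panel = np.median(panel_depth / panel_depth.sum(axis=1, keepdims=True), axis=0)
    return np.log2((sample + 1e-9) / (panel + 1e-9))

def call_cnv(ratios: np.ndarray, gain=0.4, loss=-0.6) -> list[str]:
    """Naive per-exon calls; real tools segment (e.g., CBS) before calling."""
    return ["gain" if r > gain else "loss" if r < loss else "neutral" for r in ratios]

depths = np.random.poisson(100, size=(20, 500))  # 20 reference exomes x 500 exons (toy data)
sample = depths[0].copy()
sample[100:120] *= 2                             # simulate a duplication
print(call_cnv(log2_ratios(sample, depths[1:]))[95:125])
```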
Procedia PDF Downloads 257
24187 [Keynote]: No-Trust-Zone Architecture for Securing Supervisory Control and Data Acquisition
Authors: Michael Okeke, Andrew Blyth
Abstract:
Supervisory Control and Data Acquisition (SCADA) systems, the state of the art in Industrial Control Systems (ICS), are used in many different critical infrastructures, from smart homes to energy systems and from locomotive train systems to planes. Security of SCADA systems is vital, since many lives depend on them for daily activities, and deviation from normal operation could be disastrous to the environment as well as to lives. This paper describes how a No-Trust-Zone (NTZ) architecture could be incorporated into SCADA systems in order to reduce the chances of malicious intent. The architecture is made up of two distinctive parts. The first consists of the field devices, such as sensors, PLCs, pumps, and actuators. The second part is designed following the lambda architecture and is made up of a detection algorithm based on Particle Swarm Optimization (PSO) and the Hadoop framework for data processing and storage. Apache Spark is part of the lambda architecture for real-time analysis of packets for anomaly detection.
Keywords: industrial control system (ICS), no-trust-zone (NTZ), particle swarm optimisation (PSO), supervisory control and data acquisition (SCADA), swarm intelligence (SI)
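The abstract names PSO as the core of the detection algorithm. Below is a minimal PSO loop of the kind such a detector could build on, tuning two detector parameters against a placeholder loss; the objective function and parameter meanings are assumptions, not the paper's.

```python
import numpy as np

def loss(params: np.ndarray) -> float:
    """Stand-in objective; a real detector would score labeled SCADA traffic."""
    return float(np.sum((params - np.array([0.3, 0.7])) ** 2))

rng = np.random.default_rng(0)
pos = rng.uniform(0, 1, (30, 2))                 # 30 particles, 2 parameters
vel = np.zeros_like(pos)
pbest, pbest_val = pos.copy(), np.array([loss(p) for p in pos])
gbest = pbest[pbest_val.argmin()]

for _ in range(100):
    r1, r2 = rng.random(pos.shape), rng.random(pos.shape)
    vel = 0.7 * vel + 1.5 * r1 * (pbest - pos) + 1.5 * r2 * (gbest - pos)
    pos = np.clip(pos + vel, 0, 1)
    vals = np.array([loss(p) for p in pos])
    improved = vals < pbest_val
    pbest[improved], pbest_val[improved] = pos[improved], vals[improved]
    gbest = pbest[pbest_val.argmin()]

print("best parameters:", gbest)                 # converges near [0.3, 0.7]
```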
Procedia PDF Downloads 345
24186 Implicit Transaction Costs and the Fundamental Theorems of Asset Pricing
Authors: Erindi Allaj
Abstract:
This paper studies arbitrage pricing theory in financial markets with transaction costs. We extend the existing theory to include the more realistic possibility that the price at which investors trade depends on the traded volume. The investors in the market always buy at the ask and sell at the bid price. Transaction costs are composed of two terms: one captures the implicit transaction costs and the other the price impact. Moreover, a new definition of a self-financing portfolio is obtained. The self-financing condition suggests that continuous trading is possible but is restricted to predictable trading strategies which have left and right limits and finite quadratic variation. That is, predictable trading strategies of infinite variation but finite quadratic variation are allowed in our setting. Within this framework, the existence of an equivalent probability measure is equivalent to the absence of arbitrage opportunities, so that the first fundamental theorem of asset pricing (FFTAP) holds. It is also proved that, when this probability measure is unique, any contingent claim in the market is hedgeable in an L2-sense. The price of any contingent claim is equal to the risk-neutral price. To better understand how to apply the proposed theory, we provide an example with linear transaction costs.
Keywords: arbitrage pricing theory, transaction costs, fundamental theorems of arbitrage, financial markets
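As an illustrative aside only (a generic textbook-style form, not the paper's actual definitions), a self-financing wealth dynamic with a bid-ask (implicit) cost term and a volume-dependent price-impact term might be sketched as:

```latex
% Illustrative only: generic wealth dynamics with proportional spread cost
% and quadratic price-impact cost; not the paper's formulation.
\[
  \mathrm{d}V_t \;=\; \theta_t\,\mathrm{d}S_t
  \;-\; \underbrace{\tfrac{1}{2}\,\delta_t\,\lvert \mathrm{d}\theta_t \rvert}_{\text{implicit (bid-ask) cost}}
  \;-\; \underbrace{\lambda\,(\mathrm{d}\theta_t)^2}_{\text{price impact}},
\]
```

where $\theta_t$ is the holding, $S_t$ the mid price, $\delta_t$ the spread, and $\lambda$ the impact coefficient. A quadratic term of this kind suggests why the admissible strategies would need finite quadratic variation.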
Procedia PDF Downloads 361
24185 A Study on the Correlation Analysis between the Pre-Sale Competition Rate and the Apartment Unit Plan Factor through Machine Learning
Authors: Seongjun Kim, Jinwooung Kim, Sung-Ah Kim
Abstract:
The development of information and communication technology also affects human cognition and thinking; in the field of design especially, new techniques are being tried. In architecture, new design methodologies such as machine learning and data-driven design are being applied, in particular for analyzing factors related to the value of real estate or assessing feasibility in the early planning stage of apartment housing. However, since the value of apartment buildings is often determined by external factors such as location and traffic conditions rather than by the interior elements of the buildings, data are rarely used in the design process. Therefore, although the technical conditions are available, it is difficult to apply data-driven design to the internal elements of the apartment during the design process. As a result, designers of apartment housing have been forced to rely on designer experience or modular design alternatives rather than data-driven design at the design stage, resulting in a uniform arrangement of space in apartment housing. The purpose of this study is to propose a methodology that supports designers in producing apartment unit plans with high consumer preference, by deriving through machine learning the correlation and importance of the floor plan elements preferred by consumers and reflecting this information in the early design process. Data on the pre-sale competition rate and the elements of the floor plan are collected, and the correlation between the pre-sale competition rate and the independent variables is analyzed through machine learning. This analytical model can be used to review the apartment unit plan produced by the designer and to assist the designer. Using the trained model in the floor plan design of apartment housing thus makes it possible to produce floor plans with high preference, because the model can give feedback on candidate unit plans.
Keywords: apartment unit plan, data-driven design, design methodology, machine learning
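A minimal sketch of the analysis the abstract outlines: regress the pre-sale competition rate on floor-plan features and rank feature importance. The feature names (bay_count, balcony_area, ...) and dataset are illustrative assumptions, not the study's data.

```python
import pandas as pd
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import train_test_split

df = pd.read_csv("unit_plans.csv")               # hypothetical dataset
features = ["bay_count", "balcony_area", "living_room_width", "room_count"]
X_tr, X_te, y_tr, y_te = train_test_split(
    df[features], df["competition_rate"], random_state=0
)

model = RandomForestRegressor(n_estimators=300, random_state=0).fit(X_tr, y_tr)
print("R^2 on held-out plans:", model.score(X_te, y_te))

# The importance ranking is the feedback to designers: which plan elements
# drive consumer preference most strongly.
for name, imp in sorted(zip(features, model.feature_importances_), key=lambda t: -t[1]):
    print(f"{name}: {imp:.3f}")
```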
Procedia PDF Downloads 268
24184 Nonparametric Truncated Spline Regression Model on the Data of Human Development Index in Indonesia
Authors: Kornelius Ronald Demu, Dewi Retno Sari Saputro, Purnami Widyaningsih
Abstract:
The Human Development Index (HDI) is a standard measurement of a country's human development. Several factors may influence it, such as life expectancy, gross domestic product (GDP) based on the province's annual expenditure, the number of poor people, and the percentage of illiterate people. The scatter plots between HDI and the influencing factors show that the data do not follow a specific pattern or form. Therefore, the HDI data for Indonesia can be analyzed with a nonparametric regression model. The estimation of the regression curve in a nonparametric regression model is flexible because it follows the shape of the data pattern. One nonparametric regression method is the truncated spline. Truncated spline regression is a nonparametric approach based on a modification of segmented polynomial functions. The estimator of a truncated spline regression model is affected by the selection of the optimal knot points, which are the focal points of truncated spline functions. The optimal knot points were determined by the minimum value of generalized cross validation (GCV). In this article, the Human Development Index data are fitted with a truncated spline nonparametric regression model. The best truncated spline regression model for the HDI data in Indonesia was obtained with the combination of optimal knot points 5-5-5-4. Life expectancy and the percentage of illiterate people were the factors significantly related to HDI in Indonesia. The coefficient of determination is 94.54%, which means the regression model fits the HDI data in Indonesia well.
Keywords: generalized cross validation (GCV), Human Development Index (HDI), knots point, nonparametric regression, truncated spline
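A minimal sketch of truncated spline regression with GCV-based knot selection, the method the abstract describes. One predictor and synthetic data are shown for brevity; the paper uses four predictors.

```python
import numpy as np

def design(x: np.ndarray, knots: list[float]) -> np.ndarray:
    """Linear truncated power basis: [1, x, (x - k)+ for each knot]."""
    cols = [np.ones_like(x), x] + [np.maximum(x - k, 0.0) for k in knots]
    return np.column_stack(cols)

def gcv(x: np.ndarray, y: np.ndarray, knots: list[float]) -> float:
    """GCV = n * RSS / (n - tr(H))^2 for the least-squares spline fit."""
    X = design(x, knots)
    H = X @ np.linalg.pinv(X.T @ X) @ X.T          # hat matrix
    resid = y - H @ y
    n = len(y)
    return n * float(resid @ resid) / (n - np.trace(H)) ** 2

rng = np.random.default_rng(1)
x = np.sort(rng.uniform(0, 10, 120))
y = np.sin(x) + rng.normal(0, 0.2, x.size)          # toy data standing in for HDI
candidates = [[3.0], [5.0], [3.0, 7.0], [2.0, 5.0, 8.0]]
best = min(candidates, key=lambda k: gcv(x, y, k))  # minimum GCV picks the knots
print("optimal knots:", best)
```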
Procedia PDF Downloads 339
24183 The Predictive Utility of Subjective Cognitive Decline Using Item Level Data from the Everyday Cognition (ECog) Scales
Authors: J. Fox, J. Randhawa, M. Chan, L. Campbell, A. Weakely, D. J. Harvey, S. Tomaszewski Farias
Abstract:
Early identification of individuals at risk for conversion to dementia provides an opportunity for preventative treatment. Many older adults (30-60%) report specific subjective cognitive decline (SCD); however, previous research is inconsistent in terms of what types of complaints predict future cognitive decline. The purpose of this study is to identify which specific complaints from the Everyday Cognition (ECog) scales, a measure of self-reported concerns about everyday abilities across six cognitive domains, are associated with: 1) conversion from a clinical diagnosis of normal to either MCI or dementia (a categorical variable) and 2) progressive cognitive decline in memory and executive function (continuous variables). 415 cognitively normal older adults were monitored annually for an average of 5 years. Cox proportional hazards models were used to assess associations between self-reported ECog items and progression to impairment (MCI or dementia). A total of 114 individuals progressed to impairment; the mean time to progression was 4.9 years (SD=3.4 years, range=0.8-13.8). Follow-up models were run controlling for depression. A subset of individuals (n=352) underwent repeat cognitive assessments for an average of 5.3 years. For those individuals, mixed effects models with random intercepts and slopes were used to assess associations between ECog items and change in neuropsychological measures of episodic memory or executive function. Prior to controlling for depression, subjective concerns on five of the eight Everyday Memory items, three of the nine Everyday Language items, one of the seven Everyday Visuospatial items, two of the five Everyday Planning items, and one of the six Everyday Organization items were associated with subsequent diagnostic conversion (HR=1.25 to 1.59, p=0.003 to 0.03). However, after controlling for depression, only two specific complaints, remembering appointments, meetings, and engagements and understanding spoken directions and instructions, were associated with subsequent diagnostic conversion. Episodic memory in individuals reporting no concern on ECog items did not significantly change over time (p>0.4). More complaints on seven of the eight Everyday Memory items, three of the nine Everyday Language items, and three of the seven Everyday Visuospatial items were associated with a decline in episodic memory (interaction estimate=-0.055 to 0.001, p=0.003 to 0.04). Executive function in those reporting no concern on ECog items declined slightly (p<0.001 to 0.06). More complaints on three of the eight Everyday Memory items and three of the nine Everyday Language items were associated with a decline in executive function (interaction estimate=-0.021 to -0.012, p=0.002 to 0.04). These findings suggest that specific complaints across several cognitive domains are associated with diagnostic conversion. Specific complaints in the domains of Everyday Memory and Language are associated with a decline in both episodic memory and executive function. Increased monitoring and treatment of individuals with these specific SCDs may be warranted.
Keywords: Alzheimer's disease, dementia, memory complaints, mild cognitive impairment, risk factors, subjective cognitive decline
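A hedged sketch of the survival analysis described: a Cox proportional hazards model for time to MCI/dementia conversion, one ECog item at a time, adjusting for depression. Column names are illustrative assumptions, not the study's dataset.

```python
import pandas as pd
from lifelines import CoxPHFitter

df = pd.read_csv("ecog_followup.csv")      # hypothetical per-participant data
item_cols = [c for c in df.columns if c.startswith("ecog_")]

for item in item_cols:
    cph = CoxPHFitter()
    cph.fit(df[["years_followed", "converted", item, "depression_score"]],
            duration_col="years_followed", event_col="converted")
    hr = cph.hazard_ratios_[item]
    p = cph.summary.loc[item, "p"]
    if p < 0.05:
        print(f"{item}: HR={hr:.2f}, p={p:.3f}")   # items predicting conversion
```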
Procedia PDF Downloads 80
24182 Impact of Protean Career Attitude on Career Success with the Mediating Effect of Career Insight
Authors: Prabhashini Wijewantha
Abstract:
This study looks at the impact of employees' protean career attitude on their career success, and then at the mediating effect of career insight on that relationship. Career success is defined as the accomplishment of desirable work-related outcomes at any point in a person's work experiences over time, and it comprises two sub-variables: career satisfaction and perceived employability. Protean career attitude was measured using the eight items of the Self-Directedness subscale of the Protean Career Attitude scale developed by Briscoe and Hall, whereas career satisfaction was measured by the three-item scale developed by Martins, Eddleston, and Veiga. Perceived employability was also evaluated using three items, and career insight was measured using fourteen items adapted and used by De Vos and Soens. Data were collected from a sample of 300 mid-career executives in Sri Lanka using the survey strategy and were analyzed using SPSS and AMOS version 20.0. A preliminary analysis was performed in which the data were screened and reliability and validity were ensured. Next, a simple regression analysis was performed to test the direct impact of protean career attitude on career success, and the hypothesis was supported. The Baron and Kenny four-step, three-regression approach for mediator testing was used to assess the mediating effect of career insight on the above relationship, and partial mediation was supported by the data. Finally, theoretical and practical implications are discussed.
Keywords: career success, career insight, mid career MBAs, protean career attitude
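A minimal sketch of the Baron and Kenny mediation test the abstract names: (1) X → Y for the total effect, (2) X → M, (3) Y ~ X + M. Variable names are illustrative assumptions.

```python
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("career_survey.csv")   # hypothetical: protean, insight, success

step1 = smf.ols("success ~ protean", data=df).fit()            # total effect c
step2 = smf.ols("insight ~ protean", data=df).fit()            # path a
step3 = smf.ols("success ~ protean + insight", data=df).fit()  # paths c' and b

c, c_prime = step1.params["protean"], step3.params["protean"]
print(f"total effect c={c:.3f}, direct effect c'={c_prime:.3f}")
# Partial mediation: paths a and b significant, |c'| < |c|, c' still significant.
```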
Procedia PDF Downloads 360
24181 Collaborative Rural Governance Strategy to Enhance Rural Economy Through Village-Owned Enterprise Using Soft System Methodology and Textual Network Analysis
Authors: Robert Saputra, Tomas Havlicek
Abstract:
This study discusses the design of collaborative rural governance strategies to enhance the rural economy through Village-owned Enterprises (VOE) in Riau Province, Indonesia. Using Soft Systems Methodology (SSM) combined with Textual Network Analysis (TNA) in the Rich Picture stage of SSM, we investigated the current state of VOE management. Significant obstacles identified include insufficient business feasibility analyses, lack of managerial skills, misalignment between strategy and practice, and inadequate oversight. To address these challenges, we propose a collaborative strategy involving regional governments, academic institutions, NGOs, and the private sector. This strategy emphasizes community needs assessments, efficient resource mobilization, and targeted training programs. A dedicated working group will ensure continuous monitoring and iterative improvements. Our research highlights the novel integration of SSM with TNA, providing a robust framework for improving VOE management and demonstrating the potential of collaborative efforts in driving rural economic development.
Keywords: village-owned enterprises (VOE), rural economic development, soft system methodology (SSM), textual network analysis (TNA), collaborative governance
Procedia PDF Downloads 15
24180 Studying the Influence of Systematic Pre-Occupancy Data Collection through Post-Occupancy Evaluation: A Shift in the Architectural Design Process
Authors: Noor Abdelhamid, Donovan Nelson, Cara Prosser
Abstract:
The architectural design process can be mapped out as a dialogue between designer and user that is constructed across multiple phases, with the overarching goal of aligning design outcomes with user needs. Traditionally, this dialogue is bounded within a preliminary phase of determining factors that will direct the design intent, and a completion phase of handing off the project to the client. Pre- and post-occupancy evaluations (P/POEs) could provide an alternative process by extending this dialogue at both ends of the design process. The purpose of this research is to study the influence of systematic pre-occupancy data collection on achieving design goals by conducting post-occupancy evaluations of two case studies. In the context of this study, systematic pre-occupancy data collection is defined as the preliminary documentation of the existing conditions that helps portray stakeholders’ needs. When implemented, pre-occupancy data collection occurs during the early phases of the architectural design process, and the information is used to shape the design intent. Investigative POEs are performed on two case studies with distinct early design approaches to understand how the current space is impacting user needs, establish design outcomes, and inform future strategies. The first case study underwent systematic pre-occupancy data collection and synthesis, while the other represents the traditional, uncoordinated practice of informally collecting data during an early design phase. POEs target the dynamics between the building and its occupants by studying how spaces are serving the needs of the users. Data collection for this study consists of user surveys, audiovisual materials, and observations during regular site visits. Mixed qualitative and quantitative analyses are synthesized to identify patterns in the data. The paper concludes by positioning value on both sides of the architectural design process: the integration of systematic pre-occupancy methods in the early phases and the reinforcement of a continued dialogue between building and design team after building completion.
Keywords: architecture, design process, pre-occupancy data, post-occupancy evaluation
Procedia PDF Downloads 164
24179 An Analysis of Oil Price Changes and Other Factors Affecting Iranian Food Basket: A Panel Data Method
Authors: Niloofar Ashktorab, Negar Ashktorab
Abstract:
Oil exports fund nearly half of Iran’s government expenditures, and for many years other countries have imposed various sanctions against Iran. Sanctions that primarily target Iran’s key energy sector have harmed Iran’s economy, although the strategic effects of sanctions might diminish as Iran adjusts to them economically. In this study, we evaluate the impact of the oil price and of sanctions against Iran on food commodity prices by using a panel data method. We find that the food commodity prices, the oil price, and the real exchange rate are stationary. The results show a positive effect of oil price changes, the real exchange rate, and sanctions on food commodity prices.
Keywords: oil price, food basket, sanctions, panel data, Iran
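A hedged sketch of a fixed-effects panel regression in the spirit of the abstract: food commodity price on oil price, real exchange rate, and a sanctions dummy, using the linearmodels package. The file and column names are assumptions for illustration.

```python
import pandas as pd
from linearmodels.panel import PanelOLS

df = pd.read_csv("food_panel.csv")                 # hypothetical panel data
df = df.set_index(["commodity", "month"])          # entity x time MultiIndex

model = PanelOLS.from_formula(
    "log_food_price ~ log_oil_price + log_real_fx + sanctions + EntityEffects",
    data=df,
)
res = model.fit(cov_type="clustered", cluster_entity=True)
print(res.summary)   # positive coefficients would match the reported results
```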
Procedia PDF Downloads 356
24178 A Proposed Framework for Software Redocumentation Using Distributed Data Processing Techniques and Ontology
Authors: Laila Khaled Almawaldi, Hiew Khai Hang, Sugumaran A. l. Nallusamy
Abstract:
Legacy systems are crucial for organizations, but their intricacy and lack of documentation pose challenges for maintenance and enhancement. Redocumentation of legacy systems is vital for automatically or semi-automatically creating documentation for software lacking sufficient records. It aims to enhance system understandability, maintainability, and knowledge transfer. However, existing redocumentation methods need improvement in data processing performance and document generation efficiency. This stems from the necessity to efficiently handle the extensive and complex code of legacy systems. This paper proposes a method for semi-automatic legacy system redocumentation using semantic parallel processing and ontology. Leveraging parallel processing and ontology addresses current challenges by distributing the workload and creating documentation with logically interconnected data. The paper outlines challenges in legacy system redocumentation and suggests a method of redocumentation using parallel processing and ontology for improved efficiency and effectiveness.
Keywords: legacy systems, redocumentation, big data analysis, parallel processing
Procedia PDF Downloads 46
24177 Various Models of Quality Management Systems
Authors: Mehrnoosh Askarizadeh
Abstract:
People, process, and IT are the most important assets of any organization, and optimal utilization of these resources has been a question of research in business for many decades. The business world has responded by inventing various methodologies that can be used for addressing problems of quality improvement, process efficiency, continuous improvement, waste reduction, automation, strategy alignment, etc. Some of these methodologies can be collectively called Business Process Quality Management (BPQM) methodologies. In essence, the first references to process management can be traced back to Frederick Taylor and scientific management, where time and motion study was applied to improving manufacturing process efficiency. The ideas of scientific management were in use for quite a long period, until more advanced quality management techniques were developed in Japan and the USA. One of the first prominent methods was Total Quality Management (TQM), which evolved during the 1980s. At about the same time, Six Sigma (SS) originated at Motorola as a separate method. SS spread and evolved, and later joined with the ideas of Lean manufacturing to form Lean Six Sigma. In the 1990s, due to emerging IT technologies, the beginning of globalization, and strengthening competition, companies recognized the need for better process and quality management. Business Process Management (BPM) emerged as a novel methodology that takes all of this into account and helps to align IT technologies with business processes and quality management. In this article, we study various aspects of the above-mentioned methods and identify their relations.
Keywords: e-process, quality, TQM, BPM, lean, six sigma, CPI, information technology, management
Procedia PDF Downloads 440
24176 Armenian Refugees in Early 20th Century Japan: A Quantitative Analysis of Their Number Based on Japanese Historical Data Compared with Foreign Historical Data
Authors: Meline Mesropyan
Abstract:
At the beginning of the 20th century, Japan served as a transit point for Armenian refugees fleeing the 1915 Genocide. However, research on Armenian refugees in Japan is sparse, and the Armenian Diaspora has never taken root in Japan. Consequently, Japan has not been considered a relevant research site for studying Armenian refugees. The primary objective of this study is to shed light on the number of Armenian refugees who passed through Japan between 1915 and 1930. Quantitative analyses will be conducted based on newly uncovered Japanese archival documents. Subsequently, the Japanese data will be compared to American immigration data to estimate the potential number of refugees in Japan during that period. This under-researched area is relevant to both the Armenian Diaspora and refugee studies in Japan. By clarifying the number of refugees, this study aims to enhance understanding of Japan's treatment of refugees and the extent of humanitarian efforts conducted by organizations and individuals in Japan, contributing to the broader field of historical refugee studies.
Keywords: Armenian genocide, Armenian refugees, Japanese statistics, number of refugees
Procedia PDF Downloads 57
24175 Building Green Infrastructure Networks Based on Cadastral Parcels Using Network Analysis
Authors: Gon Park
Abstract:
Seoul in South Korea established the 2030 Seoul City Master Plan, which contains green-link projects to connect critical green areas within the city. However, the plan does not include detailed analyses that would incorporate land-cover information into the many structural classes of green infrastructure. This study maps the green infrastructure networks of Seoul to complement these green plans by identifying and ranking green areas. Hubs and links, the main elements of green infrastructure, were identified by incorporating cadastral data of 967,502 parcels into 135 land use maps using a geographic information system. Network analyses were used to rank the hubs and links of the green infrastructure map, applying a force-directed algorithm, weighted values, and binary relationships with metrics of density, distance, and centrality. The results indicate that network analyses using cadastral parcel data can serve as a framework to identify and rank hubs, links, and networks for green infrastructure planning under variable scenarios of green areas in cities.
Keywords: cadastral data, green infrastructure, network analysis, parcel data
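A hedged sketch of the network-analysis step the abstract describes: rank green-infrastructure hubs by centrality in a weighted graph. The node names and link distances below are toy stand-ins for parcel-derived hubs and links.

```python
import networkx as nx

G = nx.Graph()
# (hub_a, hub_b, distance_km): illustrative links between green areas
links = [("park_A", "park_B", 1.2), ("park_B", "river_C", 0.8),
         ("river_C", "forest_D", 2.5), ("park_A", "forest_D", 3.1),
         ("park_B", "forest_D", 1.9)]
for a, b, d in links:
    G.add_edge(a, b, dist=d, weight=1.0 / d)   # closer hubs bind more strongly

degree = nx.degree_centrality(G)
between = nx.betweenness_centrality(G, weight="dist")  # distance as path cost
for hub in sorted(G.nodes, key=lambda n: -(degree[n] + between[n])):
    print(f"{hub}: degree={degree[hub]:.2f}, betweenness={between[hub]:.2f}")
```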
Procedia PDF Downloads 206
24174 Classification of Land Cover Usage from Satellite Images Using Deep Learning Algorithms
Authors: Shaik Ayesha Fathima, Shaik Noor Jahan, Duvvada Rajeswara Rao
Abstract:
Earth's environment and its evolution can be seen through satellite images in near real time. Through satellite imagery, remote sensing data provide crucial information that can be used for a variety of applications, including image fusion, change detection, land cover classification, agriculture, mining, disaster mitigation, and monitoring climate change. The objective of this project is to propose a method for classifying satellite images according to multiple predefined land cover classes. The proposed approach involves collecting data in image format; the data are then pre-processed, fed into the proposed algorithm, and the obtained result is analyzed. Some of the algorithms used in satellite imagery classification are U-Net, Random Forest, DeepLabv3, CNN, ANN, ResNet, etc. In this project, we use the DeepLabv3 (atrous convolution) algorithm for land cover classification. The dataset used is the DeepGlobe land cover classification dataset. DeepLabv3 is a semantic segmentation system that uses atrous convolution to capture multi-scale context by adopting multiple atrous rates in cascade or in parallel to determine the scale of segments.
Keywords: area calculation, atrous convolution, DeepGlobe land cover classification, DeepLabv3, land cover classification, ResNet-50
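A minimal sketch of running a DeepLabv3 segmentation model (torchvision's ResNet-50 backbone) over a satellite tile. Fine-tuning the head to the DeepGlobe classes is assumed, not shown; the tile here is random data.

```python
import torch
from torchvision.models.segmentation import deeplabv3_resnet50

# DeepGlobe land cover: urban, agriculture, rangeland, forest, water, barren, unknown
NUM_CLASSES = 7
model = deeplabv3_resnet50(weights=None, num_classes=NUM_CLASSES).eval()

tile = torch.rand(1, 3, 512, 512)             # one RGB tile, normalized to [0, 1]
with torch.no_grad():
    logits = model(tile)["out"]               # (1, NUM_CLASSES, 512, 512)
pred = logits.argmax(dim=1)                   # per-pixel class map
print(pred.shape, pred.unique())
```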
Procedia PDF Downloads 140
24173 The Effect of CPU Location in Total Immersion of Microelectronics
Authors: A. Almaneea, N. Kapur, J. L. Summers, H. M. Thompson
Abstract:
Meeting the growth in demand for digital services such as social media, telecommunications, and business and cloud services requires large-scale data centres, which has led to an increase in their end-use energy demand. Generally, over 30% of data centre power is consumed by the necessary cooling overhead; thus, energy use can be reduced by improving cooling efficiency. Air and liquid can both be used as cooling media for the data centre. Traditional data centre cooling systems use air, but liquid is recognised as a promising method that can handle the more densely packed data centres. Liquid cooling can be classified into three methods: rack heat exchanger, on-chip heat exchanger, and full immersion of the microelectronics. This study quantifies the improvements in heat transfer for the case of immersed microelectronics by varying the CPU and heat sink location. Immersion of the server is achieved by filling the gap between the microelectronics and a water jacket with a dielectric liquid, which convects the heat from the CPU to the water jacket on the opposite side. Heat transfer is governed by two physical mechanisms: natural convection in the fixed enclosure filled with dielectric liquid, and forced convection for the water that is pumped through the water jacket. The model in this study is validated against published numerical and experimental work and shows good agreement. The results show that the heat transfer performance and Nusselt number (Nu) are improved by 89% by placing the CPU and heat sink at the bottom of the microelectronics enclosure.
Keywords: CPU location, data centre cooling, heat sink in enclosures, immersed microelectronics, turbulent natural convection in enclosures
Procedia PDF Downloads 272
24172 A Macroeconomic Analysis of Defense Industry: Comparisons, Trends and Improvements in Brazil and in the World
Authors: J. Fajardo, J. Guerra, E. Gonzales
Abstract:
This paper outlines a study of Brazil's industrial base of defense (IDB) through a bibliographic research method combined with an analysis of macroeconomic data from several publicly available data platforms. It begins with a brief study of Brazilian national industry, including analyses of productivity, income, output, and jobs. Next, the research presents a study of the defense industry in Brazil, presenting the main national companies operating in the aeronautical, army, and naval branches. After establishing the main points of the Brazilian defense industry, data on the productivity of the defense industries of the leading countries and of companies competing with Brazilian industry were analyzed, in order to place the main Brazilian cases in a comparative perspective. Concerning the methodology, bibliographic research and the exploration of historical data series were used to analyze information, identify trends, and make comparisons over time. The research concludes with the main trends for the development of the Brazilian defense industry, comparing the current situation with the points of view of several countries.
Keywords: economics of defence, industry, trends, market
Procedia PDF Downloads 156
24171 Delineating Subsurface Linear Features and Faults Under Sedimentary Cover in the Bahira Basin Using Integrated Gravity and Magnetic Data
Authors: M. Lghoul, N. El Goumi, M. Guernouche
Abstract:
In order to predict the structural and tectonic framework of the Bahira basin and to build a 3D geological model of the basin, an integrated multidisciplinary study has been conducted using gravity, magnetic, and geological data. The objective of the current study is to delineate the subsurface features, faults, and geological limits through analysis of airborne magnetic and gravity data of the Bahira basin. To achieve this goal, we applied different enhancement techniques to the magnetic and gravity data: power spectral analysis, reduction to pole (RTP), upward continuation, analytical signal, tilt derivative, total horizontal derivative, 3D Euler deconvolution, and source parameter imaging. The major lineament/fault trends are NE–SW, NW–SE, ENE–WSW, and WNW–ESE. The 3D Euler deconvolution analysis highlighted a number of fault trends, mainly in the ENE–WSW and WNW–ESE directions. The depth to the top of the basement sources in the study area ranges from 200 m, in the southern and northern parts of the Bahira basin, to 5000 m in the eastern part of the basin.
Keywords: magnetic, gravity, structural trend, depth to basement
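A hedged sketch of one enhancement named in the abstract, the tilt derivative of a gridded anomaly: TDR = arctan(dT/dz / sqrt((dT/dx)² + (dT/dy)²)), with the vertical derivative taken in the wavenumber domain, a standard practice. The grid here is synthetic, not the Bahira data.

```python
import numpy as np

def tilt_derivative(grid: np.ndarray, dx: float, dy: float) -> np.ndarray:
    gy, gx = np.gradient(grid, dy, dx)                  # horizontal derivatives
    kx = np.fft.fftfreq(grid.shape[1], dx) * 2 * np.pi
    ky = np.fft.fftfreq(grid.shape[0], dy) * 2 * np.pi
    k = np.sqrt(kx[None, :] ** 2 + ky[:, None] ** 2)    # radial wavenumber |k|
    gz = np.real(np.fft.ifft2(np.fft.fft2(grid) * k))   # vertical derivative
    return np.arctan2(gz, np.hypot(gx, gy))             # radians in [-pi/2, pi/2]

# Toy anomaly; zero contours of the TDR roughly trace source edges and faults.
x, y = np.meshgrid(np.linspace(-5, 5, 200), np.linspace(-5, 5, 200))
anomaly = np.exp(-((x - 1) ** 2 + y ** 2))
tdr = tilt_derivative(anomaly, dx=0.05, dy=0.05)
print(tdr.min(), tdr.max())
```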
Procedia PDF Downloads 132
24170 Copyright Clearance for Artificial Intelligence Training Data: Challenges and Solutions
Authors: Erva Akin
Abstract:
The use of copyrighted material for machine learning purposes is a challenging issue in the field of artificial intelligence (AI). While machine learning algorithms require large amounts of data to train on and to improve their accuracy and creativity, the use of copyrighted material without permission from the authors may infringe on their intellectual property rights. To overcome the legal hurdle that copyright poses to data sharing, access, and re-use, the use of copyrighted material for machine learning purposes may be considered permissible under certain circumstances. For example, if the copyright holder has given permission to use the data through a licensing agreement, then the use for machine learning purposes may be lawful. It is also argued that copying for non-expressive purposes that do not involve conveying expressive elements to the public, such as automated data extraction, should not be seen as infringing. The focus of such ‘copy-reliant technologies’ is on understanding language rules, styles, and syntax; no creative ideas are being used. However, the non-expressive use defense sits within the framework of the fair use doctrine, which allows the use of copyrighted material for research or educational purposes. Questions arise because the fair use doctrine is not available in EU law; instead, the InfoSoc Directive provides a rigid system of exclusive rights with a list of exceptions and limitations. One could only argue that non-expressive uses of copyrighted material for machine learning purposes do not constitute a ‘reproduction’ in the first place. Nevertheless, the use of machine learning with copyrighted material is difficult because EU copyright law applies to the mere use of the works. Two solutions can be proposed to address the problem of copyright clearance for AI training data. The first is to introduce a broad exception for text and data mining, either mandatorily or for commercial and scientific purposes, or to permit the reproduction of works for non-expressive purposes. The second is that copyright laws should permit the reproduction of works for non-expressive purposes, which opens the door to discussions regarding the transposition of the fair use principle from the US into EU law. Both solutions aim to provide more space for AI developers to operate and encourage greater freedom, which could lead to more rapid innovation in the field. The Data Governance Act presents a significant opportunity to advance these debates. Finally, issues concerning the balance of general public interests and legitimate private interests in machine learning training data must be addressed. In my opinion, it is crucial that robot-created output should fall into the public domain. Machines depend on human creativity, innovation, and expression. To encourage technological advancement and innovation, freedom of expression and business operation must be prioritised.
Keywords: artificial intelligence, copyright, data governance, machine learning
Procedia PDF Downloads 83
24169 Effects of Cell Phone Electromagnetic Radiation on the Brain System
Authors: A. Alao Olumuyiwa
Abstract:
Health hazards reported to be associated with exposure to electromagnetic radiation, including brain tumors, genotoxic effects, neurological effects, immune system deregulation, allergic responses, and some cardiovascular effects, are discussed under a closed tabular model in this study. This review shows that there is strong and robust evidence, through strength, consistency, biological plausibility, and many dose-response relationships, that chronic exposure to electromagnetic frequencies across the spectrum may result in brain cancer and other carcinogenic disease symptoms. There is therefore no safe threshold, because of the genotoxic nature of the mechanism that may be involved. The study explains that the cell phone induces effects on blood-brain barrier permeability, and that cerebellum exposure to continuous, long hours of RF radiation may result in a significant increase in albumin extravasation. A physical biomodeling approach is employed to review these health effects, using the Specific Absorption Rate (SAR) of different GSM machines to critically examine symptoms such as decreased locomotor activity, increased grooming, and reduced memory functions in a variety of animal species in classified group and subgroup models.
Keywords: brain cancer, electromagnetic radiations, physical biomodeling, specific absorption rate (SAR)
Procedia PDF Downloads 347
24168 Biosorption of Phenol onto Water Hyacinth Activated Carbon: Kinetics and Isotherm Study
Authors: Manoj Kumar Mahapatra, Arvind Kumar
Abstract:
Batch adsorption experiments were carried out for the removal of phenol from its aqueous solution using water hyacinth activated carbon (WHAC) as an adsorbent. The sorption kinetics were analysed using the pseudo-first-order and pseudo-second-order models, and it was observed that the sorption data fit very well to the pseudo-second-order model over the entire sorption time. The equilibrium data were analyzed with the Langmuir and Freundlich isotherm models and fitted well to the Freundlich model, with a maximum biosorption capacity of 31.45 mg/g estimated using the Langmuir model. The adsorption intensity of 3.7975 represents a favorable adsorption condition.
Keywords: adsorption, isotherm, kinetics, phenol
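A minimal sketch of the isotherm fitting described: fit Langmuir (qe = qm·KL·Ce / (1 + KL·Ce)) and Freundlich (qe = KF·Ce^(1/n)) to equilibrium data by nonlinear least squares. The data points below are toy values, not the paper's measurements.

```python
import numpy as np
from scipy.optimize import curve_fit

Ce = np.array([5.0, 10.0, 20.0, 40.0, 80.0])     # equilibrium conc. (mg/L), toy
qe = np.array([8.0, 13.0, 19.0, 25.0, 30.0])     # uptake (mg/g), toy

langmuir = lambda C, qm, KL: qm * KL * C / (1 + KL * C)
freundlich = lambda C, KF, n: KF * C ** (1 / n)

(qm, KL), _ = curve_fit(langmuir, Ce, qe, p0=[30, 0.1])
(KF, n), _ = curve_fit(freundlich, Ce, qe, p0=[3, 2])
print(f"Langmuir qm={qm:.2f} mg/g (cf. 31.45 reported), KL={KL:.3f}")
print(f"Freundlich KF={KF:.2f}, n={n:.2f} (n > 1 implies favorable adsorption)")
```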
Procedia PDF Downloads 446
24167 An Economic Order Quantity Model for Deteriorating Items with Ramp Type Demand, Time Dependent Holding Cost and Price Discount Offered on Backorders
Authors: Arjun Paul, Adrijit Goswami
Abstract:
In our present work, an economic order quantity inventory model with shortages is developed in which the holding cost is a linearly increasing function of time and the demand rate is a ramp-type function of time. The items considered in the model are deteriorating in nature, so a small fraction of the items is depleted with the passage of time. In order to model a more realistic situation, the deterioration rate is assumed to follow a continuous uniform distribution whose parameters are triangular fuzzy numbers. The inventory manager offers customers a discount if they are willing to backorder their demand when there is a stock-out. The optimal ordering policy and the optimal discount offered for each backorder are determined by minimizing the total cost over a replenishment interval. For better illustration of the proposed model in both the crisp and fuzzy senses, and to provide richer insights, a numerical example is cited to exemplify the policy and to analyze the sensitivity of the model parameters.
Keywords: fuzzy deterioration rate, price discount on backorder, ramp type demand, shortage, time varying holding cost
Procedia PDF Downloads 197
24166 Joint Physical Custody after Divorce and Child Well-Being
Authors: Katarzyna Kamińska
Abstract:
Joint physical custody means that both parents, after divorce or separation, have the right and responsibility to take care of the child on a daily basis. In a joint physical custody arrangement, the child spends substantial, but not necessarily equal, time with both parents. Joint physical custody can be a symmetric care arrangement or not; however, it is accepted in the jurisprudence that the best interests of the child are served when the child spends at least 35% of the time during a two-week period with each parent. Joint physical custody, also known as joint, dual, or shared residence, is a challenge in contemporary family law, and it has its supporters and opponents. On the one hand, joint physical custody is beneficial because it provides children with frequent and continuous contact with their mother and father after divorce or separation. On the other hand, it is not good for children to be shuttled back and forth between two residences; children need a home base. The conclusion is therefore that joint physical custody cannot be seen as a panacea for all post-divorce or post-separation parenting cases, and the court should not automatically make such a determination. The possibility of awarding this arrangement requires the court to carefully weigh the pros and cons of each individual case. It is difficult to say that joint physical custody is better than sole physical custody in every case; it depends on the circumstances and needs of each family. An individual approach is going to be much better than a one-size-fits-all idea.
Keywords: joint physical custody, shared residence, dual residence, the best interests of the child
Procedia PDF Downloads 95
24165 A West Coast Estuarine Case Study: A Predictive Approach to Monitor Estuarine Eutrophication
Authors: Vedant Janapaty
Abstract:
Estuaries are wetlands where fresh water from streams mixes with salt water from the sea. Also known as the “kidneys of our planet”, they are extremely productive environments that filter pollutants, absorb floods from sea level rise, and shelter a unique ecosystem. However, eutrophication and loss of native species are ailing our wetlands. There is a lack of uniform data collection and sparse research on correlations between satellite data and in situ measurements. Remote sensing (RS) has shown great promise in environmental monitoring. This project attempts to use satellite data and correlate its metrics with in situ observations collected at five estuaries. Satellite images were processed to calculate seven spectral indices (SIs) using Python, and average SI values were calculated per month for 23 years. Publicly available data from six sites at ELK were used to obtain ten parameters (OPs), and average OP values were likewise calculated per month for 23 years. Linear correlations between the seven SIs and ten OPs were computed and found to be inadequate (correlation = 1 to 64%). Fourier transform analysis was then performed on the seven SIs; dominant frequencies and amplitudes were extracted, and a machine learning (ML) model was trained, validated, and tested for the ten OPs. Better correlations were observed between SIs and OPs with certain time delays (0-, 3-, 4-, and 6-month delays), and ML was performed again. The OPs saw improved R² values in the range of 0.2 to 0.93. This approach can be used to obtain periodic analyses of overall wetland health from satellite indices. It shows that remote sensing can be used to develop correlations with critical parameters that measure eutrophication in situ, and it can be used by practitioners to easily monitor wetland health.
Keywords: estuary, remote sensing, machine learning, Fourier transform
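A hedged sketch of the Fourier-feature step the abstract describes: extract dominant frequencies and amplitudes from a monthly satellite-index series for use as ML features. The series here is synthetic, not the study's data.

```python
import numpy as np

months = np.arange(23 * 12)                           # 23 years of monthly values
ndvi = (0.4 + 0.2 * np.sin(2 * np.pi * months / 12)
        + np.random.default_rng(0).normal(0, 0.03, months.size))

spectrum = np.fft.rfft(ndvi - ndvi.mean())            # remove DC before FFT
freqs = np.fft.rfftfreq(months.size, d=1.0)           # cycles per month
top = np.argsort(np.abs(spectrum))[::-1][:3]          # 3 dominant components
features = np.concatenate([freqs[top], np.abs(spectrum[top])])
print("dominant periods (months):", np.round(1 / freqs[top], 1))

# Features like these, per index and per time lag, would feed the regressor
# predicting the in situ water-quality parameters (the OPs).
```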
Procedia PDF Downloads 104
24164 Agricultural Water Consumption Estimation in the Helmand Basin
Authors: Mahdi Akbari, Ali Torabi Haghighi
Abstract:
The Hamun Lakes, located in the Helmand Basin and consisting of four water bodies, were the greatest (>8500 km2) freshwater bodies on the Iranian plateau, but they have almost entirely desiccated over the last 20 years. The desiccation of the lakes caused dust storms in the region, with huge economic and health consequences for the inhabitants. The flow of the Hirmand (or Helmand) River, the most important feeding river, has decreased downstream from 4 to 1.9 km3 due to anthropogenic activities. In this basin, water is mainly consumed for farming. Due to the lack of in situ data in the basin, this research utilizes remote sensing data to show how croplands, and consequently the water consumed in the agricultural sector, have changed. Based on Landsat NDVI, we suggest using a threshold of around 0.35-0.4 to detect croplands in the basin. The croplands of this basin have doubled since 1990, especially downstream of the Kajaki Dam (the biggest dam in the basin). Using PML_V2 actual evapotranspiration (AET) data and considering irrigation efficiency (≈0.3), we estimate the consumed water (CW) for farming. We found that CW has increased from 2.5 to over 7.5 km3 between 2002 and 2017 in this basin. Also, the annual average potential evapotranspiration (PET) of the basin has had a negative trend in recent years, although the AET over croplands has an increasing trend. In this research, using remote sensing data, we compensated for the lack of in situ data in the studied area and highlighted the anthropogenic activities upstream that led to the desiccation of the lakes downstream.
Keywords: Afghanistan-Iran transboundary basin, Iran-Afghanistan water treaty, water use, lake desiccation
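A hedged sketch of the two steps named in the abstract: mask croplands with an NDVI threshold in the suggested 0.35-0.4 range, then convert AET over that mask into consumed water. Dividing AET by the 0.3 irrigation efficiency to approximate gross withdrawal is my reading of the abstract, and the arrays are toy values.

```python
import numpy as np

ndvi = np.random.default_rng(2).uniform(0.0, 0.8, (100, 100))  # Landsat NDVI tile
aet_mm = np.full((100, 100), 900.0)          # annual actual ET per pixel (mm)
pixel_area_km2 = 0.0009                      # 30 m x 30 m Landsat pixel

cropland = ndvi > 0.375                      # threshold in the suggested range
aet_km3 = (aet_mm[cropland] * 1e-6 * pixel_area_km2).sum()  # mm depth -> km^3
consumed_water_km3 = aet_km3 / 0.3           # gross withdrawal at 30% efficiency
print(f"cropland share: {cropland.mean():.1%}, CW ≈ {consumed_water_km3:.4f} km^3")
```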
Procedia PDF Downloads 131
24163 Micropillar-Assisted Electric Field Enhancement for High-Efficiency Inactivation of Bacteria
Authors: Sanam Pudasaini, A. T. K. Perera, Ahmed Syed Shaheer Uddin, Sum Huan Ng, Chun Yang
Abstract:
Development of high-efficiency and environmentally friendly bacterial inactivation methods is of great importance for preventing waterborne diseases, which are one of the leading causes of death in the world. Traditional bacterial inactivation methods (e.g., ultraviolet radiation and chlorination) have several limitations, such as longer treatment times, formation of toxic byproducts, and bacterial regrowth. Recently, an electroporation-based inactivation method was introduced as a substitute. Here, an electroporation-based continuous-flow microfluidic device equipped with an array of micropillars is developed, and the device achieves high bacterial inactivation performance (>99.9%) within a short exposure time (<1 s). More than a 99.9% reduction of Escherichia coli bacteria was obtained at a flow rate of 1 mL/hr, and no regrowth of bacteria was observed. Images from a scanning electron microscope confirmed the formation of electroporation-induced nanopores in the cell membrane. Through numerical simulation, it is shown that an electric field strength sufficiently large for bacterial electroporation (3 kV/cm) was generated by the PDMS micropillars at an applied voltage of 300 V. Furthermore, this method of inactivation involves no chemicals, and the formation of harmful byproducts is minimal.
Keywords: electroporation, high-efficiency, inactivation, microfluidics, micropillar
Procedia PDF Downloads 180
24162 The Impact of Artificial Intelligence on Higher Education in Latin America
Authors: Luis Rodrigo Valencia Perez, Francisco Flores Aguero, Gibran Aguilar Rangel
Abstract:
Artificial Intelligence (AI) is rapidly transforming diverse sectors, and higher education in Latin America is no exception. This article explores the impact of AI on higher education institutions in the region, highlighting the imperative need for well-trained teachers in emerging technologies and a cultural shift towards the adoption and efficient use of these tools. AI offers significant opportunities to improve learning personalization, optimize administrative processes, and promote more inclusive and accessible education. However, the effectiveness of its implementation depends largely on the preparation and willingness of teachers to integrate these technologies into their pedagogical practices. Furthermore, it is essential that Latin American countries develop and implement public policies that encourage the adoption of AI in the education sector, thus ensuring that institutions can compete globally. Policies should focus on the continuous training of educators, investment in technological infrastructure, and the creation of regulatory frameworks that promote innovation and the ethical use of AI. Only through a comprehensive and collaborative approach will it be possible to fully harness the potential of AI to transform higher education in Latin America, thereby boosting the region's development and competitiveness on the global stage.
Keywords: artificial intelligence (AI), higher education, teacher training, public policies, latin america, global competitiveness
Procedia PDF Downloads 28
24161 Data-Driven Strategies for Enhancing Food Security in Vulnerable Regions: A Multi-Dimensional Analysis of Crop Yield Predictions, Supply Chain Optimization, and Food Distribution Networks
Authors: Sulemana Ibrahim
Abstract:
Food security remains a paramount global challenge, with vulnerable regions grappling with hunger and malnutrition. This study embarks on a comprehensive exploration of data-driven strategies aimed at improving food security in such regions. Our research employs a multifaceted approach, integrating data analytics to predict crop yields, optimize supply chains, and enhance food distribution networks. The study unfolds as a multi-dimensional analysis, commencing with the development of robust machine learning models that harness remote sensing data, historical crop yield records, and meteorological data to forecast crop yields. These predictive models, underpinned by convolutional and recurrent neural networks, furnish critical insights into anticipated harvests, empowering proactive measures to confront food insecurity. Subsequently, the research scrutinizes supply chain optimization, capitalizing on linear programming and network optimization techniques. These strategies aim to mitigate loss and wastage while streamlining the distribution of agricultural produce from field to fork. In conjunction, the study investigates food distribution networks with a particular focus on network efficiency, accessibility, and equitable allocation of food resources. Network analysis tools, complemented by data-driven simulation methodologies, unveil opportunities for augmenting the efficacy of these critical lifelines. This study also considers the ethical implications and privacy concerns associated with the extensive use of data in the realm of food security, and the proposed methodology outlines guidelines for responsible data acquisition, storage, and usage. The ultimate aspiration of this research is to forge a nexus between data science and food security policy, providing actionable insights to mitigate food insecurity. The holistic approach, converging data-driven crop yield forecasts, optimized supply chains, and improved distribution networks, aspires to revitalize food security in the most vulnerable regions, elevating the quality of life for millions worldwide.
Keywords: data-driven strategies, crop yield prediction, supply chain optimization, food distribution networks
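A hedged sketch of the supply-chain optimization step the study describes: a minimum-cost transportation problem from farms to distribution hubs, solved with linear programming. Costs, supplies, and demands are toy values chosen for illustration.

```python
import numpy as np
from scipy.optimize import linprog

cost = np.array([[4.0, 6.0, 9.0],     # shipping cost per tonne,
                 [5.0, 3.0, 7.0]])    # 2 farms x 3 distribution hubs
supply = np.array([80.0, 70.0])       # tonnes available at each farm
demand = np.array([50.0, 60.0, 40.0]) # tonnes required at each hub

n_farms, n_hubs = cost.shape
# Supply constraints: shipments out of each farm <= its supply.
A_ub = np.zeros((n_farms, n_farms * n_hubs))
for i in range(n_farms):
    A_ub[i, i * n_hubs:(i + 1) * n_hubs] = 1
# Demand constraints: shipments into each hub == its demand.
A_eq = np.zeros((n_hubs, n_farms * n_hubs))
for j in range(n_hubs):
    A_eq[j, j::n_hubs] = 1

res = linprog(cost.ravel(), A_ub=A_ub, b_ub=supply, A_eq=A_eq, b_eq=demand)
print("optimal shipping plan (tonnes):\n", res.x.reshape(n_farms, n_hubs))
print("total cost:", res.fun)
```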
Procedia PDF Downloads 62