Search results for: wireless mesh network (WMN)
686 Metal-Organic Frameworks for Innovative Functional Textiles
Authors: Hossam E. Emam
Abstract:
Metal–organic frameworks (MOFs) are hybrid materials that have been investigated for about 15 years; they are synthesized from metal ions acting as inorganic centers joined with multidentate organic linkers to form 1D, 2D or 3D network structures. MOFs have unique properties such as a porous crystalline structure, large surface area, chemical tunability and luminescent character. These significant properties enable MOFs to be applied in many fields, such as gas storage, adsorption/separation, drug delivery/biomedicine, catalysis, polymerization, magnetism and luminescence applications. Recently, many published reports have examined the suitability of MOFs for the functionalization of textiles in order to exploit these unique properties. Incorporating MOFs has been found to give textiles additional valuable functions for use in fields such as water treatment and fuel purification. Modification of textiles with MOFs can easily be performed by two main techniques: ex-situ (preparing the MOFs and then applying them onto the textiles) and in-situ (growing the MOFs within the textile network). The uniqueness of MOFs can be exploited to impart decorative color, antimicrobial character, anti-mosquito character, ultraviolet radiation protection, self-cleaning, photoluminescence and sensing character. Additionally, treating textiles with MOFs makes them applicable as filters for the adsorption of toxic gases and hazardous materials (such as pesticides, dyes and aromatic molecules) and for fuel purification (such as the removal of oxygenated, nitrogenated and sulfur compounds). The porous structure of MOFs also makes them well suited to the controlled release of insecticides from the textile surface. Moreover, the recyclability of MOF@textile composites makes them applicable as photocatalysts for the photodegradation of different dyes in daylight. Therefore, MOFs are being extensively considered as an ingenious way of imparting textiles with valuable functional properties.
Keywords: MOF, functional textiles, water treatment, fuel purification, environmental applications
Procedia PDF Downloads 147
685 Source Identification Model Based on Label Propagation and Graph Ordinary Differential Equations
Authors: Fuyuan Ma, Yuhan Wang, Junhe Zhang, Ying Wang
Abstract:
Identifying the sources of information dissemination is a pivotal task in the study of collective behaviors in networks, enabling us to discern and intercept the critical pathways through which information propagates from its origins. This allows for the control of the information’s dissemination impact in its early stages. Numerous methods for source detection rely on pre-existing, underlying propagation models as prior knowledge. Current models that eschew prior knowledge attempt to harness label propagation algorithms to model the statistical characteristics of propagation states or employ Graph Neural Networks (GNNs) for deep reverse modeling of the diffusion process. These approaches are either deficient in modeling the propagation patterns of information or are constrained by the over-smoothing problem inherent in GNNs, which limits the stacking of sufficient model depth to excavate global propagation patterns. Consequently, we introduce the ODESI model. Initially, the model employs a label propagation algorithm to delineate the distribution density of infected states within a graph structure and extends the representation of infected states from integers to state vectors, which serve as the initial states of nodes. Subsequently, the model constructs a deep architecture based on GNNs-coupled Ordinary Differential Equations (ODEs) to model the global propagation patterns of continuous propagation processes. Addressing the challenges associated with solving ODEs on graphs, we approximate the analytical solutions to reduce computational costs. Finally, we conduct simulation experiments on two real-world social network datasets, and the results affirm the efficacy of our proposed ODESI model in source identification tasks.
Keywords: source identification, ordinary differential equations, label propagation, complex networks
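The two-stage idea described here, label propagation to build initial node state vectors followed by a graph-coupled ODE integrated over continuous time, can be illustrated with a minimal numpy sketch. The single graph-convolution dynamics, Euler integration, weight matrix and scoring rule below are simplifying assumptions for illustration, not the authors' ODESI architecture.

```python
import numpy as np

def normalize_adjacency(A):
    """Symmetrically normalize an adjacency matrix with self-loops."""
    A_hat = A + np.eye(A.shape[0])
    d = A_hat.sum(axis=1)
    D_inv_sqrt = np.diag(1.0 / np.sqrt(d))
    return D_inv_sqrt @ A_hat @ D_inv_sqrt

def label_propagation(A_norm, infected, n_iters=10, alpha=0.8):
    """Spread observed infection labels to estimate an infection-density score per node."""
    x = infected.astype(float)
    for _ in range(n_iters):
        x = alpha * (A_norm @ x) + (1 - alpha) * infected
    return x

def graph_ode_rollout(A_norm, X0, W, t_steps=20, dt=0.1):
    """Euler integration of dX/dt = tanh(A_norm @ X @ W) - X (a toy GNN-coupled ODE)."""
    X = X0.copy()
    for _ in range(t_steps):
        dX = np.tanh(A_norm @ X @ W) - X
        X = X + dt * dX
    return X

# Toy 5-node graph with two observed infected nodes.
A = np.array([[0, 1, 1, 0, 0],
              [1, 0, 1, 1, 0],
              [1, 1, 0, 1, 0],
              [0, 1, 1, 0, 1],
              [0, 0, 0, 1, 0]], dtype=float)
infected = np.array([1, 0, 1, 0, 0])
A_norm = normalize_adjacency(A)
density = label_propagation(A_norm, infected)
X0 = np.stack([density, 1 - density], axis=1)      # state vectors instead of integer labels
W = np.random.default_rng(0).normal(size=(2, 2)) * 0.5
X_final = graph_ode_rollout(A_norm, X0, W)
source_scores = X_final[:, 0]                       # higher score = more likely source (illustrative)
print(np.argsort(-source_scores))
```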
Procedia PDF Downloads 23
684 A Hybrid-Evolutionary Optimizer for Modeling the Process of Obtaining Bricks
Authors: Marius Gavrilescu, Sabina-Adriana Floria, Florin Leon, Silvia Curteanu, Costel Anton
Abstract:
Natural sciences provide a wide range of experimental data whose related problems require study and modeling beyond the capabilities of conventional methodologies. Such problems have solution spaces whose complexity and high dimensionality require correspondingly complex regression methods for proper characterization. In this context, we propose an optimization method which consists of a hybrid dual optimizer setup: a global optimizer based on a modified variant of the popular Imperialist Competitive Algorithm (ICA), and a local optimizer based on a gradient descent approach. The ICA is modified such that intermediate solution populations are more quickly and efficiently pruned of low-fitness individuals by appropriately altering the assimilation, revolution and competition phases, which, combined with an initialization strategy based on low-discrepancy sampling, allows for a more effective exploration of the corresponding solution space. Subsequently, gradient-based optimization is used locally to seek the optimal solution in the neighborhoods of the solutions found through the modified ICA. We use this combined approach to find the optimal configuration and weights of a fully-connected neural network, resulting in regression models used to characterize the process of obtaining bricks using silicon-based materials. Installations in the raw ceramics industry, i.e., bricks, are characterized by significant energy consumption and large quantities of emissions. Thus, the purpose of our approach is to determine by simulation the working conditions, including the manufacturing mix recipe with the addition of different materials, to minimize the emissions represented by CO and CH4. Our approach determines regression models which perform significantly better than those found using the traditional ICA for the aforementioned problem, resulting in better convergence and a substantially lower error.
Keywords: optimization, biologically inspired algorithm, regression models, bricks, emissions
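The global-plus-local structure described above can be sketched in a few lines of Python. The population update below is a generic pruning-and-resampling loop standing in for the modified ICA phases, the objective is a placeholder, and all parameter values are illustrative assumptions rather than the authors' settings; only the overall pattern (low-discrepancy initialization, global search, then local gradient descent) follows the abstract.

```python
import numpy as np
from scipy.stats import qmc

def objective(x):
    # Placeholder error surface; the real objective would be the regression
    # network's validation error on the brick-emissions dataset.
    return np.sum((x - 0.3) ** 2) + 0.1 * np.sum(np.sin(5 * x) ** 2)

def numerical_gradient(f, x, eps=1e-5):
    g = np.zeros_like(x)
    for i in range(len(x)):
        e = np.zeros_like(x); e[i] = eps
        g[i] = (f(x + e) - f(x - e)) / (2 * eps)
    return g

def hybrid_optimize(dim=6, pop_size=32, global_iters=50, local_iters=200, lr=0.05, seed=1):
    rng = np.random.default_rng(seed)
    # Low-discrepancy (Sobol) initialization for better coverage of the search space.
    pop = qmc.Sobol(d=dim, seed=seed).random(pop_size)
    for _ in range(global_iters):
        fitness = np.array([objective(p) for p in pop])
        elites = pop[np.argsort(fitness)[: pop_size // 4]]        # keep the best quarter
        # Aggressively prune low-fitness individuals and resample around the elites
        # (a stand-in for the assimilation/revolution/competition phases of ICA).
        children = elites[rng.integers(len(elites), size=pop_size - len(elites))]
        children = children + rng.normal(scale=0.05, size=children.shape)
        pop = np.clip(np.vstack([elites, children]), 0.0, 1.0)
    best = pop[np.argmin([objective(p) for p in pop])]
    # Local refinement with plain gradient descent on a numerical gradient.
    x = best.copy()
    for _ in range(local_iters):
        x = np.clip(x - lr * numerical_gradient(objective, x), 0.0, 1.0)
    return x, objective(x)

x_opt, f_opt = hybrid_optimize()
print(x_opt, f_opt)
```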
Procedia PDF Downloads 82
683 Hybrid CNN-SAR and Lee Filtering for Enhanced InSAR Phase Unwrapping and Coherence Optimization
Authors: Hadj Sahraoui Omar, Kebir Lahcen Wahib, Bennia Ahmed
Abstract:
Interferometric Synthetic Aperture Radar (InSAR) coherence is a crucial parameter for accurately monitoring ground deformation and environmental changes. However, coherence can be degraded by various factors such as temporal decorrelation, atmospheric disturbances, and geometric misalignments, limiting the reliability of InSAR measurements (Omar Hadj‐Sahraoui et al., 2019). To address this challenge, we propose an innovative hybrid approach that combines artificial intelligence (AI) with advanced filtering techniques to optimize interferometric coherence in InSAR data. Specifically, we introduce a Convolutional Neural Network (CNN) integrated with the Lee filter to enhance the performance of radar interferometry. This hybrid method leverages the strength of CNNs to automatically identify and mitigate the primary sources of decorrelation, while the Lee filter effectively reduces speckle noise, improving the overall quality of interferograms. We develop a deep learning-based model trained on multi-temporal and multi-frequency SAR datasets, enabling it to predict coherence patterns and enhance low-coherence regions. This hybrid CNN-SAR approach with Lee filtering significantly reduces noise and phase unwrapping errors, leading to more precise deformation maps. Experimental results demonstrate that our approach improves coherence by up to 30% compared to traditional filtering techniques, making it a robust solution for challenging scenarios such as urban environments, vegetated areas, and rapidly changing landscapes. Our method has potential applications in geohazard monitoring, urban planning, and environmental studies, offering a new avenue for enhancing InSAR data reliability through AI-powered optimization combined with robust filtering techniques.
Keywords: CNN-SAR, Lee filter, hybrid optimization, coherence, InSAR phase unwrapping, speckle noise reduction
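The Lee filter component referenced above has a standard local-statistics formulation that can be sketched directly; the window size, the global noise-variance estimate and the synthetic speckle model below are illustrative assumptions, and the CNN part of the hybrid method is not reproduced here.

```python
import numpy as np
from scipy.ndimage import uniform_filter

def lee_filter(image, window=5, noise_var=None):
    """Classic Lee speckle filter: blend each pixel with its local mean,
    weighted by how much the local variance exceeds the estimated noise variance."""
    img = image.astype(float)
    local_mean = uniform_filter(img, size=window)
    local_sq_mean = uniform_filter(img ** 2, size=window)
    local_var = local_sq_mean - local_mean ** 2
    if noise_var is None:
        noise_var = np.mean(local_var)          # crude global noise estimate
    weight = np.clip((local_var - noise_var) / np.maximum(local_var, 1e-12), 0.0, 1.0)
    return local_mean + weight * (img - local_mean)

# Example: despeckle a synthetic amplitude image corrupted by multiplicative noise.
rng = np.random.default_rng(0)
clean = np.outer(np.linspace(1, 4, 128), np.linspace(1, 4, 128))
speckled = clean * rng.gamma(shape=4.0, scale=0.25, size=clean.shape)
filtered = lee_filter(speckled, window=7)
print(float(np.std(speckled - clean)), float(np.std(filtered - clean)))
```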
Procedia PDF Downloads 14
682 Aerial Survey and 3D Scanning Technology Applied to the Survey of Cultural Heritage of Su-Paiwan, an Aboriginal Settlement, Taiwan
Authors: April Hueimin Lu, Liangj-Ju Yao, Jun-Tin Lin, Susan Siru Liu
Abstract:
This paper discusses the application of aerial survey technology and 3D laser scanning technology in the surveying and mapping of the settlements and slate houses of the old Taiwanese aborigines. The relics of the old Taiwanese aborigines, with thousands of years of history, are widely distributed in the deep mountains of Taiwan, over a vast area with inconvenient transportation. When constructing basic data on cultural assets, it is necessary to apply new technology to carry out efficient and accurate settlement mapping. In this paper, taking old Paiwan as an example, an aerial survey of a settlement of about 5 hectares and 3D laser scanning of a slate house were carried out. The obtained orthophoto image was used as an important basis for drawing the settlement map. The 3D landscape data of topography and buildings derived from the aerial survey are important for subsequent preservation planning, while the 3D scan of the building provides a more detailed record of architectural forms and materials. The 3D settlement data from the aerial survey can be further applied to a 3D virtual model and animation of the settlement for virtual presentation. The information from the 3D scanning of the slate house can also be used for digital archives and data queries through network resources. The results of this study show that, in large-scale settlement surveys, aerial survey technology can be used to construct the topography of settlements with buildings and spatial information of the landscape, complemented by the application of 3D scanning for small-scale records of individual buildings. This application of 3D technology greatly increases the efficiency and accuracy of the survey and mapping of aboriginal settlements and is very helpful for further preservation planning and rejuvenation of aboriginal cultural heritage.
Keywords: aerial survey, 3D scanning, aboriginal settlement, settlement architecture cluster, ecological landscape area, old Paiwan settlements, slate house, photogrammetry, SfM, MVS, point cloud, SIFT, DSM, 3D model
Procedia PDF Downloads 173
681 Identification of Blood Biomarkers Unveiling Early Alzheimer's Disease Diagnosis Through Single-Cell RNA Sequencing Data and Autoencoders
Authors: Hediyeh Talebi, Shokoofeh Ghiam, Changiz Eslahchi
Abstract:
Traditionally, Alzheimer’s disease research has focused on genes with significant fold changes, potentially neglecting subtle but biologically important alterations. Our study introduces an integrative approach that highlights genes crucial to underlying biological processes, regardless of their fold change magnitude. Alzheimer's single-cell RNA-seq data related to peripheral blood mononuclear cells (PBMC) were extracted from the Gene Expression Omnibus (GEO). After quality control, normalization, scaling, batch effect correction, and clustering, differentially expressed genes (DEGs) were identified with adjusted p-values less than 0.05. These DEGs were categorized based on cell type, resulting in four datasets, each corresponding to a distinct cell type. To distinguish between cells from healthy individuals and those with Alzheimer's, an adversarial autoencoder with a classifier was employed. This allowed for the separation of healthy and diseased samples. To identify the most influential genes in this classification, the weight matrices of the network, which include the encoder and classifier components, were multiplied, and the analysis focused on the top 20 genes. The analysis revealed that while some of these genes exhibit a high fold change, others do not. These genes, which may be overlooked by previous methods due to their low fold change, were shown to be significant in our study. The findings highlight the critical role of genes with subtle alterations in diagnosing Alzheimer's disease, a facet frequently overlooked by conventional methods. These genes demonstrate remarkable discriminatory power, underscoring the need to integrate biological relevance with statistical measures in gene prioritization. This integrative approach enhances our understanding of the molecular mechanisms in Alzheimer’s disease and provides a promising direction for identifying potential therapeutic targets.
Keywords: Alzheimer's disease, single-cell RNA-seq, neural networks, blood biomarkers
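The gene-ranking step, multiplying the encoder and classifier weight matrices and reading off the genes with the largest combined influence, can be sketched as below. The matrix shapes, the use of absolute values and the sum across classes are illustrative assumptions; the actual model is an adversarial autoencoder rather than the random matrices used here.

```python
import numpy as np

def rank_genes_by_weight_product(W_encoder, W_classifier, gene_names, top_k=20):
    """Chain the encoder weights (genes x latent) with the classifier weights
    (latent x classes) and score each gene by the magnitude of its combined
    influence on the class output."""
    combined = W_encoder @ W_classifier                # shape: (n_genes, n_classes)
    scores = np.abs(combined).sum(axis=1)              # aggregate influence across classes
    top_idx = np.argsort(-scores)[:top_k]
    return [(gene_names[i], float(scores[i])) for i in top_idx]

# Toy example: 100 genes, 16 latent dimensions, 2 classes (healthy vs. Alzheimer's).
rng = np.random.default_rng(42)
genes = [f"GENE_{i}" for i in range(100)]
W_enc = rng.normal(size=(100, 16))
W_clf = rng.normal(size=(16, 2))
for gene, score in rank_genes_by_weight_product(W_enc, W_clf, genes, top_k=5):
    print(gene, round(score, 3))
```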
Procedia PDF Downloads 67
680 Improving the Penalty-free Multi-objective Evolutionary Design Optimization of Water Distribution Systems
Authors: Emily Kambalame
Abstract:
Water distribution networks necessitate many investments for construction, prompting researchers to seek cost reduction and efficient design solutions. Optimization techniques are employed in this regard to address these challenges. In this context, the penalty-free multi-objective evolutionary algorithm (PFMOEA) coupled with pressure-dependent analysis (PDA) was utilized to develop a multi-objective evolutionary search for the optimization of water distribution systems (WDSs). The aim of this research was to find out if the computational efficiency of the PFMOEA for WDS optimization could be enhanced. This was done by applying real coding representation and retaining different percentages of feasible and infeasible solutions close to the Pareto front in the elitism step of the optimization. Two benchmark network problems, namely the Two-looped and Hanoi networks, were utilized in the study. A comparative analysis was then conducted to assess the performance of the real-coded PFMOEA in relation to other approaches described in the literature. The algorithm demonstrated competitive performance for the two benchmark networks by implementing real coding. The real-coded PFMOEA achieved the novel best-known solutions ($419,000 and $6.081 million) and a zero-pressure deficit for the two networks, requiring fewer function evaluations than the binary-coded PFMOEA. In previous PFMOEA studies, elitism applied a default retention of 30% of the least-cost feasible solutions while excluding all infeasible solutions. It was found in this study that by replacing 10% and 15% of the feasible solutions with infeasible ones that are close to the Pareto front with minimal pressure deficit violations, the computational efficiency of the PFMOEA was significantly enhanced. The configuration of 15% feasible and 15% infeasible solutions outperformed other retention allocations by identifying the optimal solution with the fewest function evaluations.
Keywords: design optimization, multi-objective evolutionary, penalty-free, water distribution systems
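The modified elitism step, keeping a share of low-cost feasible designs alongside a share of infeasible designs whose pressure-deficit violations are minimal, can be sketched as below. The ranking rule, the toy cost and deficit values, and treating a deficit of zero or less as feasible are illustrative assumptions, not the PFMOEA implementation.

```python
import numpy as np

def mixed_elitism(costs, pressure_deficits, feasible_frac=0.15, infeasible_frac=0.15):
    """Retain the cheapest feasible designs plus the near-feasible infeasible designs
    (smallest pressure-deficit violations) as elites for the next generation."""
    costs = np.asarray(costs, dtype=float)
    deficits = np.asarray(pressure_deficits, dtype=float)
    n = len(costs)
    feasible = np.where(deficits <= 0.0)[0]
    infeasible = np.where(deficits > 0.0)[0]

    n_feas = min(int(round(feasible_frac * n)), len(feasible))
    n_infeas = min(int(round(infeasible_frac * n)), len(infeasible))

    best_feasible = feasible[np.argsort(costs[feasible])][:n_feas]
    # Rank infeasible designs by deficit first, then cost, so only slightly
    # violating, low-cost designs near the Pareto front survive.
    order = np.lexsort((costs[infeasible], deficits[infeasible]))
    best_infeasible = infeasible[order][:n_infeas]
    return np.concatenate([best_feasible, best_infeasible])

# Toy population of pipe-network designs: a cost and a pressure deficit (<= 0 means feasible).
rng = np.random.default_rng(7)
cost = rng.uniform(4e5, 8e5, size=60)
deficit = rng.normal(loc=-1.0, scale=2.0, size=60)
elite_idx = mixed_elitism(cost, deficit)
print(elite_idx)
```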
Procedia PDF Downloads 63
679 An Analysis of the Representation of the Translator and Translation Process into Brazilian Social Networking Groups
Authors: Érica Lima
Abstract:
In the digital era, in which we face an avalanche of information, it is nothing new that the Internet has brought new modes of communication and knowledge access. Characterized by a multiplicity of discourses, opinions, beliefs and cultures, the web is a space of political-ideological dimensions where people (who often do not know each other) interact and create representations, deconstruct stereotypes, and redefine identities. Currently, the translator needs to be able to deal with digital spaces ranging from specific software to social media, which inevitably impact his or her professional life. One of the most impactful ways of being seen in cyberspace is participation in social networking groups. In addition to their ability to disseminate information among participants, social networking groups allow significant personal and social exposure. Such exposure is due to the visibility each participant achieves not only on their personal profile page, but also in each comment or post the person makes in the groups. The objective of this paper is to study the representations of translators and the translation process on the Internet, more specifically in publications in two highly influential Brazilian groups on Facebook: "Translators/Interpreters" and "Translators, Interpreters and Curious". These chosen groups represent the changes the network has brought to the profession, including the way translators are seen and see themselves. The analyzed posts allowed a reading of what common sense seems to think about the translator as opposed to what translators seem to think about themselves as a professional class. The results of the analysis lead to the conclusion that these two positions are antagonistic and sometimes represent a conflict of interests: on the one hand, society in general considers the translator's work something easy, and therefore not deserving of good remuneration; on the other hand, translators know how complex a translation process is and how much it takes to be a good professional. The results also reveal that social networking sites such as Facebook provide more visibility, but it takes a more active role from the translator to achieve a greater appreciation of the profession and more recognition of the role of the translator, especially in the face of the increasing development of automatic translation programs.
Keywords: Facebook, social representation, translation, translator
Procedia PDF Downloads 149
678 The Systems Biology Verification Endeavor: Harness the Power of the Crowd to Address Computational and Biological Challenges
Authors: Stephanie Boue, Nicolas Sierro, Julia Hoeng, Manuel C. Peitsch
Abstract:
Systems biology relies on large numbers of data points and sophisticated methods to extract biologically meaningful signal and mechanistic understanding. For example, analyses of transcriptomics and proteomics data enable researchers to gain insights into the molecular differences in tissues exposed to diverse stimuli or test items. Whereas the interpretation of endpoints specifically measuring a mechanism is relatively straightforward, the interpretation of big data is more complex and would benefit from comparing results obtained with diverse analysis methods. The sbv IMPROVER project was created to implement solutions to verify systems biology data, methods, and conclusions. Computational challenges leveraging the wisdom of the crowd allow benchmarking methods for specific tasks, such as signature extraction and/or sample classification. Four challenges have already been successfully conducted and confirmed that the aggregation of predictions often leads to better results than individual predictions and that methods perform best in specific contexts. Whenever the scientific question of interest does not have a gold standard, but may greatly benefit from the scientific community coming together to discuss approaches and results, datathons are set up. The inaugural sbv IMPROVER datathon was held in Singapore on 23-24 September 2016. It allowed bioinformaticians and data scientists to consolidate their ideas and work on the most promising methods as teams, after having initially reflected on the problem on their own. The outcome is a set of visualization and analysis methods that will be shared with the scientific community via the Garuda platform, an open connectivity platform that provides a framework to navigate through different applications, databases and services in biology and medicine. We will present the results we obtained when analyzing data with our network-based method, and introduce a datathon that will take place in Japan to encourage the analysis of the same datasets with other methods to allow for the consolidation of conclusions.
Keywords: big data interpretation, datathon, systems toxicology, verification
Procedia PDF Downloads 278
677 Slave Museums and a Site of Democratic Pedagogy: Engagement, Healing and Tolerance
Authors: Elaine Stavro
Abstract:
In our present world, where acts of incivility, intolerance and anger towards minority communities are on the rise, the ways in which museum practices cultivate ethical generosity are of interest. Democratic theorists differ as to how they believe respect can be generated through active participation. Allowing minority communities a role in determining which artifacts will be displayed and how they will be displayed has been an important step in generating respect. In addition, the rise of indigenous museums and slave museums, and of curators who represent these communities, contributes to the communication of their history of oppression. These institutional practices have been supplemented by the handling of objects, recognition stories and multisensory exhibitions. Psychoanalytic object relations theorists believe that the handling of objects, when amenable objects are paired with responsive listeners, will trigger the expression of anomie, alienation and traumatizing experiences. Not only memorializing but engaging with one's loss in a very personal way can facilitate the process of mourning. Manchester Museum (UK) gathered together Somalian refugees who, in the process of handling their own objects and those offered at the museum, began to tell their stories. Democratic theorists (especially affect theorists, vital materialists and Actor Network theorists) believe that things can be social actants: material objects have agentic capacities that humans should align with. In doing so, they challenge social constructivism, which attributes power to interpreted things, but, like the constructivists, they assume that an openness or responsiveness to Otherness can be cultivated. Rich sensory experiences, corporeal engagement (devices that involve bodily movement or objects that involve handling) and auditory experiences (songs) all contribute to improving one's responsiveness and openness to Others. This paper will focus specifically on slave museums and exhibits in the UK, the USA and South Africa to explore and evaluate their democratic strategies for cultivating tolerant practices via the various democratic avenues outlined above.
Keywords: democratic pedagogy, slave exhibitions, affect/emotion, object handling
Procedia PDF Downloads 461
676 Numerical Simulation of a Single Cell Passing through a Narrow Slit
Authors: Lanlan Xiao, Yang Liu, Shuo Chen, Bingmei Fu
Abstract:
Most cancer-related deaths are due to metastasis. Metastasis is a complex, multistep process including the detachment of cancer cells from the primary tumor and their migration to distant target organs through the blood and/or lymphatic circulation. During hematogenous metastasis, the emigration of tumor cells from the blood stream through the vascular wall into the tissue involves arrest in the microvasculature, adhesion to the endothelial cells forming the microvessel wall and transmigration into the tissue through the endothelial barrier, termed extravasation. The narrow slit between the endothelial cells that line the microvessel wall is the principal pathway for tumor cell extravasation into the surrounding tissue. To understand this crucial step in tumor hematogenous metastasis, we used the Dissipative Particle Dynamics method to numerically investigate an individual cell passing through a narrow slit. The cell membrane was simulated by a spring-based network model which can separate the internal cytoplasm from the surrounding fluid. The effects of the cell elasticity, cell shape, cell surface area increase, and slit size on the cell transmigration through the slit were investigated. Under a fixed driving force, a cell with higher elasticity can be elongated more and pass faster through the slit. When the slit width decreases to 2/3 of the cell diameter, the spherical cell becomes jammed despite reducing its elasticity modulus by 10 times. However, transforming the cell from a spherical to an ellipsoidal shape and increasing the cell surface area by only 3% can enable the cell to pass the narrow slit. Therefore, the cell shape and surface area increase play a more important role than the cell elasticity in cell passage through the narrow slit. In addition, the simulation results indicate that the cell migration velocity decreases during entry but increases during exit of the slit, which is qualitatively in agreement with experimental observations.
Keywords: dissipative particle dynamics, deformability, surface area increase, cell migration
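For readers unfamiliar with Dissipative Particle Dynamics, the standard pairwise force (conservative, dissipative and random contributions) can be sketched for a single particle pair as below. The coefficients, cutoff and time step are generic textbook-style values chosen for illustration; the spring-network membrane, the slit geometry and the boundary conditions of the actual simulation are not reproduced.

```python
import numpy as np

def dpd_pair_force(r_i, r_j, v_i, v_j, a=25.0, gamma=4.5, kBT=1.0, rc=1.0, dt=0.01, rng=None):
    """Standard DPD pair force: conservative + dissipative + random contributions
    for one particle pair within the cutoff radius rc."""
    rng = rng or np.random.default_rng()
    r_vec = r_i - r_j
    r = np.linalg.norm(r_vec)
    if r >= rc or r < 1e-12:
        return np.zeros(3)
    e = r_vec / r                       # unit vector from j to i
    w = 1.0 - r / rc                    # weight function
    sigma = np.sqrt(2.0 * gamma * kBT)  # fluctuation-dissipation relation
    f_cons = a * w * e
    f_diss = -gamma * (w ** 2) * np.dot(e, v_i - v_j) * e
    f_rand = sigma * w * rng.standard_normal() * e / np.sqrt(dt)
    return f_cons + f_diss + f_rand

# Example: force on particle i from a nearby particle j.
f = dpd_pair_force(np.array([0.2, 0.0, 0.0]), np.zeros(3),
                   np.array([0.0, 0.1, 0.0]), np.zeros(3))
print(f)
```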
Procedia PDF Downloads 337
675 Development of Doctoral Education in Armenia (1990 - 2023)
Authors: Atom Mkhitaryan, Astghik Avetisyan
Abstract:
We analyze the development of doctoral education in Armenia since 1990 and the associated management processes. The education and training of highly qualified personnel are increasingly seen as a fundamental platform that ensures the development of the state. Reforming the national institute for doctoral studies (aspirantura) is aimed at improving the quality of human resources in science, optimizing research topics in accordance with the priority areas of development of science and technology, increasing publication and innovative activities, bringing national science and research closer to the world level and achieving international recognition. We present the number of dissertations defended in Armenia during the last 30 years, and the dynamics and main trends in the development of the academic degree awarding system. We discuss the possible impact of reforming the system of training and certification of highly qualified personnel on the organization of third-level doctoral education (doctoral schools) and specialized/dissertation councils in Armenia. The results of a SWOT analysis of doctoral education and academic degree awarding processes in Armenia are shown. The article presents the main activities and projects aimed at using the advantages and strong points of the National Academy network in order to improve the quality of doctoral education and training. The paper explores the mechanisms of organizational, methodological and infrastructural support for the research and innovation activities of doctoral students and young scientists. Approaches to organizing strong networking between research institutes and foreign universities for the training and certification of highly qualified personnel are also suggested. The authors define the role of ISEC in the management of doctoral studies and the establishment of a competitive third-level education for the sphere of research and development in Armenia.
Keywords: doctoral studies, academic degree, PhD, certification, highly qualified personnel, dissertation, research and development, innovation, networking, management of doctoral school
Procedia PDF Downloads 66
674 Urban Corridor Management Strategy Based on Intelligent Transportation System
Authors: Sourabh Jain, Sukhvir Singh Jain, Gaurav V. Jain
Abstract:
Intelligent Transportation System (ITS) is the application of technology to developing a user-friendly transportation system for urban areas in developing countries. The goal of urban corridor management using ITS in road transport is to achieve improvements in mobility, safety, and the productivity of the transportation system within the available facilities through the integrated application of advanced monitoring, communications, computer, display, and control process technologies, both in the vehicle and on the road. This paper reviews past studies of ITS applications that have been successfully deployed in urban corridors in India and abroad, and describes the current scenario and the methodology used for the planning, design, and operation of traffic management systems. It also presents an evaluation of the performance of the 27.4 km long study corridor, which has eight intersections and four flyovers and consists of six-lane and eight-lane divided road sections. Two categories of data were collected in February 2016: traffic data (traffic volume, spot speed, delay) and road characteristics data (number of lanes, lane width, bus stops, mid-block sections, intersections, flyovers). The instruments used for collecting the data were a video camera, radar gun, mobile GPS and stopwatch. The performance analysis included the identification of peak and off-peak hours, congestion and level of service (LOS) at mid-block sections, and delay, followed by the plotting of speed contours and the recommendation of urban corridor management strategies. From the analysis, it is found that ITS-based urban corridor management strategies will be useful in reducing congestion, fuel consumption and pollution so as to provide comfort and efficiency to users. The paper presents urban corridor management strategies based on sensors incorporated both in vehicles and on the roads.
Keywords: congestion, ITS strategies, mobility, safety
Procedia PDF Downloads 445
673 Characteristics of Aerosols Properties Over Different Desert-Influenced Aeronet Sites
Authors: Abou Bakr Merdji, Alaa Mhawish, Xiaofeng Xu, Chunsong Lu
Abstract:
The characteristics of the optical and microphysical properties of aerosols near deserts are analyzed using 11 AErosol RObotic NETwork (AERONET) sites located in 6 major desert areas (the Sahara, Arabia, Thar, Karakum, Taklamakan, and Gobi) between 1998 and 2021. The regional means of Aerosol Optical Depth (AOD), with coarse AOD (CAOD) in parentheses, are 0.44 (0.187), 0.38 (0.26), 0.35 (0.24), 0.23 (0.11), 0.20 (0.14) and 0.10 (0.05) in the Thar, Arabian, Sahara, Karakum, Taklamakan and Gobi Deserts, respectively, while the opposite ordering holds for the Ångström Exponent (AE) and Fine Mode Fraction (FMF). Higher extinctions are associated with larger particles (dust) over all the main desert regions. This is shown by the almost inversely proportional variations of AOD and CAOD compared with AE and FMF. Coarse particles contribute the most to the total AOD over the Sahara Desert compared to the other deserts all year round. Related to the seasonality of dust events, the maximum AOD (CAOD) generally appears in summer and spring, while the minimum is in winter. The mean values of absorbing AOD (AAOD), absorbing AE (AAE), and Single Scattering Albedo (SSA) for all sites ranged from 0.017 to 0.037, from 1.16 to 2.81 and from 0.844 to 0.944, respectively. Generally, the highest absorbing aerosol load is observed over the Thar, followed by the Karakum, the Sahara, the Gobi, and then the Taklamakan Deserts, while the largest absorbing particles are observed in the Sahara followed by Arabia, Thar, Karakum and Gobi, with the smallest over the Taklamakan Desert. Similar absorption qualities are observed over the Sahara, Arabia, Thar, and Karakum Deserts, with SSA values varying between 0.90 and 0.91, whereas the most and least absorbing particles are observed at the Taklamakan and the Gobi Deserts, respectively. The seasonal AAODs are distinctly different over the deserts, with parts of the Sahara and Arabia and the Dalanzadgad site experiencing the maximum in summer, the southern Sahara, western Arabia, Jaipur, and Dushanbe in winter, and eastern Arabia and Muztagh Ata in autumn. AAOD and SSA spectra are consistent with the dust-dominated conditions that resulted from aerosol typing (dust and polluted dust) at most deserts, with a possible presence of absorbing particles other than dust at the Arabian, Taklamakan, and Gobi Desert sites.
Keywords: Sahara, AERONET, desert, dust belt, aerosols, optical properties
Procedia PDF Downloads 85
672 Engaging the Terrorism Problematique in Africa: Discursive and Non-Discursive Approaches to Counter Terrorism
Authors: Cecil Blake, Tolu Kayode-Adedeji, Innocent Chiluwa, Charles Iruonagbe
Abstract:
National, regional and international security threats have dominated the twenty-first century thus far. Insurgencies that utilize "terrorism" as their primary strategy pose the most serious threat to global security. States in turn adopt terrorist strategies to resist and even defeat insurgents who invoke the legitimacy of statehood to justify their actions. In short, the era is dominated by the use of terror tactics by state and non-state actors. Globally, there is a powerful network of groups involved in insurgencies using Islam as the bastion for their cause. In Africa, Boko Haram, Al Shabaab and Al Qaeda in the Maghreb represent Islamic groups utilizing terror strategies and tactics to prosecute their wars. The task at hand is to discover and use multiple ways of handling the present security threats, including novel approaches to policy formulation, implementation, monitoring and evaluation that pay significant attention to the important role of culture and of communication strategies germane to discursive means of conflict resolution. In order to achieve this, the proposed research addresses, inter alia, the root causes of insurgencies that predicate their mission on Islamic tenets, particularly in Africa; discursive and non-discursive counter-terrorism approaches fashioned by African governments and by continental supra-national and regional organizations; recruitment strategies of the major non-state actors in Africa that rely solely on terrorist strategies and tactics; and the sources of finance of the groups under study. A major anticipated outcome of this research is a contribution to answers that would lead to the much-needed stability required for development in African countries experiencing insurgencies carried out through patterned terror strategies and tactics. The nature of the research requires the use of triangulation as the methodological tool.
Keywords: counter-terrorism, discourse, Nigeria, security, terrorism
Procedia PDF Downloads 486
671 American Sign Language Recognition System
Authors: Rishabh Nagpal, Riya Uchagaonkar, Venkata Naga Narasimha Ashish Mernedi, Ahmed Hambaba
Abstract:
The rapid evolution of technology in the communication sector continually seeks to bridge the gap between different communities, notably between the deaf community and the hearing world. This project develops a comprehensive American Sign Language (ASL) recognition system, leveraging the advanced capabilities of convolutional neural networks (CNNs) and vision transformers (ViTs) to interpret and translate ASL in real time. The primary objective of this system is to provide an effective communication tool that enables seamless interaction through accurate sign language interpretation. The architecture of the proposed system integrates dual networks: VGG16 for precise spatial feature extraction and vision transformers for contextual understanding of the sign language gestures. The system processes live input, extracting critical features through these sophisticated neural network models, and combines them to enhance gesture recognition accuracy. This integration facilitates a robust understanding of ASL by capturing detailed nuances and broader gesture dynamics. The system is evaluated through a series of tests that measure its efficiency and accuracy in real-world scenarios. Results indicate a high level of precision in recognizing diverse ASL signs, substantiating the potential of this technology in practical applications. Challenges such as enhancing the system’s ability to operate in varied environmental conditions and further expanding the dataset for training were identified and discussed. Future work will refine the model’s adaptability and incorporate haptic feedback to enhance the interactivity and richness of the user experience. This project demonstrates the feasibility of an advanced ASL recognition system and lays the groundwork for future innovations in assistive communication technologies.
Keywords: sign language, computer vision, vision transformer, VGG16, CNN
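A dual-backbone fusion of the kind described, CNN features concatenated with ViT features and fed to a small classification head, can be sketched with stock torchvision backbones. The pooling, the head sizes and the 26-class output below are assumptions for illustration, not the authors' exact architecture or training setup.

```python
import torch
import torch.nn as nn
import torchvision.models as models

class DualStreamASL(nn.Module):
    """Concatenate spatial features from VGG16 with contextual features from a ViT-B/16
    and classify the fused representation into ASL sign classes."""
    def __init__(self, num_classes=26):
        super().__init__()
        vgg = models.vgg16(weights=None)
        self.cnn = nn.Sequential(vgg.features, nn.AdaptiveAvgPool2d(1), nn.Flatten())  # -> 512-dim
        self.vit = models.vit_b_16(weights=None)
        self.vit.heads = nn.Identity()                                                 # -> 768-dim class token
        self.classifier = nn.Sequential(nn.Linear(512 + 768, 256), nn.ReLU(),
                                        nn.Linear(256, num_classes))

    def forward(self, x):
        fused = torch.cat([self.cnn(x), self.vit(x)], dim=1)
        return self.classifier(fused)

# Forward a dummy batch of 224x224 RGB frames to check shapes.
model = DualStreamASL()
logits = model(torch.randn(2, 3, 224, 224))
print(logits.shape)   # torch.Size([2, 26])
```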
Procedia PDF Downloads 44
670 Enhancement of Long Term Peak Demand Forecast in Peninsular Malaysia Using Hourly Load Profile
Authors: Nazaitul Idya Hamzah, Muhammad Syafiq Mazli, Maszatul Akmar Mustafa
Abstract:
The peak demand forecast is crucial for identifying the future generation plant-up needed in the long-term capacity planning analysis for Peninsular Malaysia, as well as for transmission and distribution network planning activities. Currently, the peak demand forecast (in megawatts) is derived from the generation forecast by using a load factor assumption. However, forecasts using this method have underperformed due to structural changes in the economy, emerging trends and weather uncertainty. The dynamic changes in these drivers will result in many possible outcomes of peak demand for Peninsular Malaysia. This paper looks into an independent model of peak demand forecasting. The model begins with the selection of driver variables to capture long-term growth. This selection and construction of variables, which include econometric, emerging trend and energy variables, will have an impact on the peak forecast. The actual framework begins with the development of system energy and load shape forecasts by using the system’s hourly data. The shape forecast represents the system shape assuming all embedded technologies and use patterns continue into the future. This is necessary to identify movements in the peak hour or changes in the system load factor. The next step is developing the peak forecast, which involves an iterative process to explore model structures and variables. The final step is combining the system energy, shape, and peak forecasts into the hourly system forecast and then modifying it with the forecast adjustments. Forecast adjustments include, among others, sales forecasts for electric vehicles, solar and other adjustments. The framework results in an hourly forecast that captures growth, peak usage and new technologies. The advantage of this approach compared to the current methodology is that the peaks capture the impacts of new technologies that change the load shape.
Keywords: hourly load profile, load forecasting, long term peak demand forecasting, peak demand
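The final combination step, scaling a normalized hourly load shape by the forecast annual energy and then applying hourly adjustments before reading off the peak, can be sketched as below. The synthetic shape, the EV adjustment and the 130,000 GWh figure are illustrative assumptions, not the actual Peninsular Malaysia forecast inputs.

```python
import numpy as np

def hourly_forecast_from_energy_and_shape(annual_energy_gwh, hourly_shape, adjustments_mw=None):
    """Scale a normalized hourly load shape so it integrates to the forecast annual
    energy, apply optional hourly adjustments (EV, solar, ...), and read off the peak."""
    shape = np.asarray(hourly_shape, dtype=float)
    shape = shape / shape.sum()                       # normalize so the shape sums to 1
    hourly_mw = shape * annual_energy_gwh * 1000.0    # GWh -> MWh; 1 MWh over one hour = 1 MW
    if adjustments_mw is not None:
        hourly_mw = hourly_mw + np.asarray(adjustments_mw, dtype=float)
    peak_mw = hourly_mw.max()
    peak_hour = int(hourly_mw.argmax())
    return hourly_mw, peak_mw, peak_hour

# Toy example: a flat year with a daily double-peak pattern and a small evening EV adjustment.
hours = np.arange(8760)
shape = 1.0 + 0.3 * np.sin(2 * np.pi * (hours % 24) / 24) + 0.1 * np.sin(4 * np.pi * (hours % 24) / 24)
ev_adjustment = np.where((hours % 24) >= 20, 150.0, 0.0)   # 150 MW of evening EV charging
hourly, peak, peak_hour = hourly_forecast_from_energy_and_shape(130_000, shape, ev_adjustment)
print(round(peak, 1), "MW at hour", peak_hour)
```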
Procedia PDF Downloads 174
669 The Use of Space Syntax in Urban Transportation Planning and Evaluation: Limits and Potentials
Authors: Chuan Yang, Jing Bie, Yueh-Lung Lin, Zhong Wang
Abstract:
Transportation planning is an integrative academic discipline combining research and practice, with the aim of improving mobility and accessibility at both the strategic level of policy-making and the operational dimension of practical planning. Transportation planning can build the linkage between traffic and social development goals, for instance, economic benefits and environmental sustainability. Transportation planning analysis and evaluation tend to apply empirical quantitative approaches guided by fundamental principles such as efficiency, equity, safety, and sustainability. Space syntax theory has been applied to the analysis of the spatial distribution of pedestrian movement and vehicle flow; however, little has been written about its application in transportation planning. The correlation between space syntax variables and real-world observations has shown that urban configuration has a significant effect on urban dynamics, for instance, land value, building density, traffic and crime. This research aims to explore the potential of applying the space syntax methodology to evaluate urban transportation planning by studying the effects of urban configuration on the transportation performance of cities. Through a literature review, this paper discusses the effects that urban configurations with different degrees of integration and accessibility have on three elementary components of transportation planning - transportation efficiency, transportation safety, and economic agglomeration development - via intensifying and stabilising the natural movement generated by the street network. The potential and limits of space syntax theory for studying the performance of urban transportation and transportation planning are then discussed. In practical terms, this research will help future research explore the effects of urban design on transportation performance, and identify which patterns of urban street networks allow for the most efficient and safe transportation performance with higher economic benefits.
Keywords: transportation planning, space syntax, economic agglomeration, transportation efficiency, transportation safety
Procedia PDF Downloads 198
668 Development of an Atmospheric Radioxenon Detection System for Nuclear Explosion Monitoring
Authors: V. Thomas, O. Delaune, W. Hennig, S. Hoover
Abstract:
Measurement of the radioactive isotopes of atmospheric xenon is used to detect, locate and identify confined nuclear tests as part of the Comprehensive Nuclear-Test-Ban Treaty (CTBT). In this context, the French Alternative Energies and Atomic Energy Commission (CEA) has developed a fixed device, the SPALAX process, to continuously measure the concentration of these fission products. During its atmospheric transport, the radioactive xenon undergoes a significant dilution between the source point and the measurement station. Given the distances between the fixed stations located all over the globe, the typical volume activities measured are near 1 mBq m⁻³. To avoid the constraints induced by atmospheric dilution, the development of a mobile detection system is in progress; this system will allow on-site measurements in order to confirm or refute a suspicious measurement detected by a fixed station. Furthermore, this system will use the beta/gamma coincidence measurement technique in order to drastically reduce the environmental background (which masks such activities). The detector prototype consists of a gas cell surrounded by two large silicon wafers, coupled with two square NaI(Tl) detectors. The gas cell has a sample volume of 30 cm³ and the silicon wafers are 500 µm thick with an active surface area of 3600 mm². In order to minimize the leakage current, each wafer has been segmented into four independent silicon pixels. This cell is sandwiched between two low-background NaI(Tl) detectors (70x70x40 mm³ crystals). The expected Minimal Detectable Concentration (MDC) for each radioxenon isotope is on the order of 1-10 mBq m⁻³. Three 4-channel digital acquisition modules (Pixie-NET) are used to process all the signals. Time synchronization is ensured by a dedicated PTP network using the IEEE 1588 Precision Time Protocol. We would like to present this system from its simulation to the laboratory tests.
Keywords: beta/gamma coincidence technique, low level measurement, radioxenon, silicon pixels
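The beta/gamma coincidence idea, counting only those silicon (beta) events that have a matching NaI(Tl) (gamma) event inside a short time window, can be sketched as below. The nanosecond timestamps, the 1 µs window and the synthetic event streams are illustrative assumptions, not the Pixie-NET acquisition logic.

```python
import numpy as np

def coincidence_pairs(beta_times_ns, gamma_times_ns, window_ns=1000):
    """Pair each beta event (silicon pixels) with the closest gamma event (NaI(Tl))
    that falls inside the coincidence window; unmatched events are treated as background."""
    gamma = np.sort(np.asarray(gamma_times_ns, dtype=float))
    pairs = []
    for t_beta in np.asarray(beta_times_ns, dtype=float):
        i = np.searchsorted(gamma, t_beta)
        candidates = [j for j in (i - 1, i) if 0 <= j < len(gamma)]
        if not candidates:
            continue
        j_best = min(candidates, key=lambda j: abs(gamma[j] - t_beta))
        if abs(gamma[j_best] - t_beta) <= window_ns:
            pairs.append((t_beta, gamma[j_best]))
    return pairs

# Toy event streams (nanosecond timestamps from the PTP-synchronized acquisition modules).
rng = np.random.default_rng(3)
true_decays = np.sort(rng.uniform(0, 1e9, size=50))
beta = true_decays + rng.normal(0, 50, size=50)             # beta signals in the silicon pixels
gamma = np.concatenate([true_decays + rng.normal(0, 50, 50),
                        rng.uniform(0, 1e9, 200)])           # correlated gammas + uncorrelated background
print(len(coincidence_pairs(beta, gamma)))                   # ~50 true coincidences survive
```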
Procedia PDF Downloads 126
667 Radar Track-based Classification of Birds and UAVs
Authors: Altilio Rosa, Chirico Francesco, Foglia Goffredo
Abstract:
In recent years, the number of Unmanned Aerial Vehicles (UAVs) has significantly increased. The rapid development of commercial and recreational drones makes them an important part of our society. Despite the growing list of their applications, these vehicles pose a huge threat to civil and military installations: detection, classification and neutralization of such flying objects have become an urgent need. Radar is an effective remote sensing tool for detecting and tracking flying objects, but scenarios characterized by a high number of tracks related to flying birds make the drone detection task especially challenging: the operator's PPI is cluttered with a huge number of potential threats and the reaction time can be severely affected. Compared to UAVs, flying birds show similar velocities, radar cross-sections and, in general, similar characteristics. Starting from the absence of a single feature able to distinguish UAVs from birds, this paper uses a multiple-feature approach in which an original feature selection technique is developed to feed binary classifiers trained to distinguish birds and UAVs. Radar tracks acquired in the field, related to different UAVs and birds performing various trajectories, were used to extract specifically designed target movement-related features based on velocity, trajectory and signal strength. An optimization strategy based on a genetic algorithm is also introduced to select the optimal subset of features and to estimate the performance of several classification algorithms (neural network, SVM, logistic regression, etc.) both in terms of the number of selected features and the misclassification error. Results show that the proposed methods are able to reduce the dimension of the data space and to remove almost all non-drone false targets with a suitable classification accuracy (higher than 95%).
Keywords: birds, classification, machine learning, UAVs
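The genetic-algorithm feature selection wrapped around a classifier can be sketched with scikit-learn as below. The synthetic track features, the logistic-regression wrapper, the crossover/mutation rates and the feature-count penalty are illustrative assumptions, not the authors' implementation.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

def ga_feature_selection(X, y, pop_size=20, generations=15, penalty=0.01, seed=0):
    """Evolve binary masks over the track features; fitness rewards cross-validated
    accuracy and lightly penalizes the number of selected features."""
    rng = np.random.default_rng(seed)
    n_features = X.shape[1]
    pop = rng.integers(0, 2, size=(pop_size, n_features))

    def fitness(mask):
        if mask.sum() == 0:
            return 0.0
        clf = LogisticRegression(max_iter=1000)
        acc = cross_val_score(clf, X[:, mask.astype(bool)], y, cv=3).mean()
        return acc - penalty * mask.sum()

    for _ in range(generations):
        scores = np.array([fitness(m) for m in pop])
        parents = pop[np.argsort(-scores)[: pop_size // 2]]      # truncation selection
        children = []
        while len(children) < pop_size - len(parents):
            a, b = parents[rng.integers(len(parents), size=2)]
            cut = rng.integers(1, n_features)                    # one-point crossover
            child = np.concatenate([a[:cut], b[cut:]])
            flip = rng.random(n_features) < 0.05                 # mutation
            children.append(np.where(flip, 1 - child, child))
        pop = np.vstack([parents, children])
    best = pop[np.argmax([fitness(m) for m in pop])]
    return best.astype(bool)

# Toy stand-in for track features (velocity statistics, trajectory curvature, signal strength, ...).
X, y = make_classification(n_samples=300, n_features=12, n_informative=5, random_state=1)
mask = ga_feature_selection(X, y)
print("selected features:", np.where(mask)[0])
```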
Procedia PDF Downloads 224
666 Numerical Modelling of Shear Zone and Its Implications on Slope Instability at Letšeng Diamond Open Pit Mine, Lesotho
Authors: M. Ntšolo, D. Kalumba, N. Lefu, G. Letlatsa
Abstract:
Rock mass damage due to shear tectonic activity has been investigated largely in geoscience, where fluid transport is of major interest. However, little has been studied about the effect of shear zones on rock mass behavior and their impact on the stability of rock slopes. At the Letšeng Diamonds open pit mine in Lesotho, a shear zone composed of sheared kimberlite material, calcite and altered basalt forms part of the haul ramp into main pit cut 3. The alarming rate at which the shear zone is deteriorating has triggered concerns about both the local and global stability of the pit walls. This study presents the numerical modelling of the open pit slope affected by the shear zone at Letšeng Diamond Mine (LDM). The analysis involved the development of a slope model using the two-dimensional finite element code RS2. Interfaces between the shear zone and the host rock were represented by special joint elements incorporated in the finite element code. The analysis of structural geological mapping data provided a good platform for understanding the joint network. Major joints, including the shear zone, were incorporated into the model for simulation. This approach proved successful by demonstrating that continuum modelling can be used to evaluate the evolution of stresses, strain, plastic yielding and failure mechanisms that are consistent with field observations. Structural control due to the geological shear zone structure proved to be important in terms of its location, size and orientation. Furthermore, the model analyzed slope deformation and the possibility of sliding along the shear zone interfaces. This type of approach can predict shear zone deformation and failure mechanisms, so that mitigation strategies can be deployed for the safety of human lives and property within mine pits.
Keywords: numerical modeling, open pit mine, shear zone, slope stability
Procedia PDF Downloads 299
665 Measurement of Ionospheric Plasma Distribution over Myanmar Using Single Frequency Global Positioning System Receiver
Authors: Win Zaw Hein, Khin Sandar Linn, Su Su Yi Mon, Yoshitaka Goto
Abstract:
The Earth's ionosphere is located at altitudes of about 70 km to several hundred km above the ground, and it is composed of ions and electrons called plasma. This plasma delays GPS (Global Positioning System) signals and reflects radio waves. The delay along the signal path from the satellite to the receiver is directly proportional to the total electron content (TEC) of the plasma, and this delay is the largest error factor in satellite positioning and navigation. Sounding observations from the top and bottom of the ionosphere were long the standard way to investigate such ionospheric plasma. Recently, continuous monitoring of the TEC using networks of GNSS (Global Navigation Satellite System) observation stations, which are basically built for land survey, has been conducted in several countries. However, these stations are equipped with multi-frequency receivers that estimate the effect of the plasma delay using its frequency dependence, and the cost of multi-frequency receivers is much higher than that of single-frequency GPS receivers. In this research, a single-frequency GPS receiver was used instead of expensive multi-frequency GNSS receivers to measure ionospheric plasma variations such as the vertical TEC distribution. In this measurement, a single-frequency ublox GPS receiver was used to probe the ionospheric TEC. The observation site was located at Mandalay Technological University in Myanmar. In the method, the ionospheric TEC distribution is represented by polynomial functions of latitude and longitude, and the parameters of the functions are determined by least-squares fitting on pseudorange data obtained at a known location under a thin-layer ionosphere assumption. The validity of the method was evaluated against measurements obtained by the Japanese GNSS observation network called GEONET. The performance of the single-frequency GPS receiver measurements was compared with the results of dual-frequency measurements.
Keywords: ionosphere, global positioning system, GPS, ionospheric delay, total electron content, TEC
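The least-squares step, fitting a low-order polynomial for vertical TEC in latitude and longitude to slant delays through a thin-shell mapping function, can be sketched as below. The shell height, the polynomial order, the L1-only delay constant and the synthetic geometry are illustrative assumptions; carrier smoothing, receiver/satellite biases and the actual estimation details are omitted.

```python
import numpy as np

RE_KM, SHELL_KM = 6371.0, 350.0           # Earth radius and an assumed thin-shell height
K = 40.3 / (1575.42e6) ** 2               # ionospheric delay (m) per electron/m^2 at GPS L1

def mapping_function(elevation_rad):
    """Thin-shell obliquity factor converting vertical TEC to slant TEC."""
    sin_z = RE_KM / (RE_KM + SHELL_KM) * np.cos(elevation_rad)
    return 1.0 / np.sqrt(1.0 - sin_z ** 2)

def fit_vtec_polynomial(lat_ipp, lon_ipp, elevation, iono_delay_m):
    """Least-squares fit of VTEC(lat, lon) = a0 + a1*dlat + a2*dlon + a3*dlat*dlon
    from slant ionospheric delays measured by a single-frequency receiver."""
    dlat, dlon = lat_ipp - lat_ipp.mean(), lon_ipp - lon_ipp.mean()
    basis = np.column_stack([np.ones_like(dlat), dlat, dlon, dlat * dlon])
    slant_tec = iono_delay_m / K                       # electrons/m^2 along the path
    design = basis * mapping_function(elevation)[:, None]
    coeffs, *_ = np.linalg.lstsq(design, slant_tec, rcond=None)
    return coeffs, basis @ coeffs                      # coefficients and VTEC at each pierce point

# Synthetic example: 8 ionospheric pierce points near Mandalay with known slant delays.
rng = np.random.default_rng(5)
lat = 21.9 + rng.uniform(-3, 3, 8)
lon = 96.1 + rng.uniform(-3, 3, 8)
elev = np.deg2rad(rng.uniform(30, 80, 8))
true_vtec = (25.0 + 0.8 * (lat - lat.mean())) * 1e16   # electrons/m^2 (about 25 TECU)
delay = K * true_vtec * mapping_function(elev) + rng.normal(0, 0.02, 8)
coeffs, vtec_est = fit_vtec_polynomial(lat, lon, elev, delay)
print(vtec_est / 1e16)                                 # estimated VTEC in TECU
```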
Procedia PDF Downloads 139
664 Spatial Element Importance and Its Relation to Characters’ Emotions and Self Awareness in Michela Murgia’s Collection of Short Stories Tre Ciotole. Rituali per Un Anno DI Crisi
Authors: Nikica Mihaljević
Abstract:
Published in 2023, "Tre ciotole. Rituali per un anno di crisi" is a collection of short stories that are completely disconnected from one another with regard to topic and the representation of characters. However, these short stories complete and somehow continue each other in a particular way. The book happens to be Murgia's last, as the author died a few months after its publication, and it appears as a kind of summary of all her previous literary works. Namely, in her previous publications, Murgia had already stressed certain particularities of her characters, such as solitude and alienation from others, which are at the center of attention in this literary work, too. What all the stories in "Tre ciotole" have in common is that they deal with the characters' identity and self-awareness through the challenges they confront and the way the characters live their emotions in relation to the surrounding space. Although the challenges seem similar, the spatial element around the characters is different, yet it confirms each time that the characters' emotions, and, consequently, their self-awareness, can be formed and built only through their connection and relation to the surrounding space. In that way, the reader creates an imaginary network of complex relations among the characters of all the short stories, which gives him or her the opportunity to search for a way to break out of the usual patterns that tend to be repeated while the characters focus on building self-awareness. Namely, the characters face a crisis that they cannot control by inventing other types of crises that can be controlled. That happens to be their way of finding a way out of the identity crisis. Consequently, we expect the results of the analysis to point out the similarities in the depiction of characters across the short stories and to show the extent to which the characters' identities depend on the surrounding space in each short story. In this way, the results will highlight the importance of spatial elements in the characters' identity formation in Michela Murgia's short stories and also summarize the importance of Murgia's whole literary opus.
Keywords: Italian literature, short stories, environment, spatial element, emotions, characters
Procedia PDF Downloads 57
663 An Artificial Intelligence Framework to Forecast Air Quality
Authors: Richard Ren
Abstract:
Air pollution is a serious danger to international well-being and economies: it kills an estimated 7 million people every year and is projected to cost world economies $2.6 trillion by 2060 through sick days, healthcare costs, and reduced productivity. In the United States alone, poor air quality causes 60,000 premature deaths. For this reason, there is a crucial need to develop effective methods to forecast air quality, which can mitigate air pollution's detrimental public health effects and associated costs by helping people plan ahead and avoid exposure. The goal of this study is to propose an artificial intelligence framework for predicting future air quality based on timing variables (i.e., season, weekday/weekend), future weather forecasts, and past pollutant and air quality measurements. The proposed framework utilizes multiple machine learning algorithms (logistic regression, random forest, neural network) with different specifications and averages the results of the three top-performing models to reduce the inaccuracies, weaknesses, and biases of any individual model. Over time, the framework uses new data to self-adjust model parameters and increase prediction accuracy. To demonstrate its applicability, a prototype of the framework was created to forecast air quality in Los Angeles, California, using datasets from the RP4 weather data repository and EPA pollutant measurement data. The results showed good agreement between the framework's predictions and real-life observations, with an overall model accuracy of 92%. The combined model predicts more accurately than any of the individual models and reliably forecasts season-based variations in air quality levels. The top air quality predictor variables were identified by measuring the mean decrease in accuracy. This study proposed and demonstrated the efficacy of a comprehensive air quality prediction framework that leverages multiple machine learning algorithms to overcome individual algorithm shortcomings. Future enhancements should focus on expanding and testing a greater variety of modeling techniques within the proposed framework, testing the framework in different locations, and developing a platform to automatically publish future predictions as a web or mobile application. Accurate predictions from this artificial intelligence framework can in turn be used to save and improve lives by allowing individuals to protect their health and governments to implement effective pollution control measures.Keywords: air quality prediction, air pollution, artificial intelligence, machine learning algorithms
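As a rough illustration of the ensemble idea in this abstract, the sketch below averages the predicted probabilities of a logistic regression, a random forest, and a small neural network, then ranks predictors by mean decrease in accuracy via permutation importance. It is only a hedged sketch on synthetic data: the feature set, the 0.5 threshold, and the scikit-learn models are assumptions, not the authors' Los Angeles framework.

```python
# Minimal sketch of the ensemble described above: train several models,
# average their predicted probabilities, and rank predictors by mean
# decrease in accuracy. Synthetic data stands in for the real features.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.inspection import permutation_importance
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier

X, y = make_classification(n_samples=2000, n_features=8, random_state=0)  # stand-in for weather/pollutant features
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)

models = {
    "logistic": LogisticRegression(max_iter=1000),
    "forest": RandomForestClassifier(n_estimators=200, random_state=0),
    "mlp": MLPClassifier(hidden_layer_sizes=(32,), max_iter=1000, random_state=0),
}
for m in models.values():
    m.fit(X_tr, y_tr)

# Average the class-1 probabilities of the top-performing models (here: all three).
avg_proba = np.mean([m.predict_proba(X_te)[:, 1] for m in models.values()], axis=0)
ensemble_pred = (avg_proba >= 0.5).astype(int)
print("ensemble accuracy:", (ensemble_pred == y_te).mean())

# Rank predictors by mean accuracy drop under permutation (on the forest).
imp = permutation_importance(models["forest"], X_te, y_te, n_repeats=10, random_state=0)
print("top predictors by mean decrease in accuracy:", np.argsort(imp.importances_mean)[::-1][:3])
```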
Procedia PDF Downloads 130662 Verification and Validation of Simulated Process Models of KALBR-SIM Training Simulator
Authors: T. Jayanthi, K. Velusamy, H. Seetha, S. A. V. Satya Murty
Abstract:
Verification and validation of simulated process models is the most important phase of the simulator life cycle. Evaluating simulated process models with verification and validation techniques checks how closely each component model (in a simulated network) matches the real system/process in terms of dynamic behaviour under steady-state and transient conditions. The verification and validation process helps qualify the process simulator for its intended purpose, whether that is comprehensive training or design verification. In general, model verification is carried out by comparing simulated component characteristics with the original requirements to ensure that each step in the model development process completely incorporates all design requirements. Validation testing is performed by comparing the simulated process parameters to the actual plant process parameters, either in standalone mode or in integrated mode. A full-scope replica operator training simulator for the Prototype Fast Breeder Reactor (PFBR), named KALBR-SIM (Kalpakkam Breeder Reactor Simulator), has been developed at IGCAR, Kalpakkam, India; the main participants are engineers/experts from the modeling team and the process design and instrumentation and control design teams. This paper discusses the verification and validation process in general, the evaluation procedure adopted for the PFBR operator training simulator, the methodology followed for verifying the models, and the reference documents and standards used. It details the importance of internal validation by design experts, subsequent validation by an external agency consisting of experts from various fields, model improvement by tuning based on the experts' comments, final qualification of the simulator for its intended purpose, and the difficulties faced while coordinating the various activities.Keywords: Verification and Validation (V&V), Prototype Fast Breeder Reactor (PFBR), Kalpakkam Breeder Reactor Simulator (KALBR-SIM), steady state, transient state
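A hedged, minimal sketch of the kind of steady-state validation check described in this abstract follows: simulated process parameters are compared against reference plant values and flagged when they exceed an acceptance tolerance. The parameter names, reference values, and the 2% tolerance are illustrative assumptions, not figures from the KALBR-SIM qualification documents.

```python
# Illustrative steady-state validation check: compare simulated parameters
# against reference plant values within an acceptance tolerance.
# All names, values, and the tolerance below are hypothetical placeholders.
reference = {"core_outlet_temp_C": 547.0, "primary_flow_kg_s": 6750.0, "steam_pressure_MPa": 16.7}
simulated = {"core_outlet_temp_C": 552.3, "primary_flow_kg_s": 6721.0, "steam_pressure_MPa": 16.9}
TOLERANCE = 0.02  # assumed 2% relative deviation allowed at steady state


def validate(sim: dict, ref: dict, tol: float) -> list:
    """Return (parameter, relative_error, passed) tuples for each reference parameter."""
    report = []
    for name, ref_val in ref.items():
        rel_err = abs(sim[name] - ref_val) / abs(ref_val)
        report.append((name, rel_err, rel_err <= tol))
    return report


for name, err, ok in validate(simulated, reference, TOLERANCE):
    print(f"{name}: deviation {err:.2%} -> {'PASS' if ok else 'FAIL'}")
```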
Procedia PDF Downloads 266661 Planning the Participation of Units Bound to Demand Response Programs with Regard to Ancillary Services in the PQ Power Market
Authors: Farnoosh Davarian
Abstract:
The present research focuses on planning the participation of units bound to demand response (DR) programs, considering ancillary services in the P-Q power market. It also provides a comprehensive exploration of the effects of demand reduction and redistribution, across several predefined scenarios (three pre-designed demand response programs ranging, for example, from 5% to 20%), on system voltage and losses in a smart distribution system; in the studied network, distributed energy resources (DERs) such as synchronous distributed generators and wind turbines offer their active and reactive power to the proposed market. GAMS, a specialized high-level modeling system, is used to solve linear, nonlinear, and integer programming problems. A notable feature of GAMS is that model formulation is separate from the solution method; by changing the solver, the same model can be solved with various methods (linear, nonlinear, integer, etc.). Finally, the combined active and reactive market problem in smart distribution systems, considering renewable distributed sources and demand response programs, is evaluated in GAMS. Active and reactive power trading by the distribution company is carried out in the wholesale market, where the demanded quantity is active power. Through the buy-back/payment program, responsive loads or aggregators can participate in the market. The objective function of the proposed market is to minimize the cost of active and reactive power from DERs and distribution companies, the penalty cost of CO2 emissions, and the cost of the buy-back/payment program. The effectiveness of the proposed method has been evaluated in a case study.Keywords: consumer behavior, demand response, pollution cost, combined active and reactive market
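To make the cost-minimization concrete, the sketch below poses a single-period, active-power-only toy version of the problem as a linear program in SciPy rather than GAMS: demand is met from DER output, grid purchases (carrying a CO2 penalty), and demand-response curtailment paid through a buy-back program. All prices, limits, and the emission factor are illustrative assumptions, not the paper's market model, and reactive power is omitted for brevity.

```python
# Toy single-period dispatch with demand response, posed as a linear program.
# Prices, limits, and the emission factor are illustrative assumptions only.
from scipy.optimize import linprog

demand_mw = 10.0
dr_cap = 0.20 * demand_mw                # assume at most 20% of demand can be curtailed
c_der, c_grid, c_dr = 45.0, 60.0, 80.0   # $/MWh for DER energy, grid energy, DR buy-back
co2_factor, co2_price = 0.5, 25.0        # tCO2/MWh for grid power, $/tCO2 penalty

# Decision variables x = [p_der, p_grid, p_dr]; minimize total cost per hour.
cost = [c_der, c_grid + co2_factor * co2_price, c_dr]
A_eq = [[1.0, 1.0, 1.0]]                 # supplied power + curtailed demand = demand
b_eq = [demand_mw]
bounds = [(0.0, 6.0), (0.0, None), (0.0, dr_cap)]  # DER capacity, grid unbounded, DR cap

res = linprog(c=cost, A_eq=A_eq, b_eq=b_eq, bounds=bounds, method="highs")
p_der, p_grid, p_dr = res.x
print(f"DER {p_der:.2f} MW, grid {p_grid:.2f} MW, DR {p_dr:.2f} MW, cost ${res.fun:.2f}/h")
```

In the toy solution the cheaper DER capacity is used first, demand response is dispatched only while it undercuts the emission-penalized grid price, which mirrors the trade-off the objective function describes.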
Procedia PDF Downloads 10660 Biotechnology Sector in the Context of National Innovation System: The Case of Norway
Authors: Parisa Afshin, Terje Grønning
Abstract:
Norway, like many other countries, has in recent years focused its policies on creating strong, highly innovative new sectors as the profitability of the oil and gas sector declines. The biotechnology sector in Norway has great potential, especially in marine biotech and cancer medicine. However, as a peripheral country, Norway faces special challenges on the path to creating an internationally recognized biotech sector and an international knowledge hub. The aim of this article is to analyze the progress of the Norwegian biotechnology industry and its pathway to building an innovation network and conducting collaborative innovation, based on its initial conditions and its own advantages and disadvantages. The findings have important implications not only for politicians and academics seeking to understand the infrastructure of the country's biotechnology sector, but also offer lessons for other peripheral countries or regions aiming to create a strong biotechnology sector and catch up with strong, internationally recognized regions. Data and methodology: To achieve the main goal of this study, information was collected from secondary sources such as web pages and annual reports published by officials and the mass media, along with interviews. The data were collected to shed light on a brief history and the current status of the Norwegian biotechnology sector and the geographic distribution of the biotech industry, followed by the role of academia-industry collaboration and public policies in Norwegian biotech. As knowledge is the key input to innovation, the knowledge perspective of the system, such as knowledge flows in the sector with regard to the national and regional innovation systems, has also been studied. Primary results: Internationalization has been an important element in developing the innovativeness of peripheral regions, enabling them to overcome their weaknesses while placing more weight on the importance of regional policies. Following these findings, suggestions on policy decisions and international collaboration, regarding national and regional systems of innovation, are offered as means of promoting a strong innovative sector.Keywords: biotechnology sector, knowledge-based industry, national innovation system, regional innovation system
Procedia PDF Downloads 226659 A Retrospective Analysis of the Impact of the Choosing Wisely Canada Campaign on Emergency Department Imaging Utilization for Head Injuries
Authors: Sameer Masood, Lucas Chartier
Abstract:
Head injuries are a commonly encountered presentation in emergency departments (ED), and the Choosing Wisely Canada (CWC) campaign was released in June 2015 in an attempt to decrease imaging utilization for patients with minor head injuries. The impact of the CWC campaign on imaging utilization for head injuries has not been explored in the ED setting. In our study, we describe the characteristics of patients with head injuries presenting to a tertiary care academic ED and the impact of the CWC campaign on CT head utilization. This retrospective cohort study used linked databases from the province of Ontario, Canada, to assess emergency department visits with a primary diagnosis of head injury made between June 1, 2014 and August 31, 2016 at the University Health Network in Toronto, Canada. We examined the number of visits during the study period, the proportion of patients who had a CT head performed before and after the release of the CWC campaign, as well as mode of arrival and disposition. There were 4,322 qualifying visits at our site during the study period. The median presenting age was 44.12 years (IQR 27.83-67.45), the median GCS was 15 (IQR 15-15), and the majority of patients presented with intermediate acuity (CTAS 3). Overall, 43.17% of patients arrived via ambulance, 49.24% of patients received a CT head, and 10.46% of patients were admitted. Compared to patients presenting before the CWC campaign release, there was no significant difference in the rate of CT head imaging after the release (50.41% vs 47.68%, P = 0.07). There were also no significant differences between the two groups in mode of arrival (ambulance vs ambulatory: 42.94% vs 43.48%, P = 0.72) or admission rates (9.85% vs 11.26%, P = 0.15). However, more patients belonged to the high-acuity groups (CTAS 1 or 2) in the post-CWC release group (12.98% vs 8.11%, P < 0.001). Visits for head injuries make up a significant proportion of total ED visits, and approximately half of these patients receive CT imaging in the ED. The CWC campaign did not seem to impact imaging utilization for head injuries in the 14 months following its launch. Further efforts, including local quality improvement initiatives, are likely needed to increase adherence to its recommendations and reduce imaging utilization for head injuries.Keywords: choosing wisely, emergency department, head injury, quality improvement
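The pre/post comparison of CT rates reported here is essentially a two-proportion test; the hedged sketch below shows how such a comparison can be run with statsmodels. Because the abstract reports only percentages and the 4,322-visit total, the per-period visit counts in the sketch are assumptions, so the resulting p-value will not exactly reproduce the reported P = 0.07.

```python
# Two-proportion z-test comparing CT utilization before and after the CWC release.
# The per-period visit counts are assumed; only the percentages come from the abstract.
from statsmodels.stats.proportion import proportions_ztest

n_pre, n_post = 2100, 2222          # assumed split of the 4,322 qualifying visits
ct_pre = round(0.5041 * n_pre)      # 50.41% CT rate before the CWC release
ct_post = round(0.4768 * n_post)    # 47.68% CT rate after the CWC release

stat, pval = proportions_ztest(count=[ct_pre, ct_post], nobs=[n_pre, n_post])
print(f"z = {stat:.2f}, p = {pval:.3f}")
```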
Procedia PDF Downloads 227658 Geomorphometric Analysis of the Hydrologic and Topographic Parameters of the Katsina-Ala Drainage Basin, Benue State, Nigeria
Authors: Oyatayo Kehinde Taofik, Ndabula Christopher
Abstract:
Drainage basins are a central theme in the green economy. The rising challenges of flooding, erosion or sediment transport, and sedimentation threaten the green economy. This has led to increasing emphasis on quantitative analysis of drainage basin parameters for better understanding, estimation and prediction of fluvial responses and, thus, of associated hazards or disasters. This can be achieved through direct measurement, characterization, parameterization, or modeling. This study applied a Remote Sensing and Geographic Information System approach to the parameterization and characterization of the morphometric variables of the Katsina-Ala basin using a 30 m resolution Shuttle Radar Topography Mission (SRTM) Digital Elevation Model (DEM). This was complemented with topographic and hydrological maps of Katsina-Ala at a scale of 1:50,000. Linear, areal and relief parameters were characterized. The results show that the Ala and Udene sub-watersheds are 4th and 5th order basins, respectively. The stream network shows a dendritic pattern, indicating homogeneity in texture and a lack of structural control in the study area. The Ala and Udene sub-watersheds have the following values for elongation ratio, circularity ratio, form factor and relief ratio: 0.48 / 0.39 / 0.35 / 9.97 and 0.40 / 0.35 / 0.32 / 6.0, respectively. They also have drainage texture and ruggedness index values of 0.86 / 0.011 and 1.57 / 0.016. The study concludes that the two sub-watersheds are elongated, suggesting that they are susceptible to erosion and thus to higher sediment loads in the river channels, which will predispose the watersheds to higher flood peaks. The study also concludes that the sub-watersheds have a very coarse texture, with good permeability of subsurface materials and infiltration capacity, which significantly recharges the groundwater. The study recommends that the Local and State Governments put efforts in place to reduce the size of paved surfaces in these sub-watersheds by implementing a robust agroforestry program at the grassroots level.Keywords: erosion, flood, mitigation, morphometry, watershed
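The ratios quoted in this abstract follow standard morphometric definitions (Schumm's elongation ratio, Miller's circularity ratio, Horton's form factor, relief ratio, drainage texture, and ruggedness number); the sketch below computes them from basic basin measurements. The input values are placeholders, not the measured Katsina-Ala figures, and the formulas are the commonly used textbook forms rather than the authors' exact workflow.

```python
# Standard drainage-basin morphometric ratios computed from basic measurements.
# Input values are illustrative placeholders, not the Katsina-Ala data.
import math


def morphometry(area_km2, perimeter_km, basin_length_km, relief_km,
                total_stream_length_km, total_streams):
    drainage_density = total_stream_length_km / area_km2          # km per km^2
    return {
        "elongation_ratio": (2.0 / basin_length_km) * math.sqrt(area_km2 / math.pi),
        "circularity_ratio": 4.0 * math.pi * area_km2 / perimeter_km ** 2,
        "form_factor": area_km2 / basin_length_km ** 2,
        "relief_ratio": relief_km / basin_length_km,
        "drainage_texture": total_streams / perimeter_km,
        "ruggedness_number": relief_km * drainage_density,
    }


# Illustrative basin (placeholder values only):
print(morphometry(area_km2=850.0, perimeter_km=175.0, basin_length_km=52.0,
                  relief_km=0.45, total_stream_length_km=1300.0, total_streams=140))
```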
Procedia PDF Downloads 91657 Census and Mapping of Oil Palms Over Satellite Dataset Using Deep Learning Model
Authors: Gholba Niranjan Dilip, Anil Kumar
Abstract:
Accurate and reliable mapping of oil palm plantations and a census of individual palm trees are a huge challenge. This study addresses this challenge by developing an optimized solution that implements deep learning techniques on remote sensing data. The oil palm is a very important tropical crop. To improve its productivity and land management, it is imperative to have an accurate census over large areas. Since manual census is costly and prone to approximations, a methodology for automated census using panchromatic images from the Cartosat-2, SkySat and WorldView-3 satellites is demonstrated. Two different study sites in Indonesia were selected. A customized set of training data and ground-truth data was created for this study from Cartosat-2 images. The pre-trained Single Shot MultiBox Detector (SSD) Lite MobileNet V2 Convolutional Neural Network (CNN) model from the TensorFlow Object Detection API was subjected to transfer learning on this customized dataset. The SSD model is able to generate bounding boxes for each oil palm and count the palms with good accuracy on the panchromatic images. The detection yielded an F-score of 83.16% on seven different images. The detections were buffered and dissolved to generate polygons demarcating the boundaries of the oil palm plantations. This provided the area under the plantations and also gave maps of their locations, thereby completing the automated census with fairly high accuracy (≈100%). The trained CNN was found competent enough to detect oil palm crowns in images obtained from multiple satellite sensors and of varying temporal vintage. It helped to estimate the increase in oil palm plantations from 2014 to 2021 in the study area. The study proved that high-resolution panchromatic satellite images can successfully be used to undertake a census of oil palm plantations using CNNs.Keywords: object detection, oil palm tree census, panchromatic images, single shot multibox detector
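A hedged sketch of the post-processing step described here, counting detected palms and then buffering and dissolving their bounding boxes into plantation polygons, is given below using Shapely. The boxes, coordinates, and buffer distance are illustrative assumptions, not outputs of the authors' SSD MobileNet V2 model.

```python
# Count detected palms, then buffer and dissolve their bounding boxes into
# plantation polygons to estimate planted area. Boxes and buffer are hypothetical.
from shapely.geometry import box
from shapely.ops import unary_union

# Detected palm crowns as (xmin, ymin, xmax, ymax) in metres (hypothetical values).
detections = [(0, 0, 8, 8), (10, 2, 18, 10), (40, 40, 48, 48), (52, 41, 60, 49)]
palm_count = len(detections)

# Buffer each box and dissolve overlapping buffers into plantation polygons.
buffered = [box(*b).buffer(6.0) for b in detections]   # assumed 6 m buffer per crown
plantations = unary_union(buffered)

n_polygons = len(plantations.geoms) if plantations.geom_type == "MultiPolygon" else 1
print(f"palms detected: {palm_count}, plantation polygons: {n_polygons}, "
      f"area: {plantations.area:.1f} m^2")
```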
Procedia PDF Downloads 161