Search results for: chain code normalization
Commenced in January 2007
Frequency: Monthly
Edition: International
Paper Count: 3398

2288 Recognizing an Individual, Their Topic of Conversation and Cultural Background from 3D Body Movement

Authors: Gheida J. Shahrour, Martin J. Russell

Abstract:

The 3D body movement signals captured during human-human conversation include clues not only to the content of people’s communication but also to their culture and personality. This paper is concerned with automatic extraction of this information from body movement signals. For the purpose of this research, we collected a novel corpus from 27 subjects, arranged into groups according to their culture; each group was arranged into pairs, and each pair communicated about different topics. A state-of-the-art recognition system is applied to the problems of person, culture, and topic recognition. We borrowed modeling, classification, and normalization techniques from speech recognition. We used Gaussian Mixture Modeling (GMM) as the main technique for building our three systems, obtaining accuracies of 77.78%, 55.47%, and 39.06% for the person, culture, and topic recognition systems respectively. In addition, we combined the above GMM systems with Support Vector Machines (SVM) to obtain 85.42%, 62.50%, and 40.63% accuracy for person, culture, and topic recognition respectively. Although direct comparison among these three recognition systems is difficult, it seems that our person recognition system performs best for both GMM and GMM-SVM, suggesting that inter-subject differences (i.e. subjects’ personality traits) are a major source of variation. When removing these traits from the culture and topic recognition systems using the Nuisance Attribute Projection (NAP) and Intersession Variability Compensation (ISVC) techniques, we obtained 73.44% and 46.09% accuracy from the culture and topic recognition systems respectively.
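
The following minimal sketch illustrates GMM-based classification of the kind described above, fitting one Gaussian mixture per class and assigning test vectors to the class with the highest log-likelihood. It uses scikit-learn; the feature vectors, class labels, and GMM settings are placeholders rather than the authors' corpus or configuration.

```python
# Minimal GMM-classification sketch; random features stand in for 3D body
# movement descriptors. Requires NumPy and scikit-learn.
import numpy as np
from sklearn.mixture import GaussianMixture

def train_gmms(features, labels, n_components=8):
    """Fit one GMM per class on that class's feature vectors."""
    models = {}
    for label in np.unique(labels):
        gmm = GaussianMixture(n_components=n_components,
                              covariance_type="diag", random_state=0)
        gmm.fit(features[labels == label])
        models[label] = gmm
    return models

def classify(models, features):
    """Assign each feature vector to the class whose GMM scores it highest."""
    labels = list(models.keys())
    scores = np.column_stack([models[l].score_samples(features) for l in labels])
    return np.array(labels)[scores.argmax(axis=1)]

# Toy usage: 30-D frame descriptors for three hypothetical subjects.
rng = np.random.default_rng(0)
X_train = rng.normal(size=(300, 30))
y_train = rng.integers(0, 3, size=300)
models = train_gmms(X_train, y_train)
print(classify(models, rng.normal(size=(5, 30))))
```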

Keywords: person recognition, topic recognition, culture recognition, 3D body movement signals, variability compensation

Procedia PDF Downloads 539
2287 Understanding Stock-Out of Pharmaceuticals in Timor-Leste: A Case Study in Identifying Factors Impacting on Pharmaceutical Quantification in Timor-Leste

Authors: Lourenco Camnahas, Eileen Willis, Greg Fisher, Jessie Gunson, Pascale Dettwiller, Charlene Thornton

Abstract:

Stock-out of pharmaceuticals is a common issue at all levels of health services in Timor-Leste, a small post-conflict country. This leads to the research questions: what are the current methods used to quantify pharmaceutical supplies, and what factors contribute to the ongoing pharmaceutical stock-out? The study examined factors that influence the pharmaceutical supply chain system. Methodology: the Privett and Goncalvez dependency model has been adopted for the design of the qualitative interviews. The model examines pharmaceutical supply chain management at three management levels: management of individual pharmaceutical items, health facilities, and health systems. The interviews were conducted in order to collect information on inventory management, the logistics management information system (LMIS) and the provision of pharmaceuticals. Andersen's behavioural model for healthcare utilization also informed the interview schedule, specifically factors linked to the environment (healthcare system and external environment) and the population (enabling factors). Forty health professionals (bureaucrats, clinicians) and six senior officers from a United Nations agency, a global multilateral agency and a local non-governmental organization were interviewed on their perceptions of factors (healthcare system/supply chain and wider environment) impacting on stock-out. Additionally, policy documents for the entire healthcare system, along with population data, were collected. Findings: an analysis using Pozzebon's critical interpretation identified a range of difficulties within the system, from poor coordination to failure to adhere to policy guidelines, along with major difficulties in inventory management, quantification, forecasting, and budgetary constraints. A weak logistics management information system and lack of capacity in inventory management, monitoring and supervision are additional organizational factors that also contributed to the issue. Various methods of quantification of pharmaceuticals were applied in the government sector and by non-governmental organizations. Lack of reliable data is one of the major problems in pharmaceutical provision. The Global Fund has the best quantification methods, fed by consumption data and malaria cases. Other issues worsen stock-out: political intervention, work ethic and basic infrastructure such as unreliable internet connectivity. Major issues impacting on pharmaceutical quantification have been identified. However, the current data collection identified limitations within the Andersen model; specifically, a failure to take account of predictors in the healthcare system and the environment (culture/politics/social). The next steps are to (a) compare the models used by three non-governmental agencies with the government model; (b) run the Andersen explanatory model for pharmaceutical expenditure for 2 to 5 drug items used by these three development partners in order to see how it correlates with the present model in terms of quantification and forecasting of needs; (c) repeat objectives (a) and (b) using the government model; and (d) draw conclusions about the relative strengths of the models.

Keywords: inventory management, pharmaceutical forecasting and quantification, pharmaceutical stock-out, pharmaceutical supply chain management

Procedia PDF Downloads 242
2286 When Digital Innovation Augments Cultural Heritage: An Innovation from Tradition Story

Authors: Danilo Pesce, Emilio Paolucci, Mariolina Affatato

Abstract:

Looking at the future and at the post-digital era, innovations commonly tend to dismiss the old and replace it with the new. The aim of this research is to study the role that digital innovation can play alongside the information chain within traditional sectors and the subsequent value creation opportunities that actors and stakeholders can exploit. By drawing on a wide body of literature on innovation and strategic management and by conducting a case study on the cultural heritage industry, namely Google Arts & Culture, this study shows that technology augments, complements, and amplifies the way people experience their cultural interests. Furthermore, the study shows a process of democratization of art, since museums can exploit new digital and virtual ways to distribute art globally. Moreover, new needs arose from the 2020 pandemic, which hit and forced the world into a state of cultural fasting and caused a radical transformation of the online vs. onsite paradigm. Finally, the study highlights the capabilities that are emerging at different stages of the value chain, owing to the technological innovation available in the market. In essence, this research underlines the role of Google in allowing museums to reach users worldwide, thus unlocking new mechanisms of value creation in the cultural heritage industry. Likewise, this study points out how Google provides value to users by increasing the provision of artworks, improving audience engagement and the virtual experience, and providing new ways to access online content. The paper ends with a discussion of managerial and policy-making implications.

Keywords: big data, digital platforms, digital transformation, digitization, Google Arts and Culture, stakeholders’ interests

Procedia PDF Downloads 156
2285 Subcontractor Development Practices and Processes: A Conceptual Model for LEED Projects

Authors: Andrea N. Ofori-Boadu

Abstract:

The purpose of this study is to develop a conceptual model of subcontractor development practices and processes that strengthen the integration of subcontractors into construction supply chain systems for improved subcontractor performance on Leadership in Energy and Environmental Design (LEED) certified building projects. The construction management of a LEED project has the important objective of meeting sustainability certification requirements, in addition to the typical project management objectives of cost, time, quality, and safety for traditional projects; this additional objective increases the complexity of LEED projects. Considering that construction management organizations rely heavily on subcontractors, poor performance on complex projects such as LEED projects has been largely attributed to the unsatisfactory preparation of subcontractors. Furthermore, the extensive use of unique and non-repetitive short-term contracts limits the full integration of subcontractors into construction supply chains and hinders long-term cooperation and benefits that could enhance performance on construction projects. Improved subcontractor development practices are needed to better prepare and manage subcontractors, so that complex objectives can be met or exceeded. While supplier development and supply chain theories and practices for the manufacturing sector have been extensively investigated to address similar challenges, comparable investigations in the construction sector remain limited. Consequently, the objective of this research is to investigate effective subcontractor development practices and processes to guide construction management organizations in their development of a strong network of high-performing subcontractors. Drawing from foundational supply chain and supplier development theories in the manufacturing sector, a mixed interpretivist and empirical methodology is utilized to assess the body of knowledge within the literature for conceptual model development. A self-reporting survey with five-point Likert scale items and open-ended questions is administered to 30 construction professionals to estimate their perceptions of the effectiveness of 37 practices, classified into five subcontractor development categories. Data analysis includes descriptive statistics, weighted means, and t-tests that guide the effectiveness ranking of practices and categories; a small illustration of this analysis is sketched below. The results inform the proposed three-phased LEED subcontractor development program model, which focuses on preparation, development and implementation, and monitoring. Highly ranked LEED subcontractor pre-qualification, commitment, incentives, evaluation, and feedback practices are perceived as more effective when compared to practices requiring more direct involvement and linkages between subcontractors and construction management organizations. This is attributed to unfamiliarity, conflicting interests, lack of trust, and resource-sharing challenges. With strategic modifications, the recommended practices can be extended to other non-LEED complex projects. Additional research is needed to guide the development of subcontractor development programs that strengthen direct involvement between construction management organizations and their network of high-performing subcontractors. Insights from this research strengthen theoretical foundations to support future research towards more integrated construction supply chains. In the long term, this would lead to increased performance, profits and client satisfaction.
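
As a small illustration of the Likert-scale analysis described above (weighted means plus t-tests used to rank practice effectiveness), the sketch below uses invented ratings and a one-sample t-test against the scale midpoint of 3; it is not the study's survey data or exact test setup.

```python
# Hypothetical five-point Likert ratings from 30 respondents for three practices.
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
ratings = {
    "pre-qualification": rng.integers(3, 6, size=30),
    "incentives":        rng.integers(2, 6, size=30),
    "resource sharing":  rng.integers(1, 5, size=30),
}

for practice, scores in ratings.items():
    mean = scores.mean()                          # weighted mean of the Likert scores
    t, p = stats.ttest_1samp(scores, popmean=3)   # effectiveness above the midpoint?
    print(f"{practice:18s} mean={mean:.2f} t={t:.2f} p={p:.3f}")
```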

Keywords: construction management, general contractor, supply chain, sustainable construction

Procedia PDF Downloads 109
2284 Fatigue Truck Modification Factor for Design Truck (CL-625)

Authors: Mohamad Najari, Gilbert Grondin, Marwan El-Rich

Abstract:

Design trucks in standard codes are selected based on the amount of damage they cause on structures (specifically bridges) and roads, to represent the real traffic loads. A limited number of trucks are run on a bridge one at a time and the damage on the bridge is recorded for each truck. One design truck is also run on the same bridge “n” times (where “n” is the number of trucks used previously) to calculate the damage of the design truck on the same bridge. To make these damages equal, a reduction factor is needed for that specific design truck in the codes. As the limited number of trucks cannot exactly represent the real traffic over the life of the structure, these reduction factors are not accurately calculated and should be modified accordingly. Starting in July 2004, vehicle load data were collected at six weigh-in-motion (WIM) sites owned by Alberta Transportation for eight consecutive years. This database includes more than 200 million trucks. Having these data gives the opportunity to compare the effect of any standard fatigue truck's weight and the real traffic load on the fatigue life of bridges, which leads to a modification of the fatigue truck factor in the code. To calculate the damage for each truck, the truck is run on the bridge, the moment history of the detail under study is recorded, stress range cycles are counted, and then damage is calculated using available S-N curves. A 2000-line FORTRAN code has been developed to perform the analysis and calculate the damage of the trucks in the database for all eight fatigue categories according to the Canadian Institute of Steel Construction standard (CSA S16). Stress cycles are counted using the rainflow counting method. The modification factors for the design truck (CL-625) are calculated for two different bridge configurations and ten span lengths varying from 1 m to 200 m. The two considered bridge configurations are a single-span bridge and a four-span bridge. This was found to be sufficient and representative for a simply supported span, positive moment in end spans of bridges with two or more spans, positive moment in interior spans of three or more spans, and negative moment at an interior support of multi-span bridges. The moment history of the mid span is recorded for the single-span bridge, and the exterior positive moment, interior positive moment, and support negative moment are recorded for the four-span bridge. The influence lines are expressed by a polynomial expression obtained from a regression analysis of the influence lines obtained from SAP2000. It is found that for the design truck (CL-625) the fatigue truck factor varies from 0.35 to 0.55 depending on span length and bridge configuration. Detailed results will be presented in upcoming papers. This code can be used for any design truck available in standard codes.
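
A simplified sketch of the damage calculation described above is given below: stress-range cycles are extracted from a stress (or moment) history with a three-point rainflow count, and damage is summed with the Palmgren-Miner rule using a generic S-N curve N = A / S^m. The constants A and m are placeholders, not the CSA S16 fatigue-category values used in the paper's FORTRAN code.

```python
def reversals(series):
    """Reduce a load history to its turning points (peaks and valleys)."""
    pts = [series[0]]
    for x in series[1:]:
        if len(pts) >= 2 and (pts[-1] - pts[-2]) * (x - pts[-1]) > 0:
            pts[-1] = x                     # still rising or still falling
        else:
            pts.append(x)
    return pts

def rainflow_ranges(series):
    """Simplified three-point rainflow count; returns (range, cycles) pairs,
    with the leftover residue counted as half cycles."""
    stack, counted = [], []
    for point in reversals(series):
        stack.append(point)
        while len(stack) >= 3:
            x = abs(stack[-1] - stack[-2])
            y = abs(stack[-2] - stack[-3])
            if x < y:
                break
            counted.append((y, 1.0))        # a full cycle of range y
            del stack[-3:-1]                # drop the two points forming it
    counted += [(abs(a - b), 0.5) for a, b in zip(stack, stack[1:])]
    return counted

def miner_damage(range_count_pairs, A=1.0e12, m=3.0):
    """Palmgren-Miner damage sum with N = A / S**m (A, m are illustrative)."""
    return sum(n * s ** m / A for s, n in range_count_pairs if s > 0)

stress_history = [0, 80, 20, 60, 10, 90, 0]   # MPa, one toy truck passage
cycles = rainflow_ranges(stress_history)
print(cycles)
print(f"damage per passage: {miner_damage(cycles):.2e}")
```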

Keywords: bridge, fatigue, fatigue design truck, rain flow analysis, FORTRAN

Procedia PDF Downloads 520
2283 Production and Distribution Network Planning Optimization: A Case Study of Large Cement Company

Authors: Lokendra Kumar Devangan, Ajay Mishra

Abstract:

This paper describes the implementation of a large-scale SAS/OR model with significant pre-processing, scenario analysis, and post-processing work done using SAS. A large cement manufacturer, with ten geographically distributed manufacturing plants for two variants of cement, around 400 warehouses serving as transshipment points, and several thousand distributor locations generating demand, needed to optimize this multi-echelon, multi-modal transport supply chain separately for planning and allocation purposes. For monthly planning as well as daily allocation, the demand is deterministic. Rail and road networks connect any two points in this supply chain, creating tens of thousands of such connections. Constraints include the plants' production capacities, transportation capacities, and rail wagon batch sizes. Each demand point has a minimum and maximum for shipments received. Price varies at demand locations due to local factors. A large mixed integer programming model built using proc OPTMODEL decides production at plants, demand fulfilled at each location, and the shipment route to demand locations to maximize the profit contribution. Using base SAS, we did significant pre-processing of data and created inputs for the optimization. Using outputs generated by OPTMODEL and other processing completed using base SAS, we generated several reports that went into the enterprise system and created tables for easy consumption of the optimization results by operations.
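
The authors' model is built in SAS proc OPTMODEL; the toy model below only illustrates the structure they describe (wagon-batch integrality, plant capacities, and per-point demand bounds, maximizing profit contribution) using the open-source PuLP package and invented plant, demand, price, and cost figures.

```python
# Illustrative mixed-integer production/distribution sketch, not the SAS model.
import pulp

plants = {"P1": 500, "P2": 400}                      # production capacity (t)
demand = {"D1": (100, 250), "D2": (150, 300)}        # (min, max) received per point
price = {"D1": 120.0, "D2": 135.0}                   # selling price at demand point
cost = {("P1", "D1"): 20, ("P1", "D2"): 35,          # transport cost per tonne
        ("P2", "D1"): 30, ("P2", "D2"): 18}
wagon = 25                                            # rail wagon batch size (t)

m = pulp.LpProblem("cement_network", pulp.LpMaximize)
batches = pulp.LpVariable.dicts("batches", cost, lowBound=0, cat="Integer")
ship = {k: wagon * batches[k] for k in cost}          # shipments in wagon multiples

# Profit contribution = revenue at demand points - transport cost.
m += pulp.lpSum(price[d] * ship[p, d] - cost[p, d] * ship[p, d] for p, d in cost)

for p, cap in plants.items():                         # plant production capacity
    m += pulp.lpSum(ship[p, d] for d in demand) <= cap
for d, (lo, hi) in demand.items():                    # min/max received at each point
    total = pulp.lpSum(ship[p, d] for p in plants)
    m += total >= lo
    m += total <= hi

m.solve(pulp.PULP_CBC_CMD(msg=False))
for (p, d), b in batches.items():
    print(p, "->", d, ":", wagon * int(b.value()), "t")
```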

Keywords: production planning, mixed integer optimization, network model, network optimization

Procedia PDF Downloads 66
2282 A Network Optimization Study of Logistics for Enhancing Emergency Preparedness in Asia-Pacific

Authors: Giuseppe Timperio, Robert De Souza

Abstract:

The combination of factors such as erratic climate change, rampant urbanization of risk-exposed areas, and political and social instabilities is driving an alarming growth in the number and magnitude of humanitarian crises worldwide. Given the unique features of the humanitarian supply chain, such as unpredictability of demand in space, time, and geography, a spike in the number of requests for relief items in the first days after the calamity, the uncertain state of logistics infrastructures, and large volumes of unsolicited low-priority items, a proactive approach towards the design of disaster response operations is needed to achieve high agility in the mobilization of emergency supplies in the immediate aftermath of the event. This paper is an attempt in that direction, and it provides decision makers with crucial strategic insights for a more effective network design for disaster response. Decision sciences and ICT are integrated to analyse the robustness and resilience of a prepositioned network of emergency strategic stockpiles for a real-life case in Indonesia, one of the most vulnerable countries in Asia-Pacific, with the model being built upon a rich set of quantitative data. To this aim, a network optimization approach was implemented, with several what-if scenarios being accurately developed and tested. The findings of this study support decision makers facing challenges related to disaster relief chain resilience, particularly regarding the optimal configuration of supply chain facilities and optimal flows across the nodes, while considering the network structure from an end-to-end in-country distribution perspective.

Keywords: disaster preparedness, humanitarian logistics, network optimization, resilience

Procedia PDF Downloads 172
2281 QR Technology to Automate Health Condition Detection in Payment System: A Case Study in the Kingdom of Saudi Arabia’s Schools

Authors: Amjad Alsulami, Farah Albishri, Kholod Alzubidi, Lama Almehemadi, Salma Elhag

Abstract:

Food allergy is a common and rising problem among children. Many students have their first allergic reaction at school; one such reaction is anaphylaxis, which can be fatal. This study discovered that several schools' processes lacked safety regulations and information on how to handle allergy issues and chronic diseases like diabetes, and that students were not supervised or monitored during the cafeteria purchasing process. There is no obvious prevention or effort in academic institutions to stop students from purchasing food containing allergens or food that negatively impacts the health of students who suffer from chronic diseases. Students' health must remain stable for their educational development to progress positively. To address this issue, this paper uses business process reengineering to propose the automation of the whole food-purchasing process, which will aid in detecting and avoiding allergic occurrences and preventing any side effects from eating foods that conflict with students' health. This may be achieved by designing a smart card with an embedded QR code that reveals which foods cause an allergic reaction in a student. A survey was distributed to determine and examine how cafeterias handle allergic children and whether any management or policy is applied in the school. The survey findings indicate that the integration of QR technology into the food purchasing process would improve health condition detection. Surveyed families agreed that the suggested system would be beneficial to all parties, as it would ensure that their children do not eat foods that are bad for their health. Moreover, analysis and simulation of the as-is process and the suggested process demonstrate an improvement in quality and time.
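
A minimal sketch of the proposed smart-card flow is given below: a QR code on the card encodes a student ID, and the cafeteria point-of-sale checks purchased items against that student's recorded allergens. The student records, menu data, and ID format are invented for illustration; QR generation uses the third-party "qrcode" package (with Pillow), and the scan step is simulated by passing the decoded ID directly.

```python
import qrcode

student_records = {"S-1042": {"name": "Student A", "allergens": {"peanut", "egg"}}}
menu_allergens = {"peanut cookie": {"peanut", "wheat"}, "fruit cup": set()}

# Issue the card: encode only the student ID in the QR code image.
qrcode.make("S-1042").save("card_S-1042.png")

def check_purchase(scanned_id, items):
    """Return the items that conflict with the student's recorded allergens."""
    allergens = student_records[scanned_id]["allergens"]
    return [item for item in items if menu_allergens[item] & allergens]

blocked = check_purchase("S-1042", ["peanut cookie", "fruit cup"])
print("blocked items:", blocked)   # -> ['peanut cookie']
```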

Keywords: QR code, smart card, food allergies, business process reengineering, health condition detection

Procedia PDF Downloads 74
2280 Appearance of Ciguatoxin Fish in Atlantic Europe Waters

Authors: J. Bravo, F. Cabrera Suárez, B. Vega, L. Román, M. Martel, F. Acosta

Abstract:

Ciguatera fish poisoning (CFP) is the most common non-bacterial intoxication in the world, caused by ingestion of fish with bio-accumulated ciguatoxins (CTXs). It is typical of tropical and subtropical areas, mainly affecting the Caribbean Sea, Polynesia and other areas in the Pacific and Indian Oceans. Interest in CFP in Europe has been increasing in recent years as more and more cases appear in European hospitals, usually in people who have consumed imported fish containing ciguatoxin or have travelled to areas at risk for this poisoning. Since 2004 a series of poisonings has raised the question of a possible occurrence of ciguatoxin in Europe, especially in the area of Macaronesia in the temperate East Atlantic. Furthermore, some studies have identified the presence of Gambierdiscus spp., a toxic dinoflagellate related to this poisoning, in waters surrounding the Canary Islands and Madeira. The toxin accumulates and concentrates through the food chain and affects the end of the chain, the human consumer. Fish were collected from Canary Islands waters, and the toxin was extracted and purified using acetone and liquid/liquid partition in order to eliminate the excess of fatty acids that may interfere with detection of the toxin. The fish extracts were inoculated into neuroblastoma (neuro-2a) cells. After 24 h, cell viability was used as the endpoint for measurement of cytotoxic effects. Since 2011 our laboratory has been collecting data for species such as Seriola spp., Epinephelus spp., Makaira spp., Pomatomus spp., Xiphias spp., and Acanthocybium spp., from all islands and including sport fishing and professional activities; we found that 8% of the fish had ciguatoxin in their muscle. With these results, we conclude that the island where the fish is caught and the fish size affect the probability of catching a fish with ciguatoxin.

Keywords: Canary Islands, ciguatera fish poisoning, ciguatoxin, Europe

Procedia PDF Downloads 343
2279 Design and Development of Tandem Dynamometer for Testing and Validation of Motor Performance Parameters

Authors: Vedansh More, Lalatendu Bal, Ronak Panchal, Atharva Kulkarni

Abstract:

The project aims at developing a cost-effective test bench capable of testing and validating the complete powertrain package of an electric vehicle. An Emrax 228 high-voltage synchronous motor was selected as the prime mover for the study. A tandem-type dynamometer was developed, comprising two loading methods: inertial, using standard inertia rollers, and absorptive, using a separately excited DC generator with resistive coils. Absorptive loading of the prime mover was achieved by implementing a converter circuit through which the duty cycle of the input field voltage was controlled. This control was effective in changing the magnetic flux and hence the generated voltage, which was ultimately dropped across resistive coils assembled in a load bank in an all-parallel configuration. The prime mover and loading elements were connected via a chain drive with a 2:1 reduction ratio, which allows flexibility in the placement of components and a relaxed rating of the DC generator. The development will not only aid in the determination of essential characteristics like torque-RPM, power-RPM, torque factor, RPM factor, heat loads of devices and battery pack state-of-charge efficiency, but also provides a significant financial advantage over existing versions of dynamometers with its cost-effective solution.
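
As a back-of-the-envelope illustration of the absorptive-load calculation implied above, the sketch below estimates the power dissipated in the all-parallel resistive load bank and reflects it back to the prime mover as a torque demand. The voltages, resistances, coil count, and efficiencies are illustrative assumptions, not measured values from the test bench.

```python
import math

def load_bank_power(voltage, coil_resistance, n_coils):
    """All-parallel load bank: equivalent resistance R/n, so P = n * V^2 / R."""
    return n_coils * voltage ** 2 / coil_resistance

def motor_torque_estimate(p_elec, motor_rpm, gen_eff=0.85, chain_eff=0.97):
    """Prime-mover torque needed to supply p_elec watts of electrical load.
    The 2:1 chain reduction changes shaft speed and torque at the generator,
    but not the transmitted power, so it drops out of this estimate."""
    p_mech = p_elec / (gen_eff * chain_eff)      # mechanical power at the motor shaft
    omega = motor_rpm * 2.0 * math.pi / 60.0     # motor shaft speed in rad/s
    return p_mech / omega                        # torque in N*m

p = load_bank_power(voltage=96.0, coil_resistance=4.0, n_coils=6)
print(f"load bank power: {p / 1000:.1f} kW")
print(f"estimated motor torque at 3000 rpm: {motor_torque_estimate(p, 3000):.1f} N*m")
```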

Keywords: absorptive load, chain drive, chordal action, DC generator, dynamometer, electric vehicle, inertia rollers, load bank, powertrain, pulse width modulation, reduction ratio, road load, testbench

Procedia PDF Downloads 228
2278 Predicting Polyethylene Processing Properties Based on Reaction Conditions via a Coupled Kinetic, Stochastic and Rheological Modelling Approach

Authors: Kristina Pflug, Markus Busch

Abstract:

Being able to predict polymer properties and processing behavior based on the applied operating reaction conditions is one of the key challenges in modern polymer reaction engineering. Especially for cost-intensive processes such as the high-pressure polymerization of low-density polyethylene (LDPE) with high safety requirements, the need for simulation-based process optimization and product design is high. A multi-scale modelling approach was set up and validated via a series of high-pressure mini-plant autoclave reactor experiments. The approach starts with the numerical modelling of the complex reaction network of the LDPE polymerization, taking into consideration the actual reaction conditions. While this gives average product properties, the complex polymeric microstructure, including random short- and long-chain branching, is calculated via a hybrid Monte Carlo approach. Finally, the processing behavior of LDPE (its melt flow behavior) is determined as a function of the previously determined polymeric microstructure using the branch-on-branch algorithm for randomly branched polymer systems. All three steps of the multi-scale modelling approach can be independently validated against analytical data. A triple-detector GPC containing an IR, a viscosimetry and a multi-angle light scattering detector is applied. It serves to determine molecular weight distributions as well as chain-length-dependent short- and long-chain branching frequencies. 13C-NMR measurements give average branching frequencies, and rheological measurements in shear and extension serve to characterize the polymeric flow behavior. The agreement between experimental and modelled results was found to be excellent, especially taking into consideration that the applied multi-scale modelling approach does not involve parameter fitting to the data. This validates the suggested approach and proves its universality at the same time. In the next step, the modelling approach can be applied to other reactor types, such as tubular reactors, or to industrial scale. Moreover, sensitivity analysis for systematically varying process conditions is easily feasible. The developed multi-scale modelling approach finally gives the opportunity to predict and design LDPE processing behavior simply based on process conditions such as feed streams and inlet temperatures and pressures.
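
The toy Monte Carlo sketch below illustrates the general idea of building a branched-polymer ensemble from average kinetic outputs: chain lengths are drawn from a most-probable (Flory-type) distribution, approximated here by an exponential draw, and long- and short-chain branches are placed with given frequencies per 1000 monomer units. It only illustrates the hybrid kinetic/Monte Carlo concept, not the authors' algorithm or rate data.

```python
import random

def sample_chain(number_avg_dp, lcb_per_1000, scb_per_1000, rng):
    """Return (chain length, LCB positions, SCB count) for one simulated chain."""
    length = int(rng.expovariate(1.0 / number_avg_dp)) + 1   # approximate Flory draw
    n_lcb = sum(rng.random() < lcb_per_1000 / 1000.0 for _ in range(length))
    n_scb = sum(rng.random() < scb_per_1000 / 1000.0 for _ in range(length))
    lcb_positions = sorted(rng.randrange(length) for _ in range(n_lcb))
    return length, lcb_positions, n_scb

rng = random.Random(42)
ensemble = [sample_chain(1500, lcb_per_1000=2.0, scb_per_1000=20.0, rng=rng)
            for _ in range(10000)]
mn = sum(c[0] for c in ensemble) / len(ensemble)
mw = sum(c[0] ** 2 for c in ensemble) / sum(c[0] for c in ensemble)
print(f"Mn ~ {mn:.0f}, Mw/Mn ~ {mw / mn:.2f}")   # dispersity ~2 for this distribution
```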

Keywords: low-density polyethylene, multi-scale modelling, polymer properties, reaction engineering, rheology

Procedia PDF Downloads 123
2277 Mercury Contamination of Wetland Caused by Wastewater from Chlor-Alkali Industry

Authors: Mitsuo Yoshida

Abstract:

A significant mercury contamination of soil/sediment was revealed by an environmental monitoring program in a wetland along the La Plata River, west of Montevideo City, Uruguay. The mercury contamination was caused by industrial wastewater discharged from a chlor-alkali plant using a mercury-cell process. The contamination level is above 60 mg/kg in soil/sediment. Most of the mercury (Hg) in the environment is inorganic, but some fraction is converted by bacteria to methylmercury (MeHg), a toxic organic compound. MeHg accumulates biologically through the food chain and becomes a serious public health risk. In order to delineate the contaminated area for countermeasure operations, an intervention value for mercury contamination of sediment/soil was defined by the authority as 15 mg/kg (total Hg). According to this intervention value, the mercury-contaminated area in the La Plata site is approximately 48,280 m² and the estimated total volume of contaminated sediments/soils is around 18,750 m³. Countermeasures for the contaminated zone were proposed in two stages: (i) mitigation of risks to public health and (ii) site remediation. The first stage is the installation of fences and nets around the contamination zone to mitigate the risks of exposure, inhalation, and intake. The food chain of the wetland-river ecosystem was also interrupted by the installation of nets and fences. The state of mercury contamination at the La Plata site and the countermeasure plan were disclosed to local people and the public, and consensus on setting an off-limits area was successfully achieved. Mass media also contributed to sharing information on the contaminated site. The cost of the countermeasures was borne by the industry under the polluter-pays principle.

Keywords: chlor-alkali plant, mercury contamination, polluter pay principle, Uruguay, wetland

Procedia PDF Downloads 137
2276 Finite Element Modeling and Analysis of Reinforced Concrete Coupled Shear Walls Strengthened with Externally Bonded Carbon Fiber Reinforced Polymer Composites

Authors: Sara Honarparast, Omar Chaallal

Abstract:

Reinforced concrete (RC) coupled shear walls (CSWs) are very effective structural systems in resisting lateral loads due to wind and earthquakes and are particularly used in medium- to high-rise RC buildings. However, most existing old RC structures were designed for gravity loads or for lateral loads well below those specified in current modern international seismic codes. These structures may behave in a non-ductile manner due to poorly designed joints, insufficient shear reinforcement and inadequate anchorage length of the reinforcing bars. This has been the main impetus to investigate an appropriate strengthening method to address or attenuate the deficiencies of these structures. The objective of this paper is twofold: (i) to evaluate the seismic performance of existing reinforced concrete coupled shear walls under reversed cyclic loading; and (ii) to investigate the seismic performance of RC CSWs strengthened with externally bonded (EB) carbon fiber reinforced polymer (CFRP) sheets. To this end, two CSWs were considered as follows: (a) the first, representative of old CSWs, was designed according to the 1941 National Building Code of Canada (NBCC, 1941) with conventionally reinforced coupling beams; and (b) the second, representative of new CSWs, was designed according to modern NBCC 2015 and CSA/A23.3 2014 requirements with a diagonally reinforced coupling beam. Both CSWs were simulated using ANSYS software. The nonlinear behavior of concrete is modeled using multilinear isotropic hardening through a multilinear stress-strain curve. An elastic-perfectly plastic stress-strain curve is used to simulate the steel material. Bond stress-slip between concrete and steel reinforcement is modeled in the conventional coupling beam, rather than assuming a perfect bond, to better represent the slip of the steel bars observed in the coupling beams of these CSWs. The old-design CSW was strengthened using CFRP sheets bonded to the concrete substrate, and the interface was modeled using an adhesive layer. The behavior of the CFRP material is considered linear elastic up to failure. After applying the loading and boundary conditions, the specimens are analyzed under reversed cyclic loading. The comparison of the results obtained for the two unstrengthened CSWs and the one retrofitted with EB CFRP sheets reveals that the strengthening method improves the seismic performance in terms of strength, ductility, and energy dissipation capacity.

Keywords: carbon fiber reinforced polymer, coupled shear wall, coupling beam, finite element analysis, modern code, old code, strengthening

Procedia PDF Downloads 197
2275 Block-Chain Land Administration Technology in Nigeria: Opportunities and Challenges

Authors: Babalola Sunday Oyetayo, Igbinomwanhia Uyi Osamwonyi, Idowu T. O., Herbert Tata

Abstract:

This paper explores the potential benefits of adopting blockchain technology in Nigeria's land administration systems while also addressing the challenges and implications of its implementation in the country's unique context. Through a comprehensive literature review and analysis of existing research, the paper delves into the key attributes of blockchain that can revolutionize land administration practices, with a particular focus on simplifying land registration procedures, expediting land title issuance, and enhancing data transparency and security. The decentralized and immutable nature of blockchain offers unique advantages, instilling trust and confidence in land transactions, which are especially crucial in Nigeria's land governance landscape. However, integrating blockchain in Nigeria's land administration ecosystem presents specific challenges, necessitating a critical evaluation of technical, socio-economic, and infrastructural barriers. These challenges encompass data privacy concerns, scalability, interoperability with outdated systems, and gaining acceptance from various stakeholders. By synthesizing these insights, the paper proposes strategies tailored to Nigeria's context to optimize the benefits of blockchain adoption while addressing the identified challenges. The research findings contribute significantly to the ongoing discourse on blockchain technology in Nigeria's land governance, offering evidence-based recommendations to policymakers, land administrators, and stakeholders. Ultimately, the paper aims to promote the effective utilization of blockchain, fostering efficiency, transparency, and trust in Nigeria's land administration systems to drive sustainable development and societal progress.
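
To illustrate the immutability property discussed above, the minimal hash-chained ledger sketch below shows why tampering with a recorded land transfer is detectable: each record embeds the hash of the previous one, so altering any entry breaks verification. This is a teaching illustration only, not a production land-registry or blockchain design.

```python
import hashlib, json, time

def block_hash(block):
    body = {k: block[k] for k in ("prev_hash", "timestamp", "record")}
    return hashlib.sha256(json.dumps(body, sort_keys=True).encode()).hexdigest()

def make_block(prev_hash, record):
    block = {"prev_hash": prev_hash, "timestamp": time.time(), "record": record}
    block["hash"] = block_hash(block)
    return block

def verify(chain):
    for i, block in enumerate(chain):
        if block["hash"] != block_hash(block):
            return False                      # entry was altered after the fact
        if i > 0 and block["prev_hash"] != chain[i - 1]["hash"]:
            return False                      # link to the previous entry is broken
    return True

genesis = make_block("0" * 64, {"parcel": "KD-001", "owner": "A"})
chain = [genesis, make_block(genesis["hash"], {"parcel": "KD-001", "owner": "B"})]
print(verify(chain))                 # True
chain[0]["record"]["owner"] = "X"    # tamper with the original title record
print(verify(chain))                 # False
```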

Keywords: block-chain, technology, stakeholders, land registration

Procedia PDF Downloads 71
2274 Transient Simulation Using SPACE for ATLAS Facility to Investigate the Effect of Heat Loss on Major Parameters

Authors: Suhib A. Abu-Seini, Kyung-Doo Kim

Abstract:

A heat loss model for the ATLAS facility was introduced using SPACE code predefined correlations and various dialing factors. All previous simulations were carried out using a heat-loss-free input: the facility was considered to be completely insulated, and the core power was reduced by the experimentally measured heat loss values to compensate for the loss of heat. This study, in contrast, considers heat loss throughout the simulation. The new heat loss model will affect the SPACE code simulation, as heat leaking out of the system throughout a transient will alter many parameters related to temperature and temperature difference. For that purpose, a station blackout followed by a multiple steam generator tube rupture accident will be simulated using both the insulated-system approach and the newly introduced steady-state heat loss input. Major parameters such as system temperatures, pressure values, and flow rates will be compared, and various analyses will be suggested, as the experimental values will not be the reference used to validate the expected outcome. This study will not only show the significance of considering heat loss in the prevention and mitigation of various incidents, design-basis and beyond-design-basis accidents, by giving a detailed account of the behavior of the ATLAS facility during both steady state and a major transient, but will also present a verification of how credible the acquired ATLAS data are, since steady-state heat loss values already showed a mismatch between the SPACE simulation results and the ATLAS data acquisition system. Acknowledgement: This work was supported by the Korea Institute of Energy Technology Evaluation and Planning (KETEP) and the Ministry of Trade, Industry & Energy (MOTIE) of the Republic of Korea.

Keywords: ATLAS, heat loss, simulation, SPACE, station blackout, steam generator tube rupture, verification

Procedia PDF Downloads 222
2273 Antimicrobial Activity of 2-Nitro-1-Propanol and Lauric Acid against Gram-Positive Bacteria

Authors: Robin Anderson, Elizabeth Latham, David Nisbet

Abstract:

Propagation and dissemination of antimicrobial-resistant and pathogenic microbes from spoiled silages and composts represent a serious public health threat to humans and animals. In the present study, the antimicrobial activity of the short-chain nitro-compound 2-nitro-1-propanol (9 mM), as well as the medium-chain fatty acid lauric acid and its glycerol monoester monolaurin (at 25 and 17 µmol/mL, respectively), was investigated against selected pathogenic and multi-drug-resistant Gram-positive bacteria common to spoiled silages and composts. In an initial study, we found that the growth rates of a multi-resistant Enterococcus faecalis (expressing resistance against erythromycin, quinupristin/dalfopristin and tetracycline) and Staphylococcus aureus strain 12600 (expressing resistance against erythromycin, linezolid, penicillin, quinupristin/dalfopristin and vancomycin) were slowed by more than 78% (P < 0.05) by 2-nitro-1-propanol treatment during culture (n = 3/treatment) in anaerobically prepared half-strength Brain Heart Infusion broth at 37 °C when compared to untreated controls (0.332 ± 0.04 and 0.108 ± 0.03 h-1, respectively). The growth rate of 2-nitro-1-propanol-treated Listeria monocytogenes was also decreased, by 96% (P < 0.05), when compared to untreated controls cultured similarly (0.171 ± 0.01 h-1). Maximum optical densities measured at 600 nm were lower (P < 0.05) in 2-nitro-1-propanol-treated cultures (0.053 ± 0.01, 0.205 ± 0.02 and 0.041 ± 0.01) than in untreated controls (0.483 ± 0.02, 0.523 ± 0.01 and 0.427 ± 0.01) for E. faecalis, S. aureus and L. monocytogenes, respectively. When tested against mixed microbial populations during anaerobic 24-h incubation of spoiled silage, significant effects of treatment with 1 mg 2-nitro-1-propanol/g (approximately 9.5 µmol/g) or 5 mg lauric acid/g (approximately 25 µmol/g) on populations of wildtype Enterococcus and Listeria were not observed. Mixed populations treated with 5 mg monolaurin/g (approximately 17 µmol/g) had lower (P < 0.05) viable cell counts of wildtype enterococci than untreated controls after 6 h of incubation (2.87 ± 1.03 versus 5.20 ± 0.25 log10 colony forming units/g, respectively), but otherwise significant effects of monolaurin were not observed. These results reveal differential susceptibility of multi-drug-resistant enterococci and staphylococci as well as L. monocytogenes to the inhibitory activity of 2-nitro-1-propanol and the medium-chain fatty acid lauric acid and its glycerol monoester monolaurin. Ultimately, these results may lead to improved treatment technologies to preserve the microbiological safety of silages and composts.
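
The growth rates quoted above (in h-1) follow from exponential-phase optical-density readings via mu = (ln OD2 - ln OD1) / (t2 - t1). The sketch below applies that formula with invented OD values, not the study's raw data, to show how a percent reduction relative to an untreated control is obtained.

```python
import math

def specific_growth_rate(od1, od2, t1, t2):
    """Specific growth rate in h^-1 from two exponential-phase OD600 readings."""
    return (math.log(od2) - math.log(od1)) / (t2 - t1)

mu_control = specific_growth_rate(0.05, 0.40, t1=2.0, t2=8.0)   # ~0.35 h^-1
mu_treated = specific_growth_rate(0.05, 0.08, t1=2.0, t2=8.0)   # ~0.08 h^-1
reduction = 100 * (1 - mu_treated / mu_control)                  # ~77% slower
print(f"control mu = {mu_control:.3f} h^-1, treated mu = {mu_treated:.3f} h^-1")
print(f"growth rate reduced by {reduction:.0f}%")
```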

Keywords: 2-nitro-1-propanol, lauric acid, monolaurin, gram positive bacteria

Procedia PDF Downloads 108
2272 Dynamic Distribution Calibration for Improved Few-Shot Image Classification

Authors: Majid Habib Khan, Jinwei Zhao, Xinhong Hei, Liu Jiedong, Rana Shahzad Noor, Muhammad Imran

Abstract:

Deep learning is increasingly employed in image classification, yet the scarcity and high cost of labeled data for training remain a challenge. Limited samples often lead to overfitting due to biased sample distribution. This paper introduces a dynamic distribution calibration method for few-shot learning. Initially, base and new class samples undergo normalization to mitigate disparate feature magnitudes. A pre-trained model then extracts feature vectors from both classes. The method dynamically selects distribution characteristics from base classes (both adjacent and remote) in the embedding space, using a threshold value approach for new class samples. Given the propensity of similar classes to share feature distributions like mean and variance, this research assumes a Gaussian distribution for feature vectors. Subsequently, distributional features of new class samples are calibrated using a corrected hyperparameter, derived from the distribution features of both adjacent and distant base classes. This calibration augments the new class sample set. The technique demonstrates significant improvements, with up to 4% accuracy gains in few-shot classification challenges, as evidenced by tests on miniImagenet and CUB datasets.
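
The simplified sketch below captures the distribution-calibration idea described above, though not the paper's exact dynamic-selection rule: statistics of the nearest base classes are transferred to a new-class support feature, a calibrated Gaussian is sampled to augment the support set, and the augmented set can then train a simple classifier. The dimensions, number of base classes, k, and alpha are illustrative assumptions.

```python
import numpy as np

def calibrate(support_x, base_means, base_vars, k=2, alpha=0.2):
    """Calibrate mean/variance for one support feature using its k nearest base classes."""
    d = np.linalg.norm(base_means - support_x, axis=1)
    idx = np.argsort(d)[:k]
    mean = np.vstack([base_means[idx], support_x[None]]).mean(axis=0)
    var = base_vars[idx].mean(axis=0) + alpha       # alpha spreads the distribution
    return mean, var

def augment(support_x, base_means, base_vars, n_samples=100, rng=None):
    rng = rng or np.random.default_rng(0)
    mean, var = calibrate(support_x, base_means, base_vars)
    return rng.normal(mean, np.sqrt(var), size=(n_samples, support_x.size))

# Toy usage: 5 base classes with 64-D feature statistics, one new-class sample.
rng = np.random.default_rng(0)
base_means = rng.normal(size=(5, 64))
base_vars = np.abs(rng.normal(0.5, 0.1, size=(5, 64)))
new_sample = rng.normal(size=64)
augmented = augment(new_sample, base_means, base_vars, rng=rng)
print(augmented.shape)    # (100, 64) calibrated features for classifier training
```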

Keywords: deep learning, computer vision, image classification, few-shot learning, threshold

Procedia PDF Downloads 64
2271 Identifying Pathogenic Mycobacterium Species Using Multiple Gene Phylogenetic Analysis

Authors: Lemar Blake, Chris Oura, Ayanna C. N. Phillips Savage

Abstract:

Improved DNA sequencing technology has greatly enhanced bacterial identification, especially for organisms that are difficult to culture. Mycobacteriosis, with consistent hyphema, bilateral exophthalmia, open-mouth gape and ocular lesions, was observed in various fish populations at the School of Veterinary Medicine, Aquaculture/Aquatic Animal Health Unit. Objective: To identify the species of Mycobacterium that is affecting aquarium fish at the School of Veterinary Medicine, Aquaculture/Aquatic Animal Health Unit. Method: A total of 13 fish samples were collected and analyzed via Ziehl-Neelsen staining, conventional polymerase chain reaction (PCR) and real-time PCR. These tests were carried out simultaneously for confirmation. The following combination of conventional primers was used: 16S rRNA (564 bp), rpoB (396 bp), and sod (408 bp). Concatenation of the gene fragments was carried out to phylogenetically classify the organism. Results: Acid-fast, non-branching bacilli were detected in all samples from homogenized internal organs. All 13 acid-fast samples were positive for Mycobacterium via real-time PCR. Partial gene sequences using all three primer sets were obtained from two samples and demonstrated novel strains. A strain 99% related to Mycobacterium marinum was also confirmed in one sample, using the 16S rRNA and rpoB genes. The two novel strains clustered with the rapid growers and with strains that are known to affect humans. Conclusions: Phylogenetic analysis demonstrated two novel Mycobacterium strains with the potential of being zoonotic and one strain 99% related to Mycobacterium marinum.

Keywords: polymerase chain reaction, phylogenetic, DNA sequencing, zoonotic

Procedia PDF Downloads 142
2270 Time Driven Activity Based Costing Capability to Improve Logistics Performance: Application in Manufacturing Context

Authors: Siham Rahoui, Amr Mahfouz, Amr Arisha

Abstract:

In a highly competitive environment characterised by uncertainty and disruptions, such as the recent COVID-19 outbreak, supply chains (SC) face the challenge of maintaining their costs at minimum levels while continuing to provide customers with high-quality products and services. More importantly, businesses in such an economic context strive to survive by keeping the cost of undertaken activities (such as logistics) low and in-house. To do so, managers need to understand the costs associated with different products and services in order to have a clear vision of SC performance, maintain profitability levels, and make strategic decisions. In this context, the SC literature has explored different costing models that seek to determine the costs of undertaking supply chain-related activities. While some cost accounting techniques have been extensively explored in the SC context, more contributions are needed to explore the potential of time-driven activity-based costing (TDABC). More specifically, more applications are needed in the manufacturing context of the SC, where the debate is ongoing. The aim of the study is to assess the capability of the technique to evaluate the operational performance of the logistics function. Through a case study methodology applied to a manufacturing company operating in the automotive industry, TDABC is used to evaluate the efficiency of the current configuration and its logistics processes. The study shows that monitoring process efficiency and cost efficiency leads to strategic decisions that contributed to improving the overall efficiency of the logistics processes.
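
The core TDABC calculation referred to above is: capacity cost rate = cost of resources supplied / practical capacity in time units, then each activity is costed as rate x unit time x volume, which also exposes unused capacity. The sketch below walks through that arithmetic with invented warehouse figures, not the case-study company's data.

```python
monthly_cost = 45000.0                            # cost of the logistics team and equipment
practical_capacity_min = 4 * 8 * 60 * 20 * 0.8    # 4 staff, 8 h/day, 20 days, 80% usable

rate = monthly_cost / practical_capacity_min      # cost per minute of capacity

activities = {                                    # activity: (minutes per unit, monthly volume)
    "receive pallet":  (6.0, 1200),
    "pick order line": (1.5, 9000),
    "load truck":      (20.0, 300),
}

used_minutes = 0.0
for name, (t_unit, volume) in activities.items():
    used_minutes += t_unit * volume
    print(f"{name:16s} cost = {rate * t_unit * volume:8.0f}")

print(f"capacity used: {100 * used_minutes / practical_capacity_min:.0f}% "
      f"(unused capacity cost = {rate * (practical_capacity_min - used_minutes):.0f})")
```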

Keywords: efficiency, operational performance, supply chain costing, time driven activity based costing

Procedia PDF Downloads 162
2269 Development of an Appropriate Method for the Determination of Multiple Mycotoxins in Pork Processing Products by UHPLC-TCFLD

Authors: Jason Gica, Yi-Hsieng Samuel Wu, Deng-Jye Yang, Yi-Chen Chen

Abstract:

Mycotoxins, harmful secondary metabolites produced by certain fungi species, pose significant risks to animals and humans worldwide. Their stable properties lead to contamination during grain harvesting, transportation, and storage, as well as in processed food products. The prevalence of mycotoxin contamination has attracted significant attention due to its adverse impact on food safety and global trade. The secondary contamination pathway from animal products has been identified as an important route of exposure, posing health risks for livestock and humans consuming contaminated products. Pork, one of the highly consumed meat products in Taiwan according to the National Food Consumption Database, plays a critical role in the nation's diet and economy. Given its substantial consumption, pork processing products are a significant component of the food supply chain and a potential source of mycotoxin contamination. This study is paramount for formulating effective regulations and strategies to mitigate mycotoxin-related risks in the food supply chain. By establishing a reliable analytical method, this research contributes to safeguarding public health and enhancing the quality of pork processing products. The findings will serve as valuable guidance for policymakers, food industries, and consumers to ensure a safer food supply chain in the face of emerging mycotoxin challenges. An innovative and efficient analytical approach is proposed using Ultra-High Performance Liquid Chromatography coupled with Temperature Control Fluorescence Detector Light (UHPLC-TCFLD) to determine multiple mycotoxins in pork meat samples due to its exceptional capacity to detect multiple mycotoxins at the lowest levels of concentration, making it highly sensitive and reliable for comprehensive mycotoxin analysis. Additionally, its ability to simultaneously detect multiple mycotoxins in a single run significantly reduces the time and resources required for analysis, making it a cost-effective solution for monitoring mycotoxin contamination in pork processing products. The research aims to optimize the efficient mycotoxin QuEChERs extraction method and rigorously validate its accuracy and precision. The results will provide crucial insights into mycotoxin levels in pork processing products.

Keywords: multiple-mycotoxin analysis, pork processing products, QuEChERs, UHPLC-TCFLD, validation

Procedia PDF Downloads 67
2268 Fundamentals of Mobile Application Architecture

Authors: Mounir Filali

Abstract:

Companies use many innovative ways to reach their customers to stay ahead of the competition. Along with the growing demand for innovative business solutions comes the demand for new technology. The most noticeable area of demand for business innovation is the mobile application industry. Recently, companies have recognized the growing need to integrate proprietary mobile applications into their suite of services; they have realized that developing mobile apps gives them a competitive edge. As a result, many have begun to rapidly develop mobile apps to stay ahead of the competition. Mobile application development helps companies meet the needs of their customers. Mobile apps also help businesses take advantage of every potential opportunity to generate leads that convert into sales. With the recent rise in demand for business-related mobile apps, there has been a similar rise in the range of mobile app solutions being offered. Today, companies can use the traditional route of a software development team to build their own mobile applications. However, there are also many platform-ready "low-code and no-code" mobile apps available to choose from. These mobile app development options offer more streamlined business processes and help companies be more responsive to their customers without having to be coding experts. Companies must have a basic understanding of mobile app architecture to attract and maintain the interest of mobile app users. Mobile application architecture refers to the building blocks or structural systems and design elements that make up a mobile application. It also includes the technologies, processes, and components used during application development. The elements of the mobile application architecture form the underlying foundation of all applications; developing a good mobile app architecture requires proper planning and strategic design. The technology framework or platform on the back end and the user-facing side of a mobile application are part of the application's mobile architecture. In application development, software programmers loosely refer to this set of mobile architecture systems and processes as the "technology stack."

Keywords: mobile applications, development, architecture, technology

Procedia PDF Downloads 104
2267 The Applications of Toyota Production System to Reduce Wastes in Agricultural Products Packing Process: A Study of Onion Packing Plant

Authors: P. Larpsomboonchai

Abstract:

Agro-industry is one of the major industries that has strong impacts on national economic income, growth, stability, and sustainable development. Moreover, this industry also has strong influences on social, cultural and political issues. Furthermore, this industry, producing primary and secondary products, is facing challenges from such diverse factors as demand inconsistency, intense international competition, technological advancements and new competitors. In order to maintain and improve the industry's competitiveness in both domestic and international markets, science and technology are key factors. Besides hard sciences and technologies, modern industrial engineering concepts such as Just in Time (JIT), Total Quality Management (TQM), Quick Response (QR), Supply Chain Management (SCM) and Lean can be very effective in supporting increased efficiency and effectiveness of these agricultural products on the world stage. Onion is one of Thailand's major export products, bringing in national income, but it also faces challenges in many ways. This paper focuses on the onion packing process and its related activities, such as storage and shipment, at one of the major packing plants and storage facilities in Mae Wang District, Chiang Mai, Thailand, by applying Toyota Production System (TPS) or Lean concepts to improve process capability throughout the entire packing and distribution process, which will be profitable for the whole onion supply chain. It will also be beneficial to other related agricultural products in Thailand and other ASEAN countries.

Keywords: packing process, Toyota Production System (TPS), lean concepts, waste reduction, lean in agro-industries activities

Procedia PDF Downloads 273
2266 Engineering Thermal-Hydraulic Simulator Based on Complex Simulation Suite “Virtual Unit of Nuclear Power Plant”

Authors: Evgeny Obraztsov, Ilya Kremnev, Vitaly Sokolov, Maksim Gavrilov, Evgeny Tretyakov, Vladimir Kukhtevich, Vladimir Bezlepkin

Abstract:

Over the last decade, a specific set of connected software tools and calculation codes has been gradually developed. It allows simulating I&C systems and thermal-hydraulic, neutron-physical and electrical processes in elements and systems of an NPP unit (initially with WWER (pressurized water reactor)). In 2012 it was named the complex simulation suite “Virtual Unit of NPP” (or CSS “VEB” for short). Proper application of this complex tool results in a complex coupled mathematical computational model, which, for a specific NPP design, is called the Virtual Power Unit (or VPU for short). A VPU can be used for comprehensive modelling of power unit operation, checking operator functions in a virtual main control room, and modelling complicated scenarios for normal modes and accidents. In addition, CSS “VEB” contains a combination of thermal-hydraulic codes: the best-estimate (two-fluid) calculation codes KORSAR and CORTES and the homogeneous calculation code TPP. Thus, to analyze a specific technological system, one can build thermal-hydraulic simulation models with different levels of detail, up to a nodalization scheme with real geometry. In some respects the result is similar to the notion of an “engineering/testing simulator” described by the European Utility Requirements (EUR) for LWR nuclear power plants. The paper is dedicated to a description of the tools mentioned above and an example of the application of the engineering thermal-hydraulic simulator to the analysis of the boric acid concentration in the primary coolant (changed by the make-up and boron control system).
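
For orientation only, the sketch below shows the textbook well-mixed mass balance behind a boric acid dilution/boration calculation of the kind analyzed with the simulator: V dC/dt = F (C_in - C), which integrates to C(t) = C_in + (C0 - C_in) exp(-F t / V). It is not the VEB nodalization or the KORSAR/CORTES models, and the coolant mass, make-up flow, and concentrations are illustrative assumptions.

```python
import math

def boron_concentration(t_s, c0, c_in, flow_kg_s, mass_kg):
    """Boric acid concentration (e.g. ppm) after t_s seconds of make-up/letdown."""
    return c_in + (c0 - c_in) * math.exp(-flow_kg_s * t_s / mass_kg)

primary_mass = 2.0e5        # kg of primary coolant (assumed)
makeup_flow = 10.0          # kg/s of make-up via the boron control system (assumed)
c0, c_in = 1200.0, 0.0      # dilution from 1200 ppm with unborated make-up

for hours in (1, 5, 10):
    c = boron_concentration(hours * 3600, c0, c_in, makeup_flow, primary_mass)
    print(f"after {hours:2d} h: {c:7.1f} ppm")
```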

Keywords: best-estimate code, complex simulation suite, engineering simulator, power plant, thermal hydraulic, VEB, virtual power unit

Procedia PDF Downloads 379
2265 HIV Incidence among Men Who Have Sex with Men Measured by Pooling Polymerase Chain Reaction, and Its Comparison with HIV Incidence Estimated by BED-Capture Enzyme-Linked Immunosorbent Assay and Observed in a Prospective Cohort

Authors: Mei Han, Jinkou Zhao, Yuan Yao, Liangui Feng, Xianbin Ding, Guohui Wu, Chao Zhou, Lin Ouyang, Rongrong Lu, Bo Zhang

Abstract:

The objective was to compare the HIV incidence estimated using the BED capture enzyme-linked immunosorbent assay (BED-CEIA) and observed in a cohort against the HIV incidence among men who have sex with men (MSM) measured by pooling polymerase chain reaction (pooling-PCR). A total of 617 MSM subjects were included in a respondent-driven sampling survey in Chongqing in 2008. Among the 129 who tested HIV antibody positive, 102 were defined as having long-term infection and 27 were assessed for recent HIV infection (RHI) using BED-CEIA. The remaining 488 HIV-negative subjects were enrolled in the prospective cohort and followed up every 6 months to monitor HIV seroconversion. All 488 HIV-negative specimens were assessed for acute HIV infection (AHI) using pooling-PCR. Among the 488 negative subjects in the open cohort, 214 (43.9%) were followed up for six months, with 107 person-years of observation, and 14 subjects seroconverted. The observed HIV incidence was 12.5 per 100 person-years (95% CI = 9.1-15.7). Among the 488 HIV-negative specimens, 5 were identified with acute HIV infection using pooling-PCR, at an annual rate of 14.02% (95% CI = 1.73-26.30). The estimated HIV-1 incidence based on BED-CEIA was 12.02% (95% CI = 7.49-16.56). The HIV incidence estimated with the three different approaches differed among subgroups. In this highly HIV-prevalent MSM population, it cost US$ 1724 to detect one AHI case, while detection of one case of RHI with the BED assay cost only US$ 42. The three approaches generated comparable and high HIV incidences; pooling-PCR and the prospective cohort are closer to the true level of incidence, while BED-CEIA seemed to be the most convenient and economical approach for evaluating HIV incidence in at-risk populations at the beginning of an HIV epidemic. HIV-1 incidences were alarmingly high among the MSM population in Chongqing, particularly within the subgroup under 25 years of age and among migrants aged 25 to 34 years.
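
The cohort estimate above is a person-time incidence (seroconversions per 100 person-years). The generic sketch below shows that calculation together with an exact Poisson confidence interval; the event count and person-years are placeholders, not the study's exact person-time ledger, and the study's own interval may use a different method.

```python
from scipy.stats import chi2

def incidence_per_100py(events, person_years, alpha=0.05):
    """Incidence rate per 100 person-years with an exact Poisson CI."""
    rate = 100.0 * events / person_years
    lo = 100.0 * chi2.ppf(alpha / 2, 2 * events) / 2 / person_years
    hi = 100.0 * chi2.ppf(1 - alpha / 2, 2 * (events + 1)) / 2 / person_years
    return rate, lo, hi

rate, lo, hi = incidence_per_100py(events=14, person_years=110.0)
print(f"incidence = {rate:.1f} per 100 person-years (95% CI {lo:.1f}-{hi:.1f})")
```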

Keywords: BED-CEIA, HIV, incidence, pooled PCR, prospective cohort

Procedia PDF Downloads 410
2264 Enhancement Method of Network Traffic Anomaly Detection Model Based on Adversarial Training With Category Tags

Authors: Zhang Shuqi, Liu Dan

Abstract:

To address problems in intelligent network anomaly traffic detection models, such as low detection accuracy caused by the lack of training samples and poor performance in small-sample attack detection, a classification model enhancement method, F-ACGAN (Flow Auxiliary Classifier Generative Adversarial Network), which introduces a generative adversarial network and adversarial training, is proposed. Generating adversarial data with category labels can enhance the training effect and improve classification accuracy and model robustness. F-ACGAN consists of three steps: feature preprocessing, which includes data type conversion, dimensionality reduction, normalization, etc.; the design of a generative adversarial network model with feature learning ability, in which the sample generation quality is improved through adversarial iterations between the generator and the discriminator; and the addition of an adversarial disturbance factor along the gradient direction of the classification model, to improve the diversity and antagonism of the generated data and to encourage the model to learn adversarial classification features. An experiment constructing a classification model with the UNSW-NB15 dataset shows that, with the F-ACGAN enhancement of the basic model, the classification accuracy improves by 8.09% and the F1 score improves by 6.94%.
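
The compact PyTorch sketch below shows the ACGAN-style core behind this idea: a generator conditioned on attack-class labels produces synthetic flow feature vectors, and a discriminator jointly predicts real/fake and the class label, so generated samples carry category tags for augmentation. The network sizes, feature dimension, and training details are illustrative, not the paper's F-ACGAN configuration, and the adversarial-disturbance step is omitted.

```python
import torch
import torch.nn as nn

FEAT_DIM, N_CLASSES, Z_DIM = 42, 10, 64   # e.g. preprocessed flow features (assumed)

class Generator(nn.Module):
    def __init__(self):
        super().__init__()
        self.embed = nn.Embedding(N_CLASSES, Z_DIM)
        self.net = nn.Sequential(nn.Linear(Z_DIM, 128), nn.ReLU(),
                                 nn.Linear(128, FEAT_DIM), nn.Tanh())
    def forward(self, z, labels):
        return self.net(z * self.embed(labels))       # label-conditioned noise

class Discriminator(nn.Module):
    def __init__(self):
        super().__init__()
        self.body = nn.Sequential(nn.Linear(FEAT_DIM, 128), nn.LeakyReLU(0.2))
        self.adv = nn.Linear(128, 1)                   # real / fake head
        self.cls = nn.Linear(128, N_CLASSES)           # auxiliary class head
    def forward(self, x):
        h = self.body(x)
        return self.adv(h), self.cls(h)

G, D = Generator(), Discriminator()
opt_g = torch.optim.Adam(G.parameters(), lr=2e-4)
opt_d = torch.optim.Adam(D.parameters(), lr=2e-4)
bce, ce = nn.BCEWithLogitsLoss(), nn.CrossEntropyLoss()

real_x = torch.rand(32, FEAT_DIM) * 2 - 1              # placeholder real flows in [-1, 1]
real_y = torch.randint(0, N_CLASSES, (32,))
for step in range(3):                                   # a few adversarial iterations
    z = torch.randn(32, Z_DIM)
    fake_y = torch.randint(0, N_CLASSES, (32,))
    fake_x = G(z, fake_y)

    # Discriminator: distinguish real/fake and classify both with the aux head.
    d_real_adv, d_real_cls = D(real_x)
    d_fake_adv, d_fake_cls = D(fake_x.detach())
    loss_d = (bce(d_real_adv, torch.ones_like(d_real_adv)) +
              bce(d_fake_adv, torch.zeros_like(d_fake_adv)) +
              ce(d_real_cls, real_y) + ce(d_fake_cls, fake_y))
    opt_d.zero_grad(); loss_d.backward(); opt_d.step()

    # Generator: fool the adversarial head while matching the requested class.
    g_adv, g_cls = D(fake_x)
    loss_g = bce(g_adv, torch.ones_like(g_adv)) + ce(g_cls, fake_y)
    opt_g.zero_grad(); loss_g.backward(); opt_g.step()

print("generated labelled flow batch:", fake_x.shape)   # used to augment training data
```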

Keywords: data imbalance, GAN, ACGAN, anomaly detection, adversarial training, data augmentation

Procedia PDF Downloads 102
2263 Migration Law in Republic of Panama

Authors: Ronel Solis, Leonardo Collado

Abstract:

Migration law in the Republic of Panama has been regulated mainly by the executive branch. This has created not only an institutional but also a social crisis, because the evolution of these norms has depended largely on the discretion of the government in office. The result is instability in immigration regulation, all the more so now with the migration crisis of which Panama is also a part. Different migration policies have been established. The most recent is that of the controlled migration flow, in which, for humanitarian reasons, migrants are moved from the border with Colombia to the border with Costa Rica. Unfortunately, such control is not enough, and in some cases unprotected migrants have been confined for months, their passports have been withheld, and no recognition of their rights is offered. The Inter-American Court of Human Rights has condemned Panama for the unfair detention of an irregular migrant who was held for two years in Panamanian prisons without having committed a crime and without access to a just defense; this is the case Vélez Loor vs. the Republic of Panama. Uncontrolled migration has been putting pressure on Panamanian public health services. The recent denunciation by HIV-related NGOs warning that hundreds of foreigners, several of them irregular migrants, receive expensive antiretroviral therapy in Panama is a serious matter. On the other hand, there are no border control posts with the Republic of Colombia, because it is a jungle area, and migrants there are exposed to arms and drug trafficking and, unfortunately, also to prostitution. Government entities such as the border police service have provided humanitarian support to migrants on the border with Colombia, even though this is not their administrative function, and various entities dispute who should address the crisis. Nevertheless, the government allocates few economic resources to solving this problem, especially given the recent mass migration of Venezuelans who have fled their country. The establishment of a migration code is necessary to bring uniformity to the recognition and application of migratory rights. In this way, dependence on the changing migration policies of successive Panamanian governments would be eliminated, and the rights of migrants and nationals would be guaranteed.

Keywords: executive branch, irregular migration, migration code, Republic of Panama

Procedia PDF Downloads 121
2262 Simulation and Characterization of Compact Magnetic Proton Recoil Spectrometer for Fast Neutron Spectra Measurements

Authors: Xingyu Peng, Qingyuan Hu, Xuebin Zhu, Xi Yuan

Abstract:

Neutron spectrometry has contributed much to the development of nuclear physics since 1932 and has also become an important tool in several other fields, notably nuclear technology, fusion plasma diagnostics, and radiation protection. Compared with neutron fluxes, neutron spectra can provide more detailed information on the internal physical processes of neutron sources such as fast neutron reactors, fusion plasmas, and fission-fusion hybrid reactors. However, high-performance neutron spectrometers are not widely available, as they require large and complex instrumentation. This work describes the development and characterization of a compact magnetic proton recoil (MPR) spectrometer for high-resolution measurements of fast neutron spectra. The compact MPR spectrometer features a large recoil angle, a small permanent analysis magnet, a short beam transport line, and a dual-purpose detector array for both steady-state and pulsed neutron spectrum measurements. A three-dimensional electromagnetic particle transport code was developed to simulate the response function of the spectrometer. Simulation results show that the performance of the spectrometer is mainly determined by the n-p recoil foil and the proton apertures, and an overall energy resolution of 3% is achieved for 14 MeV neutrons. Dedicated experiments using an alpha source and a mono-energetic neutron beam were carried out to verify the simulated response function of the compact MPR spectrometer. The experimental results agree well with the simulations, indicating that the simulation code is accurate and reliable. The compact MPR spectrometer described in this work is a valuable tool for fast neutron spectrum measurements on fission and fusion devices.
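
The measurement principle can be illustrated with the non-relativistic n-p elastic scattering relation E_p = E_n cos²θ, which relates the recoil proton energy to the incident neutron energy and the proton recoil angle. The sketch below is illustrative only; the 14 MeV value and the angles chosen are examples, not the instrument's design parameters.

```python
# Non-relativistic n-p elastic recoil kinematics: a neutron of energy E_n
# scattering off hydrogen transfers E_p = E_n * cos^2(theta) to the recoil
# proton, where theta is the recoil angle relative to the incident neutron.
import math

def recoil_proton_energy(e_neutron_mev: float, recoil_angle_deg: float) -> float:
    theta = math.radians(recoil_angle_deg)
    return e_neutron_mev * math.cos(theta) ** 2

for angle in (0.0, 10.0, 20.0, 30.0):
    ep = recoil_proton_energy(14.0, angle)
    print(f"recoil angle {angle:4.1f} deg -> proton energy {ep:5.2f} MeV")
```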

Keywords: neutron spectrometry, magnetic proton recoil spectrometer, neutron spectra, fast neutron

Procedia PDF Downloads 201
2261 Mobile App Architecture in 2023: Build Your Own Mobile App

Authors: Mounir Filali

Abstract:

Companies use many innovative ways to reach their customers and stay ahead of the competition, and the growing demand for innovative business solutions is accompanied by a demand for new technology. The most noticeable area of demand for business innovation is the mobile application industry. Companies have recognized the growing need to integrate proprietary mobile applications into their suite of services and have realized that developing mobile apps gives them a competitive edge; as a result, many have begun to develop mobile apps rapidly to stay ahead of the competition. Mobile application development helps companies meet the needs of their customers, and mobile apps help businesses take advantage of every potential opportunity to generate leads that convert into sales. With the recent rise in demand for business-related mobile apps, there has been a similar rise in the range of mobile app solutions on offer. Today, companies can take the traditional route of a software development team to build their own mobile applications, but there are also many platform-ready "low-code and no-code" mobile app options to choose from. These options streamline business processes and help companies respond to their customers without having to be coding experts. Companies must have a basic understanding of mobile app architecture to attract and maintain the interest of mobile app users. Mobile application architecture refers to the building blocks, structural systems, and design elements that make up a mobile application, together with the technologies, processes, and components used during application development. These architectural elements form the underlying foundation of every application, and developing a good mobile app architecture requires proper planning and strategic design. The technology framework or platform on the back end and the user-facing side of a mobile application are both part of its mobile architecture. In application development, software programmers loosely refer to this set of architectural systems and processes as the "technology stack", as illustrated in the sketch below.
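
As a language-agnostic illustration of the layered separation that mobile app architecture typically describes, the sketch below splits a toy app into presentation, business-logic, and data layers. The class and method names are placeholders rather than any specific framework's API.

```python
class DataLayer:
    """Data layer: persistence and remote APIs."""
    def __init__(self):
        self._orders = []

    def save_order(self, order: dict) -> None:
        self._orders.append(order)  # stand-in for a database or REST call

class BusinessLayer:
    """Business layer: domain rules, independent of the UI."""
    def __init__(self, data: DataLayer):
        self.data = data

    def place_order(self, item: str, qty: int) -> dict:
        if qty <= 0:
            raise ValueError("quantity must be positive")
        order = {"item": item, "qty": qty}
        self.data.save_order(order)
        return order

class PresentationLayer:
    """Presentation layer: what the user sees; delegates to the business layer."""
    def __init__(self, logic: BusinessLayer):
        self.logic = logic

    def on_buy_button_tapped(self, item: str, qty: int) -> str:
        order = self.logic.place_order(item, qty)
        return f"Ordered {order['qty']} x {order['item']}"

# Wiring the stack together, back end to front end.
ui = PresentationLayer(BusinessLayer(DataLayer()))
print(ui.on_buy_button_tapped("coffee", 2))
```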

Keywords: mobile applications, development, architecture, technology

Procedia PDF Downloads 100
2260 Understanding Cyber Kill Chains: Optimal Allocation of Monitoring Resources Using Cooperative Game Theory

Authors: Roy H. A. Lindelauf

Abstract:

Cyberattacks are complex processes consisting of multiple interwoven tasks conducted by a set of agents. Interdiction of and defense against such attacks often rely on cyber kill chain (CKC) models. A CKC is a framework that tries to capture the actions taken by a cyber attacker. There is a growing body of literature on CKCs; most of this work either a) describes the CKC with respect to one or more specific cyberattacks or b) discusses the tools and technologies used by the attacker at each stage of the CKC. Defenders, facing scarce resources, have to decide where to allocate those resources given the CKC and partial knowledge of the tools and techniques attackers use. In this presentation, CKCs are analyzed through the lens of covert projects, i.e., interrelated tasks that have to be conducted by agents (human and/or computer) with the aim of going undetected. Various aspects of covert project models have been studied extensively in the operations research and game theory literature; think, for instance, of resource-limited interdiction actions that maximally delay the completion time of a weapons project. This presentation investigates both cooperative and non-cooperative game-theoretic covert project models and elucidates their relation to CKC modelling. To view a CKC as a covert project, each step in the CKC is broken down into tasks, each player is capable of executing a subset of those tasks, and task interdependencies are represented by a schedule. Using multi-glove cooperative games, it is shown how a defender can optimize the allocation of scarce resources (what, where, and how to monitor) against an attacker scheduling a CKC. The study presents and compares several cooperative game-theoretic solution concepts as metrics for assigning resources to the monitoring of agents.
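
As a toy illustration of the glove-game machinery mentioned above, the sketch below computes Shapley values for a three-player glove game; such values are one example of a cooperative solution concept that could serve as a metric for prioritizing which agents to monitor. The game itself is a placeholder, not the model from the presentation.

```python
from itertools import permutations

players = ["A", "B", "C"]           # A and B hold left gloves; C holds a right glove
left, right = {"A", "B"}, {"C"}

def value(coalition: set) -> int:
    """Characteristic function: number of matched glove pairs in the coalition."""
    return min(len(coalition & left), len(coalition & right))

def shapley(players, value):
    """Average marginal contribution of each player over all join orders."""
    phi = {p: 0.0 for p in players}
    orders = list(permutations(players))
    for order in orders:
        coalition = set()
        for p in order:
            phi[p] += value(coalition | {p}) - value(coalition)
            coalition.add(p)
    return {p: v / len(orders) for p, v in phi.items()}

print(shapley(players, value))  # C, holding the scarce glove, gets the largest share
```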

Keywords: cyber defense, cyber kill chain, game theory, information warfare techniques

Procedia PDF Downloads 139
2259 Integrated Simulation and Optimization for Carbon Capture and Storage System

Authors: Taekyoon Park, Seokgoo Lee, Sungho Kim, Ung Lee, Jong Min Lee, Chonghun Han

Abstract:

CO2 capture and storage/sequestration (CCS) is a key technology for addressing the global warming issue. This paper proposes an integrated model for the whole chain of CCS, from a power plant to a reservoir. The integrated model is further utilized to determine optimal operating conditions and study responses to various changes in input variables.
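
As an illustration of coupling a chain simulation to an optimizer, the sketch below wraps placeholder surrogate functions for energy penalty and capture rate in a constrained optimization (assumes SciPy). The functions and variables are stand-ins, not the paper's integrated power-plant-to-reservoir model.

```python
from scipy.optimize import minimize

def energy_penalty(x):
    """Toy surrogate for the simulated energy cost of capture and compression.
    x = [solvent_flow, compressor_pressure] in normalized units."""
    solvent_flow, pressure = x
    return 0.8 * solvent_flow**2 + 0.5 * pressure**2 + 0.1 * solvent_flow * pressure

def capture_rate(x):
    """Toy surrogate for the simulated CO2 capture rate (fraction)."""
    solvent_flow, pressure = x
    return 1.0 - 0.5 / (1.0 + solvent_flow + 0.3 * pressure)

# Minimize the energy penalty while requiring at least 90% capture.
result = minimize(
    energy_penalty,
    x0=[1.0, 1.0],
    bounds=[(0.1, 5.0), (0.1, 5.0)],
    constraints=[{"type": "ineq", "fun": lambda x: capture_rate(x) - 0.9}],
)
print(result.x, capture_rate(result.x))
```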

Keywords: CCS, carbon dioxide, carbon capture and storage, simulation, optimization

Procedia PDF Downloads 348