Search results for: computer processing of large databases
3062 Next-Generation Laser-Based Transponder and 3D Switch for Free Space Optics in Nanosatellite
Authors: Nadir Atayev, Mehman Hasanov
Abstract:
Future spacecraft will require a structural change in the way data are transmitted, owing to the growing volume of data required for space communication. Current radio frequency (RF) communication systems already face a bottleneck in the volume of data sent to the ground segment because of their technological and regulatory constraints. To overcome these issues, free space optics (FSO) communication plays an important role in the integrated terrestrial-space network thanks to its advantages over traditional RF technology: significantly higher data rates, low cost, improved security, and inter-satellite free space links. FSO uses a laser beam as the optical signal carrier to establish satellite-to-ground and ground-to-satellite links. This approach requires high-speed and energy-efficient systems as a base platform for sending high-volume video and audio data. Nanosatellites, and in particular CubeSat platforms, offer considerable technical functionality at low cost and cover an important part of the space sector through their Low Earth Orbit applications, enabling networks built on different communication topologies. Along this research theme, the output parameters of the FSO optical communication transceiver subsystems on existing CubeSat platforms were surveyed, and, with a view to improving those parameters, a 3D optical switch and a laser-beam-controlled optical transponder were studied together with 2U CubeSat structural subsystems, their application in a Low Earth Orbit satellite network topology, and their functional performance and structural parameters.
Keywords: cubesat, free space optics, nano satellite, optical laser communication
Procedia PDF Downloads 89
3061 Enzyme Immobilization on Functionalized Polystyrene Nanofibers for Bioprocessing Applications
Authors: Mailin Misson, Bo Jin, Sheng Dai, Hu Zhang
Abstract:
Advances in biotechnology have brought a growing interest in enzyme applications for the development of green and sustainable bioprocesses. While enzymes are powerful biocatalysts, they cease to be economical when scaled up to large commercial operations. Immobilization technology, however, allows enzyme recovery and continuous reuse, which compensates for high operating costs. Supporting enzymes on nanostructured materials has been recognized as a promising approach to enhance their catalytic performance. High porosity, interconnectivity, and self-assembling behavior make nanofibers exciting candidates as enzyme carriers in bioreactor systems. In this study, nanofibers were successfully fabricated by electrospinning, optimizing the polymer concentration (10-30%, w/v), applied voltage (10-30 kV), and discharge distance (11-26 cm). Microscopic images confirmed homogeneous, well-aligned fibers. The nanofiber surface was modified with a strong oxidizing agent to facilitate biomolecule binding. Bovine serum albumin and β-galactosidase were employed as model biocatalysts and immobilized onto the oxidized surfaces through covalent binding. The maximum enzyme adsorption capacity of the modified nanofibers was 3000 mg/g, 3-fold higher than that of the unmodified counterpart (1000 mg/g). The highest immobilization yield was 80%, reaching saturation at an enzyme concentration of 2 mg/ml. The results indicate significantly higher activity retention by the enzyme bound to the modified nanofibers (80%) than by the nascent ones (60%), signifying excellent enzyme-nanocarrier biocompatibility. The immobilized enzyme was further used for the bioconversion of dairy wastes into value-added products. This study demonstrates the great potential of acid-modified electrospun polystyrene nanofibers as enzyme carriers.
Keywords: immobilization, enzyme, nanocarrier, nanofibers
Procedia PDF Downloads 293
3060 Ripening Conditions Suitable for Marketing of Winter Squash ‘Bochang’
Authors: Do Su Park, Sang Jun Park, Cheon Soon Jeong
Abstract:
This study investigated the optimum ripening conditions for the marketing of winter squash. The sample cultivar 'Bochang' was grown at Hongcheon in Gangwon province in August 2014. For ripening, the samples were stored at 25℃, 30℃, and 35℃ at RH 70 ± 5% and checked every 3 days for 21 days. Respiration rate, water loss, hardness, coloration, and the contents of soluble solids, starch, and total sugar were evaluated. Respiration rate decreased in all treatments as the storage period lengthened. Water loss increased with temperature, reaching 13% at 35℃ on the 21st day. At 25℃ and 30℃, hardness was initially 47 N and decreased only slightly over the 21 days of ripening, whereas at 35℃ it fell markedly. Soluble solids content increased with longer ripening: at 30℃ and 35℃ it peaked at 15 days, while at 25℃ it was highest on the 21st day; the higher the temperature, the higher the soluble solids content. Coloration at 25℃ and 30℃ increased rapidly until day 12 of ripening and did not differ between these two temperatures, whereas at 35℃ it continued to increase up to 21 days. At 35℃, however, appearance quality declined, with yellowing of the pericarp occurring from day 9 of ripening onward. The coloration of the flesh increased until day 9 and decreased thereafter, with no significant difference among temperatures. The higher the temperature, the lower the starch content: at 30℃ and 35℃, starch decreased with longer storage, while at 25℃ it changed little. Total sugar increased in all treatments with longer storage, and the higher the temperature, the higher the total sugar content.
Therefore, ripening at 25℃ for 18-21 days or at 30℃ for 12-15 days is suitable.
Keywords: marketing, ripening, temperature, winter squash
Procedia PDF Downloads 598
3059 Strategic Business Solutions for an Ageing SME
Authors: N. G. Teik Hiang, Fathyah Hashim
Abstract:
This is a case of how strategic management techniques can be used to help resolve the problems faced by an ageing Small and Medium Enterprise (SME). Resolving problems strategically proved possible in this case, despite the common view that strategic management is useful mostly for large corporations. SMEs can also use strategic management to run their business and determine their future course of action in order to survive in an ever more competitive world. Strategic orientation is the key to the survival and development of small and medium enterprises. To adapt to fierce market competition, ageing SMEs should improve their competitiveness and operational efficiency. They must therefore develop a sense of strategic management, improve their strategic management skills, draw on their own unique characteristics, and work out practical strategies to build core competitiveness so as to remain sustainable. In this case, the internal strengths and weaknesses of an SME were identified. Strategic internal and external factors were classified and then used to formulate potential strategies to counter the various problems the SME faced. These strategies were further matched to exploit the opportunities, overcome the weaknesses, and minimize the threats. Tan, a consultant given the opportunity to formulate a plan for the business, started with environmental scanning (internal and external environmental analysis), assessed the company's strengths and weaknesses, and then generated, analyzed, and evaluated strategies. He held numerous discussions with the owner of the business and senior management in order to match the key internal and external factors and formulate alternative strategies for solving the problems the company was facing.
Some of the recommendations and solutions were inspired by the owner of the business, a very enterprising and experienced businessman.
Keywords: strategic orientation, strategic management, SME, core competitiveness, sustainable
Procedia PDF Downloads 419
3058 An Adaptive Conversational AI Approach for Self-Learning
Authors: Airy Huang, Fuji Foo, Aries Prasetya Wibowo
Abstract:
In recent years, the focus of Natural Language Processing (NLP) development has gradually shifted from semantics-based approaches to deep learning, which performs faster with fewer resources. Although it performs well in many applications, the deep learning approach, lacking semantic understanding, has difficulty recognizing and expressing a novel business case outside a pre-defined scope. Meeting the requirements of specific robotic services with deep learning alone is therefore labor-intensive and time-consuming: it is very difficult to improve the capabilities of a conversational AI in a short time, and even more difficult for it to self-learn from experience to deliver the same service in a better way. In this paper, we present an adaptive conversational AI algorithm that combines semantic knowledge and deep learning to address this issue by learning new business cases through conversations. After self-learning from experience, the robot adapts to business cases originally out of scope. The idea is to build new or extended robotic services in a systematic, fast-training manner with self-configured programs and constructed dialog flows. For every cycle in which the chatbot (conversational AI) delivers a given set of business cases, it self-measures its performance and reconsiders every unknown dialog flow, improving the service by retraining on those new business cases. If the training process reaches a bottleneck, human personnel are informed and can retrain the chatbot with newly configured programs or new dialog flows for new services. One approach employs semantic analysis to learn the dialogues for new business cases and then establish the necessary ontology for the new service.
With the newly learned programs, the chatbot completes its understanding of the reaction behavior and finally uses dialog flows to connect all the understanding results and programs, achieving the goal of the self-learning process. We developed a chatbot service mounted on a kiosk, with a camera for facial recognition and a directional microphone array for voice capture; the chatbot serves as a concierge offering polite conversation to visitors. As a proof of concept, we demonstrated completion of 90% of reception services with limited self-learning capability.
Keywords: conversational AI, chatbot, dialog management, semantic analysis
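The cycle the abstract describes — deliver, self-measure, retrain on unknown dialog flows, escalate to a human when retraining stalls — can be sketched in plain Python. This is a hedged illustration, not the authors' implementation: the function name, the coverage threshold, and the "observed at least twice" learnability rule are all invented assumptions.

```python
# Illustrative sketch of one self-learning cycle (all names and thresholds
# are assumptions, not the paper's actual algorithm).
def self_learning_cycle(known_flows, observed_dialogs, accuracy_threshold=0.9):
    """Measure coverage, retrain on unknown flows seen often enough to
    learn from, and escalate to a human when nothing is learnable."""
    unknown = [d for d in observed_dialogs if d not in known_flows]
    coverage = 1 - len(unknown) / len(observed_dialogs)
    if coverage >= accuracy_threshold:
        return known_flows, "ok", coverage
    # stand-in for retraining: only flows observed at least twice are learnable
    learnable = {d for d in unknown if unknown.count(d) >= 2}
    if not learnable:
        return known_flows, "escalate_to_human", coverage
    return known_flows | learnable, "retrained", coverage

flows = {"greet", "directions"}
dialogs = ["greet", "visitor_badge", "visitor_badge", "directions"]
flows, status, cov = self_learning_cycle(flows, dialogs)
print(status, sorted(flows))
```

On this toy trace the bot learns the repeated "visitor_badge" flow; a single unlearnable flow would instead trigger escalation to a human operator.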
Procedia PDF Downloads 136
3057 Cable De-Commissioning of Legacy Accelerators at CERN
Authors: Adya Uluwita, Fernando Pedrosa, Georgi Georgiev, Christian Bernard, Raoul Masterson
Abstract:
CERN is an international organisation funded by 23 member states that provides the particle physics community with excellence in particle accelerators and related facilities. Founded in 1954, CERN operates a wide range of accelerators that allow groundbreaking science to be conducted. Accelerators bring particles to high energies and make them collide with each other or with fixed targets, creating conditions of high interest to physicists. A chain of accelerators ramps up the energy of the particles and eventually injects them into the largest and most recent machine: the Large Hadron Collider (LHC). Among this chain is, for instance, the Proton Synchrotron, which started up in 1959 and is still in operation. These machines, called "injectors", keep evolving over time, as does the related infrastructure. Massive decommissioning of obsolete cables started at CERN in 2015 within the so-called "injectors de-cabling project phase 1". Its goal was to replace ageing cables and remove unused ones, freeing space for the new cables needed for upgrade and consolidation campaigns. To proceed with the de-cabling, a project coordination team was assembled. The start of this project led to the investigation of legacy cables throughout the organisation; identifying cables stacked over half a century proved arduous. Phase 1 of the injectors de-cabling ran for 3 years and succeeded after overcoming some difficulties. Phase 2, started 3 years later, focused on improving safety and structure by introducing a quality assurance procedure. This paper discusses the implementation of that quality assurance procedure throughout phase 2 of the project and the transition between the two phases. Hundreds of kilometres of cable were removed in the injectors complex at CERN from 2015 to 2023.
Keywords: CERN, de-cabling, injectors, quality assurance procedure
Procedia PDF Downloads 93
3056 Intercropping Immature Oil Palm (Elaeis guineensis) with Banana, Ginger and Turmeric in Galle District, Sri Lanka
Authors: S. M. Dissanayake, I. R. Palihakkara, K. G. Premathilaka
Abstract:
Oil palm (Elaeis guineensis) is the world's leading vegetable-oil-producing plant and is well established as a perennial plantation crop in tropical countries. In Sri Lanka, oil palm has spread over 10,000 hectares in the wet zone of the island. In immature plantations, land productivity can be increased with selected intercrops: at the immature stage (up to 3-5 years of age), a large amount of free space is available within the plantation. This study attempts to determine the suitability of different intercrops during the immature phase of oil palm. A field experiment is being conducted at Thalgaswella estate (WL2a) in Galle district, Sri Lanka, with the objective of evaluating and recommending suitable immature oil-palm-based intercropping systems. The experiment was established in a randomized complete block design (RCBD) with four treatments, including a control, in three replicates. Banana, ginger, and turmeric were selected as intercrops. Growth parameters of the intercrops (plant height, length and width of the D-leaf, and yield) and the girth, length, and number of leaflets of the 17th frond of the oil palms were recorded at two-month intervals. In addition, chlorophyll content was measured in both the intercrops and the oil palm trees, and soil chemical parameters were measured annually. Results were statistically analyzed with SAS software. They revealed that the intercropped banana, turmeric, and ginger yielded 7.61 Mt/ha, 4.92 Mt/ha, and 4.53 Mt/ha, respectively, corresponding to 16.9%, 24.6%, and 30.2% of their respective monocrop yields. The results of this study could be used to frame appropriate policies to increase unit land productivity in oil palm plantations in the low country wet zone (WL2a) of Sri Lanka.
Keywords: inter-cropping, oil palm, policies, mono-crop, land productivity
Procedia PDF Downloads 159
3055 Ability of Gastric Enzyme Extract of Adult Camel to Clot Bovine Milk
Authors: Boudjenah-Haroun Saliha, Isselnane Souad, Nouani Abdelwahab, Baaissa Babelhadj, Mati Abderrahmane
Abstract:
Algeria is experiencing significant development of its dairy sector: consumption of milk and milk products increased from 2.7 million tons in 2008 to 4.4 million tons in 2013, and cheese production reached 1640 tons in 2014, with an average consumption of 0.7 kg/person/year. Although rennet is still the coagulating enzyme most used in cheese making, its production faces a growing worldwide shortage. This shortage is primarily due to the steadily increasing production and consumption of cheese and the inability to increase rennet production in parallel, and it has caused very large fluctuations in price. To overcome these obstacles, much research has been undertaken to find effective and competitive substitutes for industrial use. A locally produced rennet substitute is therefore desirable: it would allow a permanent supply with limited dependence on imports and price fluctuations. Investigations by our research team showed that coagulant extracts from the stomachs of older camels have a higher coagulating power than those from younger camels. The objective of this work is to study the possibility of substituting commercial rennet with gastric enzymes from adult camels for the coagulation of bovine milk. The crude camel coagulant extracts obtained were characterized by their protein content and their clotting and proteolytic activities. The conditions for clotting milk with these extracts were optimized, and the clotting time of milk treated with the enzyme preparations under different conditions was calculated, with bovine rennet used for comparison. The results show that crude gastric extracts from adult camels can be a good substitute for bovine rennet.
Keywords: Algeria, camel, cheese, coagulation, gastric extracts, milk
Procedia PDF Downloads 441
3054 Model of Community Management for Sustainable Utilization
Authors: Luedech Girdwichai, Withaya Mekhum
Abstract:
This research intended to develop a model of community management for sustainable utilization by investigating two population groups: family heads and the community management teams. The former consisted of family heads from 511 families in 12 areas, who completed questionnaires, of which 479 sets were returned. The latter consisted of the community management teams of the 12 areas, with one representative from each area interviewed. The questionnaire for the family heads had two main parts. The first covered general information, such as occupation, in checklist form. The second covered self-reliant community development based on the 4P framework, i.e., People (human resource) development, Place (area) development, Product (economy and income source) development, and Plan (community plan) development, in rating-scale form. Data from the first part were summarized as frequencies and percentages, while those from the second part were analyzed for the arithmetic mean and SD. Data from the community management teams were derived from focus groups, to find the factors behind successful management, together with in-depth interviews analyzed by descriptive statistics. The results showed that the 479 family heads rated the implementation of community plans for self-reliant community activities, based on the Sufficiency Economy Philosophy and the 4Ps, at an average of 3.28, a moderate level. In detail, the highest-rated aspect was area development, with a mean of 3.71 (high level), followed by human resource development at 3.44 (moderate) and economy and income source development at 3.09 (moderate); the lowest was community plan development, at 2.89.
The small group discussions revealed the following factors and guidelines for successful community management: 1) for People (human resource) development, a project to support and develop community leaders; 2) for Place (area) development, the development of conservation tourism areas; 3) for Product (economy and income source) development, the promotion by community leaders of occupational groups, saving groups, and product-processing groups; and 4) for Plan (community plan) development, prioritization through public hearings.
Keywords: model of community management, sustainable utilization, family heads, community management team
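The rating-scale analysis above reduces to computing a mean and SD per aspect and mapping the mean onto a verbal level. A minimal sketch follows; the sample responses are invented, and the band edges are an assumption (the abstract calls 3.44 "moderate" and 3.71 "high", consistent with a cut near 3.50).

```python
import statistics

# Hedged sketch: summarizing 5-point rating-scale responses. Band edges
# (2.50 and 3.50) are assumed, not taken from the paper.
def summarize(ratings):
    mean = statistics.mean(ratings)
    sd = statistics.stdev(ratings)
    if mean > 3.50:
        level = "high"
    elif mean > 2.50:
        level = "moderate"
    else:
        level = "low"
    return round(mean, 2), round(sd, 2), level

responses = [4, 3, 3, 4, 5, 3, 4, 3, 3, 5]  # invented sample, not study data
print(summarize(responses))
```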
Procedia PDF Downloads 340
3053 Evaluating the Capability of the Flux-Limiter Schemes in Capturing the Turbulence Structures in a Fully Developed Channel Flow
Authors: Mohamed Elghorab, Vendra C. Madhav Rao, Jennifer X. Wen
Abstract:
Turbulence modelling is still evolving, and efforts continue to improve and develop numerical methods that simulate real turbulence structures using empirical and experimental information. Monotonically integrated large eddy simulation (MILES) is an attractive approach to modelling turbulence in high-Reynolds-number flows, based on solving the unfiltered flow equations with no explicit sub-grid scale (SGS) model. In the current work, this approach has been used, with the action of the SGS model included implicitly through the intrinsic nonlinear high-frequency filters built into the convection discretization schemes. The MILES solver is developed using the open-source CFD libraries of OpenFOAM. The role of the flux limiter schemes, namely Gamma, superBee, van Albada, and van Leer, is studied in predicting turbulent statistical quantities for a fully developed channel flow at a friction Reynolds number Reτ = 180, and the numerical predictions are compared with well-established Direct Numerical Simulation (DNS) results for wall-generated turbulence. The numerical predictions show that the Gamma, van Leer, and van Albada limiters produce more diffusion and overpredict the velocity profiles, while the superBee scheme reproduces velocity profiles and turbulence statistics in good agreement with the reference DNS data in the streamwise direction, although it deviates slightly in the spanwise and wall-normal directions. The simulation results are further discussed in terms of turbulence intensities and Reynolds stresses averaged in time and space, to draw conclusions on the performance of the flux limiter schemes in the OpenFOAM context.
Keywords: flux limiters, implicit SGS, MILES, OpenFOAM, turbulence statistics
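For reference, three of the limiters compared above can be written as functions of the successive-gradient ratio r. This is a minimal sketch of the standard textbook forms, not of OpenFOAM's implementation; the Gamma limiter is an OpenFOAM-specific blended scheme and is omitted here.

```python
# Classical TVD flux limiters psi(r), in their standard textbook forms.
def superbee(r):
    return max(0.0, min(2.0 * r, 1.0), min(r, 2.0))

def van_leer(r):
    return (r + abs(r)) / (1.0 + abs(r))

def van_albada(r):
    return (r * r + r) / (r * r + 1.0) if r > 0 else 0.0

# All TVD limiters return 1 at r = 1 (smooth data -> second-order accuracy),
# and superBee hugs the upper boundary of the TVD region, which is why it
# is the least diffusive of the three.
for psi in (superbee, van_leer, van_albada):
    print(psi.__name__, psi(1.0))
```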
Procedia PDF Downloads 190
3052 Navigating through Organizational Change: TAM-Based Manual for Digital Skills and Safety Transitions
Authors: Margarida Porfírio Tomás, Paula Pereira, José Palma Oliveira
Abstract:
Robotic grasping is advancing rapidly, but transferring techniques from rigid to deformable objects remains a challenge. Deformable and flexible items, such as food containers, demand nuanced handling due to their changing shapes. Bridging this gap is crucial for applications in food processing, surgical robotics, and household assistance. AGILEHAND, a Horizon project, focuses on developing advanced technologies for sorting, handling, and packaging soft and deformable products autonomously. These technologies serve as strategic tools to enhance flexibility, agility, and reconfigurability within the production and logistics systems of European manufacturing companies. Key components include intelligent detection, self-adaptive handling, efficient sorting, and agile, rapid reconfiguration. The overarching goal is to optimize work environments and equipment, ensuring both efficiency and safety. As new technologies enter the food industry, there will be implications for the labour force, for safety, and for the acceptance of the new technologies. To address these implications, AGILEHAND emphasizes the integration of the social sciences and humanities, for example through the application of the Technology Acceptance Model (TAM). The project aims to create a change management manual that will outline strategies for developing digital skills and managing health and safety transitions; it will also provide best practices and models for organizational change. Additionally, AGILEHAND will design effective training programs to enhance employee skills and knowledge. This information will be obtained through a combination of case studies, structured interviews, questionnaires, and a comprehensive literature review. The project will explore how organizations adapt during periods of change and identify the factors influencing employee motivation and job satisfaction.
This project received funding from the European Union's Horizon 2020/Horizon Europe research and innovation programme under grant agreement No 101092043 (AGILEHAND).
Keywords: change management, technology acceptance model, organizational change, health and safety
Procedia PDF Downloads 45
3051 The Influence of Nyerere in Integrating Ubuntu Knowledge and Social Work in Tanzania – A Literature Review
Authors: Meinrad Haule Lembuka
Abstract:
Ubuntu is an African philosophy and model meaning 'humanity to others', or 'care for others' needs because of the guiding principle of interdependence', that embraces collective and holistic efforts in development with a human face. This study uses a literature review method to reflect on Julius Nyerere's contributions to realizing Ubuntu in social work practice. Nyerere strove to restore African development through the lens of humanism, upholding the values of solidarity, communal participation, compassion, care, and justice; he later founded developmental social work through the Ujamaa model, education for self-reliance, and African dignity. Nyerere stood against post-colonial syndromes through an African socialism that envisioned the values and principles of social work: social justice, human dignity, social change, and social development. He also served the primary mission of the social work profession, to enhance human wellbeing and help meet the basic needs of all people, with particular attention to the needs and empowerment of those who are vulnerable, oppressed, and living in poverty, through the Ubuntu practice of equal distribution of resources. Nyerere further endorsed a social work legal framework embracing universal human rights: service, equality, social justice, human dignity, the importance of human relationships, integrity, and competence. Nyerere proved that an Indigenous model can work with a formal system such as the social work profession; in 2014, the National Heritage Council of South Africa (NHC) honoured him with an African Ubuntu Champion award. Nyerere can rightly be upheld as an ambassador of social work through his remarkable contributions to developmental social work (the Ujamaa model), social change, human dignity, equality, social unity, and social justice in Africa and the globe at large.
Keywords: ubuntu, indigenous knowledge, indigenous social work, ubuntu social work
Procedia PDF Downloads 103
3050 INCIPIT-CRIS: A Research Information System Combining Linked Data Ontologies and Persistent Identifiers
Authors: David Nogueiras Blanco, Amir Alwash, Arnaud Gaudinat, René Schneider
Abstract:
At a time when access to and sharing of information are crucial in the world of research, technologies such as persistent identifiers (PIDs), Current Research Information Systems (CRIS), and ontologies can create platforms for information sharing, provided they meet the need to disambiguate their data by assuring interoperability within and between systems. INCIPIT-CRIS is a continuation of the former INCIPIT project, whose goal was to set up an infrastructure for low-cost attribution of PIDs with high granularity based on Archival Resource Keys (ARKs). INCIPIT-CRIS can be seen as its logical consequence and proposes a research information management system developed from scratch. The system is built on and around the Schema.org ontology, with a further articulation of the use of ARKs, and rests upon the previously implemented INCIPIT infrastructure in order to enhance the persistence of URIs. INCIPIT-CRIS thus aims to be the hinge between the previously separate aspects of CRIS, ontologies, and PIDs, producing a powerful system that resolves disambiguation problems by combining an ontology such as Schema.org with unique persistent identifiers such as ARKs, enabling information sharing through a dedicated platform as well as system interoperability by representing all the data as RDF triples. This paper presents the implemented solution and its simulation in real life. We describe the underlying ideas and inspirations while going through the logic and the different functionalities implemented and their links with ARKs and Schema.org. Finally, we discuss the tests performed with our project partner, the Swiss Institute of Bioinformatics (SIB), using large, real-world data sets.
Keywords: current research information systems, linked data, ontologies, persistent identifier, schema.org, semantic web
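The combination the abstract describes — Schema.org terms as predicates, ARK-based URIs as subjects, everything expressed as RDF triples — can be illustrated in a few lines of pure Python. This is a hedged sketch, not INCIPIT-CRIS code: the person's name, affiliation, and the ARK shown are invented placeholders.

```python
# Minimal sketch: a CRIS record as RDF triples, with Schema.org terms as
# predicates and an ARK-style persistent identifier as the subject URI.
SDO = "https://schema.org/"
RDF_TYPE = "http://www.w3.org/1999/02/22-rdf-syntax-ns#type"

def ntriple(s, p, o):
    """Serialize one triple in N-Triples syntax (URI vs literal object)."""
    obj = f"<{o}>" if o.startswith("http") else f'"{o}"'
    return f"<{s}> <{p}> {obj} ."

person = "https://n2t.net/ark:/99999/fk4-example"  # hypothetical ARK
triples = [
    (person, RDF_TYPE, SDO + "Person"),
    (person, SDO + "name", "Jane Doe"),
    (person, SDO + "affiliation", "Example Institute"),
]
doc = "\n".join(ntriple(*t) for t in triples)
print(doc)
```

Because every statement is a plain subject-predicate-object triple keyed by a persistent URI, the same data can be loaded into any RDF store or merged with another system's triples without renaming, which is the interoperability argument the project makes.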
Procedia PDF Downloads 135
3049 Uterine Cervical Cancer: Early Treatment Assessment with T2- and Diffusion-Weighted MRI
Authors: Susanne Fridsten, Kristina Hellman, Anders Sundin, Lennart Blomqvist
Abstract:
Background: Patients diagnosed with locally advanced cervical carcinoma are treated with definitive concomitant chemoradiotherapy. Treatment fails in 30-50% of patients, who then have a very poor prognosis. The treatment is standardized, with a risk of both over- and undertreatment; consequently, there is a great need for biomarkers that can predict therapy outcome and allow individualized treatment. Aim: To explore the role of T2- and diffusion-weighted magnetic resonance imaging (MRI) for early prediction of therapy outcome, and to find the optimal time point for assessment. Methods: A pilot study including 15 patients with cervical carcinoma stage IIB-IIIB (FIGO 2009) undergoing definitive chemoradiotherapy. All patients underwent MRI four times: at baseline, and 3 weeks, 5 weeks, and 12 weeks after treatment started. Tumour size, size change (∆size), visibility on diffusion-weighted imaging (DWI), apparent diffusion coefficient (ADC), and change of ADC (∆ADC) at the different time points were recorded. Results: 7/15 patients relapsed during the study period (referred to as "poor prognosis", PP); the remaining eight patients are referred to as "good prognosis", GP. Tumour size was larger at all time points for PP than for GP, while ∆size between any two time points was the same for the PP and GP patients. The sensitivity and specificity for predicting the prognostic group from a remaining tumour on DWI were highest at 5 weeks: 83% (5/6) and 63% (5/8), respectively. The combination of tumour size at baseline and remaining tumour on DWI at 5 weeks reached an area under the curve (AUC) of 0.83 in ROC analysis. After 12 weeks, no remaining tumour was seen on DWI among the GP patients, as opposed to 2/7 of the PP patients. Adding ADC to the tumour size measurements did not improve the predictive value at any time point.
Conclusion: A large tumour at baseline MRI combined with a remaining tumour on DWI at 5 weeks predicted a poor prognosis.
Keywords: chemoradiotherapy, diffusion-weighted imaging, magnetic resonance imaging, uterine cervical carcinoma
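The ROC analysis behind the reported AUC of 0.83 amounts to scoring each patient on a combination of the two markers and ranking poor-prognosis against good-prognosis cases. A minimal sketch follows; the patient data and the additive scoring rule are invented illustrations, not the study's measurements or model.

```python
# Hedged sketch: ROC AUC via the Mann-Whitney U statistic, i.e. the
# probability that a random positive outranks a random negative (ties 0.5).
def auc(scores, labels):
    pos = [s for s, y in zip(scores, labels) if y == 1]
    neg = [s for s, y in zip(scores, labels) if y == 0]
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

# Toy data (invented): label 1 = poor prognosis;
# combined score = baseline size (cm) + 3 * (remaining tumour on DWI at 5 wk)
sizes = [5.1, 6.0, 4.2, 3.0, 2.5, 3.8, 5.5, 2.9]
dwi   = [1,   1,   1,   0,   0,   1,   1,   0]
label = [1,   1,   0,   0,   0,   1,   1,   0]
score = [s + 3 * d for s, d in zip(sizes, dwi)]
print(round(auc(score, label), 2))
```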
Procedia PDF Downloads 143
3048 A Preliminary Comparative Study Between the United Kingdom and Taiwan: Public Private Collaboration and Cooperation in Tackling Large Scale Cyberattacks
Authors: Chi-Hsuan Cheng
Abstract:
This research aims to evaluate public-private partnerships against cyberattacks by comparing the UK and Taiwan. First, the study analyses major cyberattacks and the factors influencing cybersecurity in both countries. Second, it assesses the effectiveness of current cyber defence strategies in combating cyberattacks by comparing the approaches taken in the UK and Taiwan, while also evaluating the cyber resilience of both nations. Lastly, the research evaluates existing public-private partnerships by comparing those in the UK and Taiwan, and proposes recommendations for enhancing cooperation and collaboration mechanisms in tackling cyberattacks. Grounded theory serves as the core research method. Theoretical sampling is used to recruit participants in both the UK and Taiwan, including investigators, police officers, and professionals from cybersecurity firms. Semi-structured interviews are conducted in English in the UK and Mandarin in Taiwan, recorded with consent, and pseudonymised for privacy. Data analysis involves open coding, grouping excerpts into codes, and categorising the codes; axial coding then connects the codes into categories, leading to the development of a codebook. The process continues iteratively until theoretical saturation is reached. Finally, selective coding identifies the core topic: evaluating public-private cooperation against cyberattacks and its implications for social and policing strategies in the UK and Taiwan, highlighting the current status of the cybersecurity industry, governmental plans for cybersecurity, and the contributions to cybersecurity from both government sectors and cybersecurity firms, with a particular focus on public-private partnerships. In summary, this research aims to offer practical recommendations to law enforcement, the private sector, and academia for reflecting on current strategies and tailoring future approaches in cybersecurity.
Keywords: cybersecurity, cybercrime, public private partnerships, cyberattack
Procedia PDF Downloads 75
3047 A Self-Study of the Facilitation of Science Teachers’ Action Research
Authors: Jawaher A. Alsultan, Allen Feldman
Abstract:
With the rapid switch to remote learning due to the COVID-19 pandemic, science teachers were suddenly required to teach their classes online. This breakneck shift to eLearning raised the question of how teacher educators could support science teachers who wanted to use reform-based methods of instruction while using virtual technologies. In this retrospective self-study, we, two science teacher educators, examined our practice as we worked with science teachers to implement inquiry, discussion, and argumentation [IDA] through eLearning. Ten high school science teachers from a large school district in the southeastern US participated virtually in the COVID-19 Community of Practice [COVID-19 CoP]. The CoP met six times from the end of April through May 2020 via Zoom. Its structure was based on a model of action research called enhanced normal practice [ENP], which includes exchanging stories, trying out ideas, and systematic inquiry. Data sources included teacher educators' meeting notes and reflective conversations, audio recordings of the CoP meetings, teachers' products, and post-interviews of the teachers. Findings included a new understanding of the role of existing relationships, shared goals, and similarities in the participants' situations, all of which helped build trust in the CoP; our attention to the science teachers’ needs led to a well-functioning CoP. In addition, we became aware of gaps in our knowledge of how the teachers already used apps in their practice; the teachers then shared with the group how those apps could be used for online teaching with IDA. We also identified the need to pay attention to feelings about tensions between the teachers and us around the expectations for final products and the project's primary goals.
We found that if we are to establish relationships between us as facilitators and teachers that are honest, fair, and kind, we must express those feelings within the collective, dialogical processes that can lead to learning by all members of the CoP, whether virtual or face-to-face.
Keywords: community of practice, facilitators, self-study, action research
Procedia PDF Downloads 126
3046 Investigation of Poly P-Dioxanone as Promising Biodegradable Polymer for Short-Term Medical Application
Authors: Stefanie Ficht, Lukas Schübel, Magdalena Kleybolte, Markus Eblenkamp, Jana Steger, Dirk Wilhelm, Petra Mela
Abstract:
Although 3D printing as a transformative technology has become of increasing interest in the medical field and the demand for biodegradable polymers has grown considerably, there are only a few additively manufactured, biodegradable implants on the market. Additionally, the sterilization of such implants and its side effects on degradation have still not been sufficiently studied. Within this work, thermosensitive poly p-dioxanone (PPDO) samples were printed with fused filament fabrication (FFF) and investigated. Subsequently, H₂O₂ plasma and gamma radiation were used as low-temperature sterilization techniques and compared with each other and with a control group (no sterilization). In order to assess the effect of the different sterilization techniques on the degradation behavior of PPDO, the samples were immersed in phosphate-buffered saline (PBS) over 28 days, and surface morphology, thermal properties, molecular weight, inherent viscosity, and mechanical properties were examined at regular time intervals. The study demonstrates that PPDO was printed with great success and that thermal properties, molecular weight (Mw), and inherent viscosity (IV) were not significantly affected by the printing process itself. H₂O₂ plasma sterilization did not significantly harm the thermosensitive polymer, while gamma radiation lowered IV and Mw statistically significantly compared to the control group (p < 0.001). During immersion in PBS, a decrease in Mw and mechanical strength occurred for all samples. However, gamma-sterilized samples were affected to a much greater extent than the other two sample groups, in both final values and time course.
This was confirmed by scanning electron microscopy, which showed no changes in the surface morphology of the (non-sterilized) control samples; first microcracks appeared on plasma-sterilized samples after two weeks, whereas on gamma-sterilized samples they were present immediately after irradiation and deteriorated further over the immersion period. To conclude, we demonstrated that FFF and H₂O₂ plasma sterilization are well suited for processing thermosensitive, biodegradable polymers used for the development of innovative short-term medical applications.
Keywords: additive manufacturing, sterilization, biodegradable, thermosensitive, medical application
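The molecular-weight decline during PBS immersion is often summarized by a single chain-scission rate constant, assuming pseudo-first-order kinetics (ln Mw falling linearly with time). A minimal sketch of that fit in Python, using wholly hypothetical Mw values, not the study's measurements:

```python
import math

def degradation_rate(times_days, mw_values):
    """Least-squares fit of ln(Mw) = ln(Mw0) - k*t (pseudo-first-order
    chain scission); returns the rate constant k in 1/day."""
    y = [math.log(m) for m in mw_values]
    n = len(times_days)
    mt, my = sum(times_days) / n, sum(y) / n
    sxx = sum((t - mt) ** 2 for t in times_days)
    sxy = sum((t - mt) * (v - my) for t, v in zip(times_days, y))
    return -sxy / sxx  # k is the negative slope of ln(Mw) vs. t

# hypothetical sampling points over a 28-day immersion
t = [0, 7, 14, 21, 28]
mw_gamma = [60000, 42000, 30000, 21000, 15000]  # Mw roughly halves every 2 weeks
k_gamma = degradation_rate(t, mw_gamma)
```

Fitting each sample group this way would let the gamma-sterilized rate constant be compared directly with the plasma-sterilized and control ones.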
Procedia PDF Downloads 121
3045 The Regulation of Reputational Information in the Sharing Economy
Authors: Emre Bayamlıoğlu
Abstract:
This paper aims to provide an account of the legal and regulative aspects of algorithmic reputation systems, with a special emphasis on the sharing economy (i.e., Uber, Airbnb, Lyft) business model. The first section starts with an analysis of the legal and commercial nature of the tripartite relationship among the parties, namely the host platform, the individual sharers/service providers, and the consumers/users. The section further examines to what extent an algorithmic system of reputational information could serve as an alternative to legal regulation. Shortcomings are explained and analyzed with specific examples from the Airbnb platform, a pioneering success in the sharing economy. The following section focuses on the issue of governance and control of reputational information. It first analyzes the legal consequences of algorithmic filtering systems that detect undesired comments, and how a delicate balance could be struck between competing interests such as freedom of speech, privacy, and the integrity of commercial reputation. The third section deals with the problem of manipulation by users. Indeed, many sharing economy businesses employ certain techniques of data mining and natural language processing to verify the consistency of feedback. Software agents referred to as "bots" are employed by users to "produce" fake reputation values. Such automated techniques are deceptive, with significant negative effects that undermine the trust upon which the reputational system is built. The fourth section explores concerns regarding data mobility, data ownership, and privacy. Reputational information provided by consumers in the form of textual comments may be regarded as a work eligible for copyright protection. Algorithmic reputational systems also contain personal data pertaining to both the individual entrepreneurs and the consumers.
The final section starts with an overview of the notion of reputation as a communitarian and collective form of referential trust and provides an evaluation of the above legal arguments from the perspective of the public interest in the integrity of reputational information. The paper concludes with certain guidelines and design principles for algorithmic reputation systems to address the legal implications raised above.
Keywords: sharing economy, design principles of algorithmic regulation, reputational systems, personal data protection, privacy
Procedia PDF Downloads 465
3044 Synthesis of Low-Cost Porous Silicon Carbide Foams from Renewable Sources
Authors: M. A. Bayona, E. M. Cordoba, V. R. Guiza
Abstract:
Highly porous carbon-based foams are used in a wide range of industrial applications, including absorption, catalyst supports, thermal insulation, and biomaterials, among others. In particular, silicon carbide (SiC) based foams have shown exceptional potential for catalyst support applications due to their chemical inertness, large frontal area, low resistance to flow, low pressure drop, and high resistance to temperature and corrosion. These properties allow the use of SiC foams in harsh environments with high durability. Commonly, SiC foams are fabricated from polysiloxane, SiC powders, and phenolic resins, which can be costly or highly toxic to the environment. In this work, we propose a low-cost method for the fabrication of highly porous, three-dimensional SiC foams via the template replica technique, using recycled polymeric sponges as sacrificial templates. A sucrose-based resin combined with a Si-containing pre-ceramic polymer was used as the precursor. Polymeric templates were impregnated with the precursor solution, followed by thermal treatment at 1500 °C under an inert atmosphere. Several synthesis parameters, such as the viscosity and composition of the precursor solution (Si:sucrose molar ratio) and the porosity of the template, were evaluated in terms of their effect on the morphology, composition, and mechanical resistance of the resulting SiC foams. The synthesized composite foams exhibited a highly porous (50-90%) and interconnected structure, containing 30-90% SiC, with a mechanical compressive strength between 0.01 and 0.1 MPa.
The methodology employed here allowed the fabrication of foams with a varied concentration of SiC and with morphological and mechanical properties that contribute to the development of materials of high relevance in the industry, while using low-cost, renewable sources such as table sugar and providing a recycling alternative for polymeric sponges.
Keywords: catalyst support, polymer replica technique, reticulated porous ceramics, silicon carbide
Procedia PDF Downloads 123
3043 A Data-Driven Agent Based Model for the Italian Economy
Authors: Michele Catalano, Jacopo Di Domenico, Luca Riccetti, Andrea Teglio
Abstract:
We develop a data-driven agent-based model (ABM) for the Italian economy. We calibrate the model for the initial condition and parameters. As a preliminary step, we replicate the Monte-Carlo simulation for the Austrian economy. Then, we evaluate the dynamic properties of the model: the long-run equilibrium and the allocative efficiency in terms of disequilibrium patterns arising in the search and matching process for the final goods, capital, intermediate goods, and credit markets. In this perspective, we use a randomized initial condition approach. We perform a robustness analysis, perturbing the system under different parameter setups. We explore the empirical properties of the model using a rolling window forecast exercise from 2010 to 2022 to observe the model’s forecasting ability in the wake of the COVID-19 pandemic. We analyze the properties of the model with different numbers of agents, that is, with different scales of the model compared to the real economy. The model generally displays transient dynamics that fit macroeconomic data well in terms of forecasting ability. We stress the model with a large set of shocks, namely interest-rate policy, fiscal policy, and exogenous factors such as external foreign demand for exports. In this way, we can identify the most exposed sectors of the economy. Finally, we modify the technology mix of the various sectors and, consequently, the underlying input-output sectoral interdependence, to stress the economy and observe the long-run projections. In this way, the model can generate endogenous crises driven by the implied structural change, technological unemployment, and a potential lack of aggregate demand, creating the conditions for cyclical endogenous crises reproduced in this artificial economy.
Keywords: agent-based models, behavioral macro, macroeconomic forecasting, micro data
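The rolling window forecast exercise can be sketched generically: re-fit on a fixed window of past observations, predict one step ahead, then slide the window forward. In the sketch below a naive window-mean "model" stands in for the calibrated ABM, and the window length and growth series are illustrative assumptions only:

```python
def rolling_forecast(series, window, fit_predict):
    """Rolling-window one-step-ahead forecasts: at each step the model is
    re-fit on only the most recent `window` observations."""
    return [fit_predict(series[t - window:t]) for t in range(window, len(series))]

def naive_mean(history):
    # stand-in for the calibrated ABM: predict the window average
    return sum(history) / len(history)

# illustrative quarterly growth series ending in a COVID-style shock and rebound
growth = [0.5, 0.8, -0.2, 1.1, 0.9, -9.0, 6.5]
preds = rolling_forecast(growth, 3, naive_mean)
errors = [p - a for p, a in zip(preds, growth[3:])]
```

Comparing the forecast errors around the shock quarter is exactly where such an exercise reveals (or fails to reveal) a model's out-of-sample robustness.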
Procedia PDF Downloads 69
3042 Object-Scene: Deep Convolutional Representation for Scene Classification
Authors: Yanjun Chen, Chuanping Hu, Jie Shao, Lin Mei, Chongyang Zhang
Abstract:
Traditional image classification is based on an encoding scheme (e.g., Fisher Vector, Vector of Locally Aggregated Descriptors) with low-level image features (e.g., SIFT, HoG). Compared to these low-level local features, deep convolutional features obtained at the mid-level layers of convolutional neural networks (CNN) carry richer information but lack geometric invariance. For scene classification, scenes contain scattered objects of varying size, category, layout, and number. It is crucial to find the distinctive objects in a scene as well as their co-occurrence relationships. In this paper, we propose a method that takes advantage of both deep convolutional features and the traditional encoding scheme while taking object-centric and scene-centric information into consideration. First, to exploit the object-centric and scene-centric information, two CNNs trained on the ImageNet and Places datasets, respectively, are used as pre-trained models to extract deep convolutional features at multiple scales. This produces dense local activations. By analyzing the performance of different CNNs at multiple scales, it is found that each CNN works better in different scale ranges. A scale-wise CNN adaptation is reasonable, since objects in a scene appear at their own specific scales. Second, a Fisher kernel is applied to aggregate a global representation at each scale, and the per-scale representations are then merged into a single vector using a post-processing method called scale-wise normalization. The essence of the Fisher Vector lies in the accumulation of the first- and second-order differences. Hence, scale-wise normalization followed by average pooling balances the influence of each scale, since a different number of features is extracted at each scale. Third, the Fisher Vector representation based on the deep convolutional features is followed by a linear Support Vector Machine, which is a simple yet efficient way to classify the scene categories.
Experimental results show that scale-specific feature extraction and normalization with CNNs trained on object-centric and scene-centric datasets can boost the results from 74.03% up to 79.43% on MIT Indoor67 when only two scales are used (compared to results at a single scale). The result is comparable to state-of-the-art performance, which shows that the representation can be applied to other visual recognition tasks.
Keywords: deep convolutional features, Fisher Vector, multiple scales, scale-specific normalization
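The scale-wise normalization followed by average pooling can be shown in miniature. A sketch in plain Python, with toy 4-D vectors standing in for the per-scale Fisher Vector aggregates (real Fisher Vectors are far higher-dimensional):

```python
import math

def l2_normalize(v):
    """Rescale a vector to unit L2 norm (the scale-wise normalization step)."""
    norm = math.sqrt(sum(x * x for x in v))
    return [x / norm for x in v] if norm > 0 else list(v)

def merge_scales(per_scale_vectors):
    """Normalize each scale's aggregated representation, then average-pool,
    so that scales yielding more features do not dominate the merged vector."""
    normalized = [l2_normalize(v) for v in per_scale_vectors]
    dim = len(normalized[0])
    return [sum(v[i] for v in normalized) / len(normalized) for i in range(dim)]

# toy per-scale aggregates (three scales, 4-D)
scales = [[1.0, 0.0, 0.0, 0.0], [3.0, 4.0, 0.0, 0.0], [0.0, 0.0, 2.0, 0.0]]
merged = merge_scales(scales)
```

Without the per-scale normalization, the second scale (norm 5) would dominate the merged vector; after it, all three scales contribute equally.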
Procedia PDF Downloads 331
3041 Effect of Doping on Band Gap of Zinc Oxide and Degradation of Methylene Blue and Industrial Effluent
Authors: V. P. Borker, K. S. Rane, A. J. Bhobe, R. S. Karmali
Abstract:
The effluent of dye industries contains chemicals and organic dyes. Sometimes it is discharged into water bodies without any treatment. This leads to environmental pollution and is detrimental to flora and fauna. The semiconducting oxide zinc oxide, with a wide bandgap of 3.37 eV, is used as a photocatalyst for degrading organic dyes under UV radiation, as it generates electron-hole pairs on exposure to UV light. If degradation under solar radiation is desired, the bandgap of zinc oxide must be reduced so as to utilize visible radiation. Thus, in the present study, zinc oxide, ZnO, is synthesized from zinc oxalate; N-doped zinc oxide, ZnO₁₋ₓNₓ, from hydrazinated zinc oxalate; and cadmium-doped zinc oxide, Zn₀.₉Cd₀.₁O, and magnesium-doped zinc oxide, Zn₀.₉Mg₀.₁O, from mixed metal oxalate and hydrazinated mixed metal oxalate. The precursors were characterized by FTIR. They were decomposed to form the oxides, and XRD patterns were recorded. The compounds were monophasic. The bandgap was calculated using the diffuse reflectance spectrum. The bandgap of ZnO was reduced to 3.24 eV because of the precursor method of synthesis, which leads to a large surface area. The bandgap of Zn₀.₉Cd₀.₁O was 3.11 eV and that of Zn₀.₉Mg₀.₁O was 3.41 eV. The lowest value, 3.09 eV, was that of ZnO₁₋ₓNₓ. These oxides were used to degrade methylene blue, a model dye, in sunlight. ZnO₁₋ₓNₓ was also used to degrade the effluent of an industry manufacturing colours, crayons, and markers. It was observed that ZnO₁₋ₓNₓ acts as a good photocatalyst for the degradation of methylene blue: it can degrade the solution within 120 minutes. Similarly, diluted effluent was decolourised using this oxide, and some colours were degraded using ZnO. Thus, the use of these two oxides could mineralize the effluent. A smaller bandgap leads to more electron-hole pairs and thus promotes the formation of hydroxyl radicals. These radicals attack the dye molecule; fragmentation takes place, and the dye is mineralised.
Keywords: cadmium doped zinc oxide, dye degradation, dye effluent degradation, N doped zinc oxide, zinc oxide
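Bandgap estimation from a diffuse reflectance spectrum commonly proceeds through the Kubelka-Munk function and a Tauc plot. The sketch below uses made-up reflectance values and a deliberately crude two-point extrapolation of the absorption edge; a real analysis fits the full linear region of the Tauc plot:

```python
def kubelka_munk(r):
    """Kubelka-Munk function F(R) = (1 - R)^2 / (2R), proportional to the
    absorption coefficient of a diffusely reflecting powder."""
    return (1.0 - r) ** 2 / (2.0 * r)

def tauc_bandgap(energies_ev, reflectances, n=0.5):
    """Crude bandgap estimate: form (F(R)*hv)^(1/n) vs. photon energy hv
    (n = 1/2 for a direct-allowed transition) and extrapolate a line
    through the last two points of the absorption edge to the hv axis."""
    y = [(kubelka_munk(r) * e) ** (1.0 / n) for e, r in zip(energies_ev, reflectances)]
    (e1, y1), (e2, y2) = (energies_ev[-2], y[-2]), (energies_ev[-1], y[-1])
    slope = (y2 - y1) / (e2 - e1)
    return e2 - y2 / slope  # hv-axis intercept = estimated bandgap

# hypothetical absorption edge (not the paper's measured spectrum)
hv = [3.0, 3.1, 3.2, 3.3]        # photon energies, eV
refl = [0.90, 0.70, 0.40, 0.20]  # reflectance drops across the edge
eg = tauc_bandgap(hv, refl)
```

With these toy values the intercept falls just below the steepest part of the edge, as expected for a direct-gap oxide.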
Procedia PDF Downloads 168
3040 Experimental Study of Complete Loss of Coolant Flow (CLOF) Test by System–Integrated Modular Advanced Reactor Integral Test Loop (SMART-ITL) with Passive Residual Heat Removal System (PRHRS)
Authors: Jin Hwa Yang, Hwang Bae, Sung Uk Ryu, Byong Guk Jeon, Sung Jae Yi, Hyun Sik Park
Abstract:
Experimental studies using a large-scale thermal-hydraulic integral test facility, the System–integrated Modular Advanced Reactor Integral Test Loop (SMART-ITL), have been carried out to validate the performance of the prototype, SMART. After the Fukushima accident, passive safety systems have been regarded as important design features for ensuring nuclear safety. One of the scenarios of concern for evaluating the passive safety system is a Complete Loss of Coolant Flow (CLOF). The coolant flow rate in the primary system is maintained by the Reactor Coolant Pumps (RCPs). When the electric power supply to the RCPs is cut off, the coolant flow rate decreases sharply and the coolant temperature increases rapidly. Therefore, the reactor trip signal is activated to prevent over-heating of the core. In this situation, the Passive Residual Heat Removal System (PRHRS) plays a significant role in assuring the soundness of SMART. The PRHRS, which uses two-phase natural circulation, is a passive safety system in SMART that removes the heat of the steam generator in the secondary system through a heat exchanger submerged in the Emergency Cooling Tank (ECT). As the RCPs continue to coast down, inherent natural circulation in the primary system transfers heat to the secondary system, and the transferred heat is removed by the PRHRS. In this paper, the progress of the CLOF accident is described with experimental data from the transient test performed with SMART-ITL. Finally, the capability of the passive safety system and the inherent natural circulation are evaluated.
Keywords: CLOF, natural circulation, PRHRS, SMART-ITL
Procedia PDF Downloads 438
3039 Effectiveness of Cold Calling on Students’ Behavior and Participation during Class Discussions: Punishment or Opportunity to Shine
Authors: Maimuna Akram, Khadija Zia, Sohaib Naseer
Abstract:
Pedagogical objectives and the nature of the course content may lead instructors to take varied approaches to selecting a student for the cold call, specifically in a studio setup where students work on different projects independently and present work in progress from time to time at scheduled critiques. Cold calling often proves to be an effective tool for eliciting a response without enforcing judgment on the recipients. Students who are cold-called exhibit a mixed range of behaviors, and their responses can range from anxiety-provoking to inspiring; a greater understanding is needed of how to use these exchanges to bring about fruitful and engaging outcomes in studio discussions. This study aims to unravel the dimensions of utilizing the cold-call approach in a didactic exchange within studio pedagogy. A questionnaire survey was conducted in an undergraduate class at an arts and design school. The impact of cold calling on students’ participation was examined through various parameters, including course choice, participation frequency, students’ comfort, and teaching methodology. After the surveys were analyzed, selected classroom teachers were interviewed to provide a qualitative perspective from the faculty. It was concluded that cold calling increases students’ participation frequency and also increases preparation for class. Around 67% of students responded that teaching methods play an important role in learning activities and students’ participation during class discussions, and 84% of participants agreed that cold calling is an effective way of learning. The research indicates that cold calling can be used frequently without making students uncomfortable. As a result, the findings of this study support the use of this instructional method to encourage more students to participate in class discussions.
Keywords: active learning, class discussion, class participation, cold calling, pedagogical methods, student engagement
Procedia PDF Downloads 37
3038 Improved Regression Relations Between Different Magnitude Types and the Moment Magnitude in the Western Balkan Earthquake Catalogue
Authors: Anila Xhahysa, Migena Ceyhan, Neki Kuka, Klajdi Qoshi, Damiano Koxhaj
Abstract:
The seismic event catalogue has been updated in the framework of a bilateral project supported by the Central European Investment Fund and with the extensive support of the Global Earthquake Model Foundation to update Albania's national seismic hazard model. The earthquake catalogue prepared within this project covers the Western Balkan area limited by 38.0°-48.0°N, 12.5°-24.5°E and includes 41,806 earthquakes that occurred in the region between 510 BC and 2022. Since the moment magnitude characterizes earthquake size accurately and the ground motion prediction equations selected for the seismic hazard assessment employ this scale, it was chosen as the uniform magnitude scale for the catalogue. Therefore, proxy values of moment magnitude had to be obtained by deriving new magnitude conversion equations between the local and other magnitude types and this unified scale. The Global Centroid Moment Tensor Catalogue was considered the most authoritative source of moment magnitude reports for moderate to large earthquakes; hence it was used as a reference for calibrating other sources. The best fit was observed in comparisons with some regional agencies, whereas differences were observed in all magnitude ranges against the moment magnitudes reported from Italy, Greece, and Turkey. For teleseismic magnitudes, to account for the non-linearity of the relationships, we used the exponential model for the derivation of the regression equations. The obtained regressions for the surface-wave magnitude and short-period body-wave magnitude show considerable differences from the Global Earthquake Model regression curves, especially at low magnitude ranges. Moreover, a conversion relation was obtained between the local magnitude of Albania and the corresponding moment magnitude as reported by the global and regional agencies. As errors were present in both variables, Deming regression was used.
Keywords: regression, seismic catalogue, local magnitude, tele-seismic magnitude, moment magnitude
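Because both the local magnitudes and the reference moment magnitudes carry measurement error, an ordinary least-squares fit would bias the slope; Deming regression accounts for errors in both variables. A minimal sketch, where delta is the assumed ratio of error variances and the magnitude pairs are illustrative rather than catalogue data:

```python
import math

def deming(x, y, delta=1.0):
    """Deming regression of y on x with errors in both variables;
    delta = var(error in y) / var(error in x). Returns (intercept, slope)."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxx = sum((xi - mx) ** 2 for xi in x) / (n - 1)
    syy = sum((yi - my) ** 2 for yi in y) / (n - 1)
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y)) / (n - 1)
    slope = (syy - delta * sxx
             + math.sqrt((syy - delta * sxx) ** 2 + 4 * delta * sxy ** 2)) / (2 * sxy)
    return my - slope * mx, slope

# illustrative local-magnitude / moment-magnitude pairs (not catalogue data)
ml = [3.0, 3.5, 4.0, 4.5, 5.0]
mw = [3.2, 3.6, 4.1, 4.7, 5.1]
a, b = deming(ml, mw)
mw_proxy = a + b * 4.2  # proxy Mw for a local magnitude of 4.2
```

With delta = 1 this reduces to orthogonal regression; in practice delta would be set from the reported magnitude uncertainties of the two agencies being compared.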
Procedia PDF Downloads 70
3037 Development of Electric Generator and Water Purifier Cart
Authors: Luisito L. Lacatan, Gian Carlo J. Bergonia, Felipe C. Buado III, Gerald L. Gono, Ron Mark V. Ortil, Calvin A. Yap
Abstract:
This paper features the development of a mobile self-sustaining electricity generator for a water distillation process, with an MCU-based wireless controller and indicator, designed to address the scarcity of clean water. Pure water is precious nowadays, and its value is greatest to those who do not have it. There are many water filtration products in existence today; however, none of them fully satisfies the needs of families needing clean drinking water: all of these products require either large sums of money or extensive maintenance, and some do not even come with a guarantee of potable water. The proposed project was designed to alleviate the scarcity of potable water in the country, and part of the purpose was to identify the problems or loopholes of the project, such as the distance and speed required to produce electricity using a wheel and alternator, the time required for the heating element to heat up, the capacity of the battery to maintain the heat of the heating element, and the time required for the boiler to produce clean and potable water. The project has three parts. The first part included the researchers’ effort to plan every part of the project, from the conversion of mechanical energy to electrical energy and the purification of water into potable drinking water, to the controller and indicator of the project using a microcontroller unit (MCU). This included identifying the problems encountered and possible solutions to prevent and avoid errors. Gathering and reviewing related studies helped the researchers reduce and prevent problems before they could be encountered. It also included the price and quantity of materials used, to control the budget.
Keywords: mobile, self-sustaining, electricity generator, water distillation, wireless battery indicator, wireless water level indicator
Procedia PDF Downloads 310
3036 The Nexus Between the Rise of Autocratisation and the Deeper Level of BRI Engagement
Authors: Dishari Rakshit, Mitchell Gallagher
Abstract:
The global landscape is witnessing a disconcerting surge in democratic backsliding, engendering concerns over the rise of autocratisation. This research examines the intricate relationship between a nation's domestic propensity for autocratic governance and its trade relations with China. Giving prominence to Belt and Road Initiative (BRI) investments, this study adopts a rigorous neorealist framework to discern the complexities of nations' economic interests amidst an anarchic milieu and how these interests may transcend steadfast adherence to democratic principles. The burgeoning bipolarity in the international political setting serves as a backdrop to our inquiry. To operationalise our hypothesis, we conduct a large-scale 'N' study encompassing a comprehensive global dataset comprising countries' democracy indicators, total trade volume with China, and cumulative Chinese BRI investments over a substantial temporal expanse. By meticulously examining BRI signatories, we aim to ascertain the potential accentuation of democratic backsliding among these nations. To test our empirical underpinning, we validate our findings through cogent case studies. Our analysis adds to the scholarship on the multifaceted interactions between trade dynamics and democratic governance within the fabric of the international political landscape. In its culmination, the paper addresses the question: has the erstwhile grandeur of bipolarity resurfaced in the contemporary global panorama? Concurrently, we explore whether the ascendant wave of autocratisation is a by-product of the Beijing Consensus. Pertinent to policymakers, our discoveries stand poised to furnish a comprehensive grasp of the manifold implications arising from deepening entanglements with China under the auspices of the BRI.
Keywords: democracy, autocracy, China, Belt and Road Initiative, international political economy
Procedia PDF Downloads 71
3035 Identifying Strategies and Techniques for the Egyptian Medium and Large Size Contractors to Respond to Economic Hardship
Authors: Michael Salib, Samer Ezeldin, Ahmed Waly
Abstract:
There are numerous challenges and problems facing the construction industry in several countries in the Middle East as a result of various economic and political factors. In Egypt, for example, several construction companies have shut down and left the market since 2016. These companies closed because they did not respond with the suitable techniques and strategies that would have enabled them to survive during this period of economic turmoil. This research was conducted to identify adequate strategies that Egyptian contractors can implement to survive and keep competing during such an economic hardship period. Two different techniques were used to identify these strategies. First, in-depth research was conducted on companies located in countries that suffered similar economic hardship, to identify the strategies they used in order to survive. Second, interviews were conducted with experts in the construction field in order to list the effective strategies that allowed them to survive. Moreover, at the end of each interview, the experts were asked to rate the applicability of the previously identified strategies used in foreign countries, and then the efficiency of each strategy if used in Egypt. A framework model is developed to assist construction companies in choosing the techniques suitable to their company size, by identifying the top-ranked strategies and techniques that should be adopted based on the parameters given to the model. In order to verify this framework, the financial statements of two leading companies in the Egyptian construction market were studied. The first contractor has applied nearly all the top-ranked strategies identified in this paper, while the other contractor has applied only a few of them. Finally, further expert interviews were conducted in order to validate the framework.
These experts were asked to test the model and rate its applicability and effectiveness through a questionnaire.
Keywords: construction management, economic hardship, recession, survive
Procedia PDF Downloads 126
3034 A User Interface for Easiest Way Image Encryption with Chaos
Authors: D. López-Mancilla, J. M. Roblero-Villa
Abstract:
Since 1990, research on chaotic dynamics has received considerable attention, particularly in light of potential applications of this phenomenon in secure communications. Data encryption using chaotic systems was reported in the 1990s as a new approach to signal encoding that differs from the conventional methods that use numerical algorithms as the encryption key. Algorithms for image encryption have received a lot of attention because of the need for secure image transmission in real time over the internet and wireless networks. Known algorithms for image encryption, like the Data Encryption Standard (DES), have the drawback of low efficiency when the image is large. Encryption based on chaos offers a new and efficient way to achieve fast and highly secure image encryption. In this work, a user interface for image encryption and a simple way to encrypt images using chaos are presented. The main idea is to reshape any image into an n-dimensional vector and combine it with a vector extracted from a chaotic system, in such a way that the image vector is hidden within the chaotic vector. Once this is done, an array with the original dimensions of the image is formed and reshaped back. The security of the encryption is evaluated with statistical analysis of the encrypted images, and an optimization stage is used to improve the security of the image encryption while, at the same time, allowing the image to be accurately recovered. The user interface uses the algorithms designed for the encryption of images, allowing the user to read an image from the hard drive or another external device. The interface encrypts the image in one of three encryption modes, given by three different chaotic systems from which the user can choose. Once the image is encrypted, it is possible to view the security analysis and save the image to the hard disk.
The main results of this study show that this simple method of encryption, using the optimization stage, achieves encryption security competitive with the more complicated encryption methods used in other works. In addition, the user interface allows the user to encrypt an image with chaos and to transmit it through any public communication channel, including the internet.
Keywords: image encryption, chaos, secure communications, user interface
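The core idea (reshape the image to a vector, hide it within a chaotic vector, reshape back) can be illustrated with the logistic map as the chaotic system. This is our own minimal sketch: the map parameters and the XOR byte-mixing rule are illustrative choices, not the paper's exact algorithm or its optimization stage:

```python
def logistic_sequence(x0, r, n):
    """Iterate the logistic map x_{k+1} = r * x_k * (1 - x_k), chaotic for r near 4."""
    seq, x = [], x0
    for _ in range(n):
        x = r * x * (1.0 - x)
        seq.append(x)
    return seq

def chaos_xor(pixels, x0=0.3141, r=3.99):
    """Combine a flattened image with a chaotic key stream: each pixel byte is
    XOR-ed with a byte derived from one logistic-map iterate. Applying the same
    function twice with the same (x0, r) recovers the original pixels."""
    key = [int(v * 255.0) & 0xFF for v in logistic_sequence(x0, r, len(pixels))]
    return [p ^ k for p, k in zip(pixels, key)]

image = [[10, 200], [33, 97]]             # toy 2x2 grayscale image
flat = [p for row in image for p in row]  # reshape to a vector
cipher = chaos_xor(flat)                  # hide it within the chaotic vector
restored = chaos_xor(cipher)              # same key stream, so XOR inverts itself
```

The initial condition (x0, r) acts as the secret key; a statistical analysis of `cipher`, as in the paper, would check that pixel histograms and correlations are flattened.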
Procedia PDF Downloads 490
3033 Evaluation the Influence of Trunk Bracing in Joint Contact Forces in Subjects with Scoliosis
Authors: Azadeh Jafari, Mohammad Taghi Karimi, Azadeh Nadi
Abstract:
Background: Scoliosis is a lateral curvature of the spine which may influence the abilities of subjects during standing and walking. Most scoliotic subjects use an orthosis to reduce the curve and to decrease the risk of curve progression. There was a lack of information regarding the effects of orthoses on kinematics and joint contact forces; this research was therefore conducted to highlight those effects. Method: Five scoliotic subjects were recruited in this study, each with a single curve of less than 40° (females, age 13.2 ± 1.7 years). They were asked to walk with and without the orthosis. Joint kinematics, the forces applied on the legs, the moments transmitted through the joints, and the joint contact forces were evaluated in this study. Moreover, muscle lengths were determined using the computed muscle control approach in OpenSim. Results: There was a significant difference between the second peaks of the vertical ground reaction force while walking with and without the orthosis (p-value < 0.05). There was no difference in spatiotemporal gait parameters between walking with and without the orthosis (p-value > 0.05). The mean values of the joint contact forces (vertical component) increased with the use of the orthosis, but the difference was not significant (p-value > 0.05). Conclusion: Although the kinematics of most body joints were not influenced by the use of the orthosis, the joint contact forces may be increased by it. The increase in joint contact force may be due to the orthosis restricting pelvic motion and increasing the compensatory mechanisms used by the subjects to decrease its side effects.
Keywords: scoliosis, joint contact force, kinetic, kinematic
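The with/without-orthosis comparisons are paired, since the same five subjects walked in both conditions, so a paired t statistic is the natural test behind the reported p-values. A minimal sketch with hypothetical second-peak vertical GRF values (% body weight), not the study's data:

```python
import math
import statistics

def paired_t(without, with_orthosis):
    """Paired t statistic: mean of per-subject differences divided by the
    standard error of those differences (df = n - 1)."""
    diffs = [w - wo for wo, w in zip(without, with_orthosis)]
    n = len(diffs)
    return statistics.mean(diffs) / (statistics.stdev(diffs) / math.sqrt(n))

# hypothetical second peak of vertical GRF (% body weight) for five subjects
grf_without = [108.0, 111.5, 106.2, 110.1, 109.4]
grf_with = [104.5, 108.0, 103.9, 106.0, 105.6]
t_stat = paired_t(grf_without, grf_with)
```

With n = 5, |t| must exceed about 2.776 (two-tailed, df = 4) for significance at the 5% level, which is the threshold the reported p < 0.05 corresponds to.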
Procedia PDF Downloads 210