
Keyword Search Results

[Keyword] internet of things (57 hits)

1-20 of 57 hits

  • Real-Time Monitoring Systems That Provide M2M Communication between Machines Open Access

    Ya ZHONG  

     
    PAPER-Language, Thought, Knowledge and Intelligence

      Publicized:
    2023/10/17
      Vol:
    E107-A No:7
      Page(s):
    1019-1026

    Artificial intelligence and Internet of Things technologies have benefited from technological advances and new automated computer systems, and it is now possible to integrate them into a single offline industrial system. This is accomplished through machine-to-machine communication, which removes the human factor. The purpose of this article is to examine security systems for machine-to-machine communication that rely on identification and authentication algorithms for real-time monitoring. The article investigates security methods for quickly resolving data processing issues by using the Security Operations Center’s main machine to identify and authenticate devices across 19 different machines. The results indicate that when machines run offline and perform various tasks, both the individual machines and the system as a whole can be exposed to data leaks and malware attacks. The study examines the operation of 19 computers, 7 of which were subjected to data leakage and malware attacks. AnyLogic software is used to create visual representations of the results using wireless networks and algorithms based on previously processed methods. The W76S is used as a protective element within intelligent sensors due to its built-in memory protection. For 4 machines, the data leakage time under malware attack was 70 s; for 10 machines, the duration was 150 s with 3 attacks. Machine 15 had an attack duration of 190 s involving 6 malware attacks, while machine 19 had an attack duration of 200 s involving 7 malware attacks. The highest figures indicate that attempting to hack the system increases the risk of damaging a device, potentially causing the entire system of connected devices to fail. Thus, illegal malware attacks can be identified over time, and their effects on data processing can be prevented by intelligent control. The results reveal that applying identification and authentication methods through a protocol increases cyber-physical system security while also allowing real-time monitoring of offline system security.

  • Thermoelectric Effect of Ga-Sn-O Thin Films for Internet-of-Things Application

    Yuhei YAMAMOTO  Naoki SHIBATA  Tokiyoshi MATSUDA  Hidenori KAWANISHI  Mutsumi KIMURA  

     
    BRIEF PAPER-Electronic Materials

      Publicized:
    2023/07/10
      Vol:
    E107-C No:1
      Page(s):
    18-21

    The thermoelectric effect of Ga-Sn-O (GTO) thin films has been investigated for Internet-of-Things applications. It is found that amorphous GTO thin films provide higher power factors (PF) than polycrystalline ones, because grain boundaries block electron conduction in the polycrystalline films. It is also found that GTO thin films annealed in vacuum provide a higher PF than those annealed in air, because oxygen vacancies are terminated by annealing in air. The PF and dimensionless figure of merit (ZT) are not outstanding, but the cost effectiveness is excellent, which is the most important factor for some Internet-of-Things applications.
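    For reference, the power factor and dimensionless figure of merit cited above follow the standard thermoelectric definitions (these formulas are not restated in the abstract), where S is the Seebeck coefficient, σ the electrical conductivity, κ the thermal conductivity, and T the absolute temperature:

        PF = S^{2}\sigma, \qquad ZT = \frac{S^{2}\sigma T}{\kappa}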

  • Content Search Method Utilizing the Metadata Matching Characteristics of Both Spatio-Temporal Content and User Request in the IoT Era

    Shota AKIYOSHI  Yuzo TAENAKA  Kazuya TSUKAMOTO  Myung LEE  

     
    PAPER-Network System

      Publicized:
    2023/10/06
      Vol:
    E107-B No:1
      Page(s):
    163-172

    Cross-domain data fusion is becoming a key driver of the growth of numerous and diverse applications in the Internet of Things (IoT) era. We have proposed the concept of a new information platform, the Geo-Centric Information Platform (GCIP), that enables IoT data fusion based on geolocation, i.e., produces spatio-temporal content (STC), and then provides the STC to users. In this environment, users cannot know in advance “when,” “where,” or “what type” of STC is being generated, because the type and timing of STC generation vary dynamically with the diversity of IoT data generated in each geographical area. This makes it difficult to directly search for a specific STC requested by the user using a content identifier (the domain name of a URI or a content name). To solve this problem, a new content discovery method that does not directly specify content identifiers is needed, one that takes into account (1) spatial and (2) temporal constraints. In our previous study, we proposed a content discovery method that considered only spatial constraints and ignored temporal constraints. This paper proposes a new content discovery method that matches user requests with content metadata (topic) characteristics while taking both spatial and temporal constraints into account. Simulation results show that the proposed method successfully discovers appropriate STC in response to a user request.
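    As a concrete illustration of the kind of matching described above, the following minimal Python sketch filters spatio-temporal content by a spatial bounding box, a temporal window, and topic overlap with the user request. All class names, fields, and the scoring rule are illustrative assumptions, not the GCIP implementation.

    # Illustrative sketch only: matching spatio-temporal content (STC) to a user
    # request under spatial and temporal constraints plus topic similarity.
    # Field names and the scoring rule are assumptions, not the GCIP design.
    from dataclasses import dataclass

    @dataclass
    class STC:
        lat: float
        lon: float
        created_at: float      # epoch seconds
        topics: set            # metadata topics, e.g. {"traffic", "camera"}

    @dataclass
    class UserRequest:
        lat: float
        lon: float
        radius_deg: float      # spatial constraint (simplified to degrees)
        time_window: float     # temporal constraint in seconds
        now: float
        topics: set

    def match(req: UserRequest, contents: list) -> list:
        """Return contents satisfying both constraints, ranked by topic overlap."""
        hits = []
        for c in contents:
            in_space = (abs(c.lat - req.lat) <= req.radius_deg and
                        abs(c.lon - req.lon) <= req.radius_deg)
            in_time = (req.now - c.created_at) <= req.time_window
            if in_space and in_time:
                overlap = len(c.topics & req.topics)
                if overlap > 0:
                    hits.append((overlap, c))
        return [c for _, c in sorted(hits, key=lambda t: -t[0])]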

  • Nonvolatile Storage Cells Using FiCC for IoT Processors with Intermittent Operations

    Yuki ABE  Kazutoshi KOBAYASHI  Jun SHIOMI  Hiroyuki OCHI  

     
    PAPER

      Publicized:
    2023/04/13
      Vol:
    E106-C No:10
      Page(s):
    546-555

    Energy harvesting has been widely investigated as a potential solution for supplying power to Internet of Things (IoT) devices. Such computing devices must operate intermittently rather than continuously, because harvested energy is unstable and some IoT applications are periodic. Therefore, processors for IoT devices with intermittent operation must feature a zero-standby-power hibernation mode in addition to an energy-efficient normal mode. In this paper, we describe the layout design and measurement results of a nonvolatile standard cell memory (NV-SCM) and nonvolatile flip-flops (NV-FF) that use a nonvolatile memory based on a Fishbone-in-Cage Capacitor (FiCC), suitable for IoT processors with intermittent operation. They can be fabricated in any conventional CMOS process without any additional mask. The NV-SCM and NV-FF were fabricated in a 180 nm CMOS process technology. The area overheads due to nonvolatility of a bit cell are 74% for the NV-SCM and 29% for the NV-FF, respectively. We confirmed full functionality of the NV-SCM and NV-FF. Simulation shows that a nonvolatile system using the proposed NV-SCM and NV-FF can reduce energy consumption by 24.3% compared to a volatile system when the hibernation/normal operation time ratio is 500.

  • Few-Shot Learning-Based Malicious IoT Traffic Detection with Prototypical Graph Neural Networks

    Thin Tharaphe THEIN  Yoshiaki SHIRAISHI  Masakatu MORII  

     
    PAPER

      Publicized:
    2023/06/22
      Vol:
    E106-D No:9
      Page(s):
    1480-1489

    With a rapidly escalating number of sophisticated cyber-attacks, protecting Internet of Things (IoT) networks against unauthorized activity is a major concern. Detecting malicious attack traffic is thus crucial for IoT security. However, existing malicious traffic detection systems that rely on supervised machine learning need a considerable number of benign and malware traffic samples to train the models. Moreover, in the case of zero-day attacks, only a few labeled traffic samples are available for analysis. To deal with this, we propose a few-shot malicious IoT traffic detection system based on a prototypical graph neural network. The proposed approach does not require prior knowledge of network payload binaries or network traffic signatures. The model is trained on labeled traffic data and tested to evaluate its ability to detect new types of attacks when only a few labeled traffic samples are available. The proposed detection system first organizes the network traffic into bidirectional flows and visualizes each binary traffic flow as a color image. A neural network is then applied to the visualized traffic to extract important features. After that, using the proposed few-shot graph neural network approach, the model is trained on different few-shot tasks so that it generalizes to new, unseen attacks. The proposed model is evaluated on a network traffic dataset consisting of benign traffic and traffic corresponding to six types of attacks. The results reveal that our model achieves F1 scores of 0.91 and 0.94 in 5-shot and 10-shot classification, respectively, outperforming the baseline models.
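    The prototypical classification step on which such few-shot approaches are built can be sketched as follows (a minimal numpy illustration; the embedding network over the visualized traffic flows, the shapes, and the toy data are assumptions, not the paper's implementation).

    # Minimal sketch of prototypical few-shot classification: class prototypes are
    # the mean support embeddings, and a query is assigned to the nearest prototype.
    # The embedding network (CNN/GNN over visualized flows) is assumed upstream.
    import numpy as np

    def prototypes(support_emb: np.ndarray, support_lbl: np.ndarray) -> np.ndarray:
        """support_emb: (N, D) embeddings, support_lbl: (N,) integer class labels."""
        classes = np.unique(support_lbl)
        return np.stack([support_emb[support_lbl == c].mean(axis=0) for c in classes])

    def classify(query_emb: np.ndarray, protos: np.ndarray) -> np.ndarray:
        """Assign each query embedding (M, D) to the class of the nearest prototype."""
        d = ((query_emb[:, None, :] - protos[None, :, :]) ** 2).sum(axis=-1)
        return d.argmin(axis=1)

    # Toy 5-shot example with 3 classes and 8-dimensional embeddings.
    rng = np.random.default_rng(0)
    sup = rng.normal(size=(15, 8)); lbl = np.repeat(np.arange(3), 5)
    qry = rng.normal(size=(4, 8))
    print(classify(qry, prototypes(sup, lbl)))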

  • Intrusion Detection Model of Internet of Things Based on LightGBM Open Access

    Guosheng ZHAO  Yang WANG  Jian WANG  

     
    PAPER-Fundamental Theories for Communications

      Publicized:
    2023/02/20
      Vol:
    E106-B No:8
      Page(s):
    622-634

    Internet of Things (IoT) devices are widely used in various fields. However, their limited computing resources make them extremely vulnerable and difficult to protect effectively. Traditional intrusion detection systems (IDS) focus on high accuracy and a low false alarm rate (FAR), so they often have too high a spatiotemporal complexity to be deployed on IoT devices. In response, this paper proposes an intrusion detection model for IoT based on the light gradient boosting machine (LightGBM). First, a one-dimensional convolutional neural network (CNN) is used to extract features from network traffic and reduce the feature dimensionality. Then, LightGBM is used for classification to detect the type of traffic. LightGBM inherits the advantages of gradient boosting trees while being more lightweight and having a faster decision tree construction process. Experiments on the TON-IoT and BoT-IoT datasets show that the proposed model is both stronger in performance and more lightweight than the comparison models. The proposed model shortens the prediction time by 90.66% and outperforms the comparison models in accuracy and other performance metrics. It also has strong detection capability for denial of service (DoS) and distributed denial of service (DDoS) attacks. Experimental results on a testbed built with IoT devices such as a Raspberry Pi show that the proposed model can perform effective, real-time intrusion detection on IoT devices.
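    A minimal sketch of the second (classification) stage is shown below, assuming feature vectors already reduced by the 1-D CNN; the synthetic data, feature dimension, and hyperparameters are placeholders rather than the paper's configuration.

    # Sketch of the classification stage only: LightGBM applied to feature vectors
    # assumed to have been reduced beforehand by a 1-D CNN. Hyperparameters and the
    # synthetic data are placeholders, not the paper's settings.
    import numpy as np
    import lightgbm as lgb
    from sklearn.model_selection import train_test_split
    from sklearn.metrics import accuracy_score

    # Placeholder: pretend these are CNN-reduced features and attack-type labels.
    X = np.random.rand(2000, 32)
    y = np.random.randint(0, 5, size=2000)   # e.g. normal, DoS, DDoS, scan, other

    X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2, random_state=0)

    clf = lgb.LGBMClassifier(n_estimators=200, num_leaves=31, learning_rate=0.1)
    clf.fit(X_tr, y_tr)
    print("accuracy:", accuracy_score(y_te, clf.predict(X_te)))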

  • Edge Computing Resource Allocation Algorithm for NB-IoT Based on Deep Reinforcement Learning

    Jiawen CHU  Chunyun PAN  Yafei WANG  Xiang YUN  Xuehua LI  

     
    PAPER-Network

      Publicized:
    2022/11/04
      Vol:
    E106-B No:5
      Page(s):
    439-447

    Mobile edge computing (MEC) technology guarantees the privacy and security of large-scale data in Narrowband-IoT (NB-IoT) by deploying MEC servers near base stations to provide sufficient computing, storage, and data processing capacity to meet the delay and energy consumption requirements of NB-IoT terminal equipment. For the NB-IoT MEC system, this paper proposes a resource allocation algorithm based on deep reinforcement learning to optimize the total cost of task offloading and execution. Since the formulated problem is a mixed-integer non-linear program (MINLP), we cast it as a multi-agent distributed deep reinforcement learning (DRL) problem and address it with a dueling Q-network algorithm. Simulation results show that, compared with a deep Q-network and the all-local and all-offload cost baselines, the proposed algorithm effectively guarantees the success rates of task offloading and execution. In addition, when the execution task volume is 200 KBit, the total system cost of the proposed algorithm is reduced by at least 1.3%, and when the execution task volume is 600 KBit, the total cost of executing tasks is reduced by up to 16.7%.
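    For reference, the dueling Q-network structure this kind of algorithm relies on can be sketched as below; the state encoding of the NB-IoT/MEC system, the layer sizes, and the action set are assumptions for illustration only.

    # Sketch of a dueling Q-network: Q(s, a) = V(s) + A(s, a) - mean_a A(s, a).
    # Layer sizes, the state encoding, and the cost-based reward are assumptions.
    import torch
    import torch.nn as nn

    class DuelingQNet(nn.Module):
        def __init__(self, state_dim: int, n_actions: int, hidden: int = 128):
            super().__init__()
            self.trunk = nn.Sequential(nn.Linear(state_dim, hidden), nn.ReLU())
            self.value = nn.Linear(hidden, 1)          # state value V(s)
            self.adv = nn.Linear(hidden, n_actions)    # advantages A(s, a)

        def forward(self, s: torch.Tensor) -> torch.Tensor:
            h = self.trunk(s)
            v, a = self.value(h), self.adv(h)
            return v + a - a.mean(dim=-1, keepdim=True)

    # Example: a 10-dimensional state (queue lengths, channel gains, ...) and
    # 4 hypothetical offloading decisions (local, or offload to one of 3 MEC servers).
    q = DuelingQNet(state_dim=10, n_actions=4)
    print(q(torch.randn(2, 10)).shape)   # torch.Size([2, 4])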

  • Performance Analysis of Mobile Cellular Networks Accommodating Cellular-IoT Communications with Immediate Release of Radio Resources

    Shuya ABE  Go HASEGAWA  Masayuki MURATA  

     
    PAPER-Network

      Publicized:
    2022/06/20
      Vol:
    E105-B No:12
      Page(s):
    1477-1486

    It is becoming important for mobile cellular networks to accommodate all kinds of Internet of Things (IoT) communications. However, the contention-based random access and radio resource allocation used in traditional cellular networks, which are optimized mainly for human communications, cannot efficiently handle large-scale IoT communications. For this reason, standardization activities such as Cellular-IoT (C-IoT) have emerged to serve IoT devices. However, few studies have evaluated the performance of C-IoT communications with periodic data transmissions, despite this being a common characteristic of many IoT communications. In this paper, we present a performance analysis of mobile cellular networks supporting periodic C-IoT communications, focusing on the performance differences between LTE and Narrowband-IoT (NB-IoT) networks. To achieve this, we first construct an analysis model for the end-to-end performance of both the control plane and the data plane, including random access procedures, radio resource allocation, bearer establishment in the Evolved Packet Core network, and user-data transmissions. In addition, we include the impact of the immediate release of radio resources proposed in 3GPP. Numerical evaluations show that NB-IoT can support up to 8.7 times more IoT devices than LTE, but imposes a significant delay on data transmissions. We also confirm that the immediate release of radio resources increases the network capacity by up to 17.7 times.

  • Sensitivity Enhanced Edge-Cloud Collaborative Trust Evaluation in Social Internet of Things

    Peng YANG  Yu YANG  Puning ZHANG  Dapeng WU  Ruyan WANG  

     
    PAPER-Network Management/Operation

      Publicized:
    2022/03/22
      Vol:
    E105-B No:9
      Page(s):
    1053-1062

    The integration of social networking concepts into the Internet of Things has led to the Social Internet of Things (SIoT) paradigm, and trust evaluation is essential to secure interaction in SIoT. In SIoT, when resource-constrained nodes respond to unexpected malicious services and malicious recommendations, trust assessment is prone to be inaccurate, and the existing architecture carries a risk of privacy leakage. This paper proposes an edge-cloud collaborative trust evaluation architecture for SIoT that utilizes the resource advantages of both the cloud and the edge to complete the trust assessment task collaboratively. An algorithm for evaluating the closeness of relationships between nodes is designed to assess the reliability of neighbor nodes in SIoT. A sensitivity-enhanced trust computing algorithm is proposed that considers the fluctuation of trust values and the conflict between trust indicators to improve the sensitivity of identifying malicious behaviors. Simulation results show that, compared with traditional methods, the proposed trust evaluation method effectively improves the interaction success rate and reduces the false detection rate when dealing with malicious services and malicious recommendations.

  • PRIGM: Partial-Regression-Integrated Generic Model for Synthetic Benchmarks Robust to Sensor Characteristics

    Kyungmin KIM  Jiung SONG  Jong Wook KWAK  

     
    LETTER-Data Engineering, Web Information Systems

      Publicized:
    2022/04/04
      Vol:
    E105-D No:7
      Page(s):
    1330-1334

    We propose a novel synthetic-benchmark generation model using partial time-series regression, called the Partial-Regression-Integrated Generic Model (PRIGM). PRIGM abstracts the unique characteristics of the input sensor data into generic time-series data, confirming generation similarity and evaluating the correctness of the synthetic benchmarks. The experimental results obtained by the proposed model and its formulation verify that PRIGM preserves the time-series characteristics of empirical data, even for complex time-series data, within an average difference of 10.4% in terms of descriptive-statistics accuracy.

  • Deep Coalitional Q-Learning for Dynamic Coalition Formation in Edge Computing

    Shiyao DING  Donghui LIN  

     
    PAPER

      Publicized:
    2021/12/14
      Vol:
    E105-D No:5
      Page(s):
    864-872

    With the growth of computation requirements in the Internet of Things, resource-limited edge servers usually need to cooperate to perform tasks. Most related studies assume a static cooperation approach, which might not suit the dynamic environment of edge computing. In this paper, we consider a dynamic cooperation approach that guides edge servers to form coalitions dynamically. This raises two issues: 1) how to guide them to form optimal coalitions, and 2) how to cope with the dynamics, where server statuses change as tasks are performed. The coalitional Markov decision process (CMDP) model proposed in our previous work handles these issues well. However, its basic solution, coalitional Q-learning, cannot handle large-scale problems when the number of tasks in edge computing is large. Our response is to propose a novel algorithm called deep coalitional Q-learning (DCQL) to solve this. To sum up, we first formulate the dynamic cooperation problem of edge servers as a CMDP: each edge server is regarded as an agent, and the dynamic process is modeled as an MDP in which the agents observe the current state to form several coalitions. Each coalition takes an action that affects the environment, which then transitions to the next state, and the process repeats. We then propose DCQL, which incorporates a deep neural network and can therefore cope with large-scale problems. DCQL guides the edge servers to form coalitions dynamically with the goal of optimizing a given objective. Furthermore, we run experiments to verify the effectiveness of our proposed algorithm in different settings.

  • Multi-Agent Reinforcement Learning for Cooperative Task Offloading in Distributed Edge Cloud Computing

    Shiyao DING  Donghui LIN  

     
    PAPER

      Publicized:
    2021/12/28
      Vol:
    E105-D No:5
      Page(s):
    936-945

    Distributed edge cloud computing is an important computation infrastructure for the Internet of Things (IoT), and its task offloading problem has attracted much attention recently. Most existing work on task offloading in distributed edge cloud computing assumes that each self-interested user owns one edge server and chooses whether to execute its tasks locally or offload them to cloud servers. The goal of each edge server is to maximize its own interest, such as a low delay cost, which corresponds to a non-cooperative setting. However, with the strong development of smart IoT communities such as smart hospitals and smart factories, all edge and cloud servers can belong to one organization, such as a technology company. This corresponds to a cooperative setting in which the goal of the organization is to maximize the team interest of the overall edge cloud computing system. In this paper, we consider a new problem called cooperative task offloading, in which all edge servers cooperate to make the entire edge cloud computing system achieve good performance, such as low delay cost and low energy cost. However, this problem is hard to solve due to two issues: 1) each edge server's status changes dynamically and task arrival is uncertain; 2) each edge server can observe only its own status, which makes it hard to optimize the team interest because global information is unavailable. To solve these issues, we formulate the problem as a decentralized partially observable Markov decision process (Dec-POMDP), which can handle the dynamic features under partial observations. We then apply a multi-agent reinforcement learning algorithm called the value decomposition network (VDN) and propose a VDN-based task offloading algorithm (VDN-TO). Specifically, a team value function is used to evaluate the team interest, and it is decomposed into individual value functions for each edge server. Each edge server then updates its individual value function in the direction that maximizes the team interest. Finally, we use a part of a real dataset to evaluate our algorithm, and the results show its effectiveness in comparison with some existing methods.
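    The value decomposition idea underlying VDN can be sketched as follows: each edge server computes a Q-value from its own partial observation, and the team Q-value is their sum. The observation size, action count, and network sizes below are illustrative assumptions, not the paper's settings.

    # Sketch of the value decomposition network (VDN) idea: the team action value
    # is the sum of per-agent action values computed from each agent's own
    # partial observation. Sizes and the observation encoding are assumptions.
    import torch
    import torch.nn as nn

    class AgentQ(nn.Module):
        def __init__(self, obs_dim: int, n_actions: int, hidden: int = 64):
            super().__init__()
            self.net = nn.Sequential(nn.Linear(obs_dim, hidden), nn.ReLU(),
                                     nn.Linear(hidden, n_actions))

        def forward(self, obs: torch.Tensor) -> torch.Tensor:
            return self.net(obs)

    def team_q(agent_nets, observations, actions):
        """Q_team = sum_i Q_i(o_i, a_i): sum each agent's Q-value for its chosen action."""
        per_agent = [net(o).gather(-1, a.unsqueeze(-1)).squeeze(-1)
                     for net, o, a in zip(agent_nets, observations, actions)]
        return torch.stack(per_agent, dim=0).sum(dim=0)

    # Three edge servers, each observing an 8-dim local state and choosing among 5 actions.
    nets = [AgentQ(8, 5) for _ in range(3)]
    obs = [torch.randn(4, 8) for _ in range(3)]          # batch of 4 transitions
    acts = [torch.randint(0, 5, (4,)) for _ in range(3)]
    print(team_q(nets, obs, acts).shape)                 # torch.Size([4])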

  • SDM4IIoT: An SDN-Based Multicast Algorithm for Industrial Internet of Things

    Hequn LI  Jiaxi LU  Jinfa WANG  Hai ZHAO  Jiuqiang XU  Xingchi CHEN  

     
    PAPER-Network

      Publicized:
    2021/11/11
      Vol:
    E105-B No:5
      Page(s):
    545-556

    Real-time and scalable multicast services are of paramount importance to Industrial Internet of Things (IIoT) applications. To realize these services, the multicast algorithm should, on the one hand, ensure that the maximum delay of a multicast session does not exceed its upper delay bound, and on the other hand, minimize session costs. As an emerging networking paradigm, Software-Defined Networking (SDN) can provide multicast algorithms with a global view of the network, thereby bringing new opportunities for realizing the desired multicast services in IIoT environments. Unfortunately, existing SDN-based multicast (SDM) algorithms cannot meet the real-time and scalability requirements simultaneously. Therefore, in this paper, we focus on SDM algorithm design for IIoT environments. Specifically, the paper first converts the multicast tree construction problem for SDM in IIoT environments into a delay-bounded least-cost shared tree problem and proves that it is NP-complete. The paper then puts forward a shared tree (ST) algorithm called SDM4IIoT to compute suboptimal solutions to the problem. The algorithm consists of five steps: 1) construct a delay-optimal shared tree; 2) divide the tree into a set of subpaths and a subtree; 3) optimize the cost of each subpath by relaxing the delay constraint; 4) optimize the subtree cost in the same manner; 5) recombine them into a shared tree. Simulation results show that the algorithm can provide real-time support that other ST algorithms cannot, while also achieving good scalability. Its cost is only 20.56% higher than that of the cost-optimal ST algorithm, and its computation time is acceptable. The algorithm can help realize real-time and scalable multicast services for IIoT applications.
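    A hedged sketch of step 1 only (the delay-optimal shared tree) is shown below, built by merging minimum-delay paths from a core node to every group member using networkx; the core choice, the toy topology, and the omission of the cost-relaxation steps 2-5 are simplifications made purely for illustration.

    # Sketch of step 1 of the described procedure: build a delay-optimal shared
    # tree by merging minimum-delay paths from a core node to every group member.
    # Core selection and cost-relaxation steps (2-5) are omitted; the graph and
    # edge attributes below are illustrative assumptions.
    import networkx as nx

    def delay_optimal_shared_tree(g: nx.Graph, core, members):
        tree = nx.Graph()
        for m in members:
            path = nx.shortest_path(g, core, m, weight="delay")
            for u, v in zip(path, path[1:]):
                tree.add_edge(u, v, **g[u][v])
        return tree

    g = nx.Graph()
    g.add_edge("s1", "s2", delay=2, cost=5)
    g.add_edge("s2", "s3", delay=1, cost=4)
    g.add_edge("s1", "s3", delay=5, cost=1)
    g.add_edge("s3", "s4", delay=2, cost=2)
    t = delay_optimal_shared_tree(g, core="s1", members=["s3", "s4"])
    print(sorted(t.edges()), "total cost:", sum(d["cost"] for _, _, d in t.edges(data=True)))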

  • Performance Evaluation of Classification and Verification with Quadrant IQ Transition Image

    Hiro TAMURA  Kiyoshi YANAGISAWA  Atsushi SHIRANE  Kenichi OKADA  

     
    PAPER-Network Management/Operation

      Publicized:
    2021/12/01
      Vol:
    E105-B No:5
      Page(s):
    580-587

    This paper presents a physical-layer wireless device identification method that uses a convolutional neural network (CNN) operating on a quadrant IQ transition image, combining classification and detection tasks in one process. The proposed method identifies IoT wireless devices by exploiting their RF fingerprints, a technology that identifies wireless devices through unique variations in their analog signals. We propose a quadrant IQ image technique that reduces the size of the CNN while maintaining accuracy: the CNN operates on the IQ transition image, which is cut into four parts by image processing. An over-the-air experiment is performed on six Zigbee wireless devices to confirm the validity of the proposed identification method. The measurement results demonstrate that the proposed method achieves 99% accuracy with a lightweight CNN model having 36,500 weight parameters in serial use and 146,000 in parallel use. Furthermore, the proposed threshold algorithm can verify authenticity using one classifier and achieves 80% accuracy for further secured wireless communication. This work also examines the identification of signals expanded with SNR between 10 and 30 dB. At SNR values above 20 dB, the proposed method achieves classification and detection accuracies of 87% and 80%, respectively.
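    One plausible way to build such a quadrant IQ transition image is sketched below (a numpy illustration; the use of consecutive-sample midpoints, the binning, the normalization, and the toy signal are assumptions about the preprocessing, not the paper's exact pipeline).

    # Illustrative sketch: build an IQ transition image from a complex sample
    # stream and split it into four quadrant sub-images for a smaller CNN input.
    import numpy as np

    def iq_transition_image(iq: np.ndarray, bins: int = 64) -> np.ndarray:
        """2-D histogram over points between consecutive IQ samples (transition density)."""
        mid = (iq[:-1] + iq[1:]) / 2            # transition points between samples
        img, _, _ = np.histogram2d(mid.real, mid.imag,
                                   bins=bins, range=[[-1, 1], [-1, 1]])
        return img / (img.max() + 1e-9)

    def quadrants(img: np.ndarray):
        h, w = img.shape[0] // 2, img.shape[1] // 2
        return img[:h, :w], img[:h, w:], img[h:, :w], img[h:, w:]

    iq = np.exp(1j * 2 * np.pi * np.random.rand(10000))   # toy unit-circle samples
    q1, q2, q3, q4 = quadrants(iq_transition_image(iq))
    print(q1.shape)   # (32, 32) per-quadrant input to a lightweight CNN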

  • RF Signal Frequency Identification in a Direct RF Undersampling Multi-Band Real-Time Spectrum Monitor for Wireless IoT Usage

    Tomoyuki FURUICHI  Mizuki MOTOYOSHI  Suguru KAMEDA  Takashi SHIBA  Noriharu SUEMATSU  

     
    PAPER-Software Defined Radio

      Publicized:
    2021/10/12
      Vol:
    E105-B No:4
      Page(s):
    461-471

    To reduce the complexity of direct radio frequency (RF) undersampling real-time spectrum monitoring in wireless Internet of Things (IoT) bands (the 920 MHz, 2.4 GHz, and 5 GHz bands), this paper proposes a design method for sampling frequencies. The direct RF undersampling receiver architecture enables the use of an ADC whose sampling clock frequency is lower than the received RF signal frequency, but it requires signal processing to identify the RF signal from spectra folded under multiple sampling clock frequencies. The proposed design method requires fewer sampling frequencies than the conventional design method for a continuous frequency range (DC to the 5 GHz band); in the wireless IoT bands case, it eliminates two sampling frequencies compared with the continuous-range design. The design result obtained with the proposed method is verified by measurement.
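    The folding relation that the identification processing exploits can be illustrated with a small sketch: under sampling rate fs, a tone at f aliases to |f − round(f/fs)·fs|, and observing the folded tone under several sampling rates narrows down the original band. The sampling rates, candidate carriers, and tolerance below are placeholders, not the paper's design values.

    # Sketch of the folding relation behind undersampling identification: with
    # sampling rate fs, a tone at f appears at |f - round(f/fs)*fs|. Observing
    # the folded tone under several sampling rates narrows down the original band.
    def folded(f_hz: float, fs_hz: float) -> float:
        return abs(f_hz - round(f_hz / fs_hz) * fs_hz)

    def identify(observed: dict, candidates: list, tol_hz: float = 1e6) -> list:
        """Return candidate input frequencies consistent with all folded observations."""
        return [f for f in candidates
                if all(abs(folded(f, fs) - fo) <= tol_hz for fs, fo in observed.items())]

    # Candidate wireless IoT carriers (920 MHz, 2.4 GHz, 5 GHz bands) and two
    # hypothetical sampling rates; 'obs' is what the ADC would report per rate.
    cands = [920e6, 2.45e9, 5.2e9]
    fs_list = [1.9e9, 2.1e9]
    truth = 2.45e9
    obs = {fs: folded(truth, fs) for fs in fs_list}
    print(identify(obs, cands))   # -> [2450000000.0]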

  • Scaling Law of Energy Efficiency in Intelligent Reflecting Surface Enabled Internet of Things Networks

    Juan ZHAO  Wei-Ping ZHU  

     
    LETTER-Communication Theory and Signals

      Publicized:
    2021/09/29
      Vol:
    E105-A No:4
      Page(s):
    739-742

    The energy efficiency of intelligent reflecting surface (IRS) enabled Internet of Things (IoT) networks is studied in this letter. The energy efficiency is expressed mathematically in terms of the number of reflecting elements and the spectral efficiency of the network, and is shown to scale with the logarithm of the number of reflecting elements in the high transmit-power regime of the source node. Furthermore, it is revealed that the energy efficiency scales linearly with the spectral efficiency in the high transmit-power regime, in contrast to conventional studies on energy and spectral efficiency trade-offs in non-IRS wireless IoT networks. Numerical simulations are carried out to verify the derived results for IRS-enabled IoT networks.

  • Efficient Task Allocation Protocol for a Hybrid-Hierarchical Spatial-Aerial-Terrestrial Edge-Centric IoT Architecture Open Access

    Abbas JAMALIPOUR  Forough SHIRIN ABKENAR  

     
    INVITED PAPER

      Publicized:
    2021/08/17
      Vol:
    E105-B No:2
      Page(s):
    116-130

    In this paper, we propose a novel Hybrid-Hierarchical spatial-aerial-Terrestrial Edge-Centric (H2TEC) architecture for space-air integrated Internet of Things (IoT) networks. H2TEC comprises unmanned aerial vehicles (UAVs) that act as mobile fog nodes to provide the required services for terminal nodes (TNs) in cooperation with satellites. TNs in H2TEC offload their generated tasks to the UAVs for further processing. Because of the limited energy budget of TNs, a novel task allocation protocol, named TOP, is proposed to minimize the energy consumption of TNs while guaranteeing the outage probability and network reliability, for which the transmission rate of TNs is optimized. TOP also takes advantage of energy harvesting, by which low earth orbit satellites transfer energy to the UAVs when the remaining energy of the UAVs falls below a predefined threshold. To this end, the harvested power of the UAVs is optimized alongside the corresponding harvesting time so that the UAVs can improve the network throughput by processing more bits. Numerical results reveal that TOP outperforms the baseline method in critical situations where more power is required to process the tasks. It is also found that, even in such situations, the energy harvesting mechanism of TOP yields a more efficient network throughput.

  • Fusion of Blockchain, IoT and Artificial Intelligence - A Survey

    Srinivas KOPPU  Kumar K  Siva Rama KRISHNAN SOMAYAJI  Iyapparaja MEENAKSHISUNDARAM  Weizheng WANG  Chunhua SU  

     
    SURVEY PAPER

      Publicized:
    2021/09/28
      Vol:
    E105-D No:2
      Page(s):
    300-308

    Blockchain has been one of the most prominent and rapidly adopted technologies of the last decade across various applications. In recent years, many researchers have explored the capabilities of blockchain in the smart IoT to address various security challenges. Integrating IoT and blockchain solves the security problems, but scalability still remains a huge challenge. To address this, various AI techniques can be applied within the blockchain-IoT framework, thus providing an efficient information system. This survey explores works spanning the domains that integrate AI, IoT and blockchain. The article also discusses potential industrial use cases of the fusion of blockchain, AI and IoT, along with its challenges.

  • Highly Reliable Radio Access Scheme by Duplicate Transmissions via Multiple Frequency Channels and Suppressed Useless Transmission under Interference from Other Systems

    Hideya SO  Takafumi FUJITA  Kento YOSHIZAWA  Maiko NAYA  Takashi SHIMIZU  

     
    PAPER-Terrestrial Wireless Communication/Broadcasting Technologies

      Publicized:
    2020/12/04
      Vol:
    E104-B No:6
      Page(s):
    696-704

    This paper proposes a novel radio access scheme that uses duplicated transmission via multiple frequency channels to achieve mission-critical Internet of Things (IoT) services requiring highly reliable wireless communications, and reveals the interference constraints that yield the required reliability. To achieve mission-critical IoT services over wireless communication, it is necessary to improve reliability in addition to satisfying the required transmission delay time. Reliability is defined as the packet arrival rate within the desired transmission delay time. Traffic of the system itself and interference from other systems using the same frequency channel, such as in unlicensed bands, degrade reliability. One solution is the frequency/time diversity technique; however, such techniques may not achieve the required reliability because of the time taken for correct reception. This paper proposes a scheme that transmits duplicate packets over multiple wireless interfaces using multiple frequency channels. It also proposes a suppressed duplicate transmission (SDT) scheme, which prevents the wastage of radio resources. The proposed scheme achieves the same reliability as the conventional scheme but has a higher tolerance against interference than retransmission. We evaluate the relationship between reliability and the interference occupation time ratio, defined as the usage ratio of the frequency resources occupied by the other systems, and reveal the upper bound of the interference occupation time ratio for each frequency channel that channel selection control must respect to achieve the required reliability.
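    The basic reliability gain from duplication can be illustrated with simple arithmetic: if each channel independently fails to deliver the packet within the deadline with some probability, the packet is lost only when every copy fails. The failure probabilities below are placeholders, not measured values from the paper.

    # Back-of-the-envelope sketch: with independent per-channel failure
    # probabilities p_i, a duplicated packet is lost only if all copies fail.
    def arrival_probability(fail_probs):
        lost = 1.0
        for p in fail_probs:
            lost *= p
        return 1.0 - lost

    single = arrival_probability([0.05])            # one channel, 5% failure
    dual = arrival_probability([0.05, 0.05])        # duplicate over two channels
    print(f"single: {single:.4f}, duplicated: {dual:.4f}")  # 0.9500 vs 0.9975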

  • Transmission Control Method for Data Retention Taking into Account the Low Vehicle Density Environments

    Ichiro GOTO  Daiki NOBAYASHI  Kazuya TSUKAMOTO  Takeshi IKENAGA  Myung LEE  

     
    LETTER-Information Network

      Publicized:
    2021/01/05
      Vol:
    E104-D No:4
      Page(s):
    508-512

    With the development and spread of Internet of Things (IoT) technology, various kinds of data are now being generated by IoT devices. Some data generated by IoT devices depend on geographical location and time; we refer to such data as spatio-temporal data (STD). Since the “locally produced and consumed” paradigm of STD use is effective for location-dependent applications, the authors have previously proposed a vehicle-based STD retention system. However, in low vehicle density environments, data retention becomes difficult with this method due to the decrease in the number of data transmissions. In this paper, we propose a new data transmission control method for data retention in low vehicle density environments.

1-20 of 57 hits